https://spf13.com/presentation/what-should-a-modern-practical-programming-language-look-like/
What should a modern practical programming language look like
Keynote delivered at The Landing Festival – Berlin. This presentation describes the many inspirations and influences of the Go programming language and goes into detail about how we shaped and created the language.
What should a modern, practical programming language look like?
Landing Festival – Berlin – April 4, 2019
Steve Francia, Google, @spf13
David McCullough 2012 History is who we are and why we are the way we are. We are each all shaped by the history of what came before. To understand where we are, we must understand what brought us here.
In the late 1950s, people were becoming uneasy about how each new computer spawned its own distinct language. At the time, programming languages were provided by the hardware manufacturers and differed even from model to model. The first language to be consistent across models was Fortran, but it was still limited to a single manufacturer, IBM. A committee was formed to design the first truly universal, machine-independent programming language.
Arc de Triomphe , Champs-Elysées, Paris, 1960. Photo: Charles Weever Cushman (via University of Indiana)
In January 1960, 13 computer scientists met in Paris for an unprecedented meeting with the goal of developing such a language. There were 6 delegates sent from America and 7 delegates from Europe.
The meetings were exhausting, interminable, and exhilarating. One became aggravated when one’s good ideas were discarded along with the bad ones of others. Nevertheless, diligence persisted during the entire period. The chemistry of the 13 was excellent. Alan Perlis
Algol
Here is a language so far ahead of its time that it was not only an improvement on its predecessors but also on nearly all its successors. Tony Hoare Hints on Programming Language Design – 1973
Algol Algol W In spite of that statement, Tony Hoare along with Niklaus Wirth went on to create a successor of ALGOL 60 called ALGOL W
Algol Algol W Pascal which, in turn, led Wirth to go on to create Pascal which was initially published in 1970.
Algol Algol W Pascal CPL Independently CPL was created at the University of Cambridge as the “Cambridge Programming Language”
Algol Algol W Pascal CPL BCPL it led to BCPL or Basic CPL, a much simpler language based on CPL.
Algol Algol W Pascal CPL BCPL B At Bell Labs in the US Ken Thompson and Dennis Ritchie created B, based mainly on BCPL
Algol Algol W Pascal CPL BCPL B C and shortly after, its successor, C.
Algol Algol W Pascal CPL BCPL B C
Pascal Modula Oberon The Pascal branch flourished in Europe, with many successors including Modula and Oberon.
C++ C# C Java C proliferated in the US, inspiring and enabling C++, C#, and Java, as well as JavaScript, Python, Perl, PHP, and many, many more languages.
Oberon 2007 Pascal C C++ C# Java Perl Python PHP JS Obj C AWK Modula Ruby Shell By 2007, dozens of languages existed that could all trace their roots back to their common ancestor, Algol.
Why Another Language SECTION TWO
Growing at Google Scale SECTION 2.1 To understand the answer to this question we need to look at the environment from which Go was born.
At scale everything breaks no matter what you do and you have to deal reasonably cleanly with that and try to hide it from the people actually using your system. Urs Hölzle Google: ‘At scale, everything breaks’ – 2011 A bit more than 10 years after Google launched, its head of infrastructure, Urs Hölzle, said…
In its first 10 years, Google experienced scale unlike anything the world had seen before, growing by factors of hundreds to tens of thousands.
In addition to machine scale, Google headcount also grew significantly from well under 1000 employees in 2000 to over 24k in 2010.
Urs Hölzle Google: ‘At scale, everything breaks’ – 2011 Keeping things simple and yet scalable is actually the biggest challenge. And through all this growth, Urs identified that …
Google’s Scale brought Complexity
Massive codebases with many dozens of programmers
Very large build farms
45+ min builds
Google was struggling with this challenge, especially within its engineering efforts internally.
Brian Kernighan Software Tools – 1976 Controlling complexity is the essence of computer programming. So when faced with complexity a good computer scientist knows what to do.
During a 45 minute C++ build… SECTION 2.2
Rob Pike Paraphrase
Rob Pike’s words: Back around September 2007, I was doing some minor but central work on an enormous Google C++ program, one you’ve all interacted with, and my compilations were taking about 45 minutes on our huge distributed compile cluster. An announcement came around that there was going to be a talk presented by a couple of Google employees serving on the C++ standards committee. They were going to tell us what was coming in C++11.
In the span of an hour at that talk we heard about something like 35 new features that were being planned. … At this point I asked myself a question:
Did the C++ committee really believe that what was wrong with C++ was that it didn’t have enough features? Surely…, it would be a greater achievement to simplify the language rather than to add to it. Rob Pike 2007 Rob returned to his desk with his office mates. This really got them thinking about…
What should a modern, practical programming language look like? By the time that 45 minute build was done they already had a whiteboard full of ideas.
Ken Thompson Ken comes from the US branch: C and Unix
Robert Griesemer Robert comes from the European branch: Smalltalk and Pascal
He studied under Niklaus Wirth
Algol Algol W Pascal CPL BCPL B C Go is the first language that can claim heritage of both the European and US branches of language design. In a very real sense it has unified these two branches as well as a third branch we’ll talk about later.
We built from scratch, borrowing [from C] only minor things like operators and brace brackets and a few common keywords. And of course we also borrowed ideas from other languages we knew. … Rob Pike Less is Exponentially More – 2012 In speaking about Go’s inspiration, Rob Pike states…
Go’s Many Ancestors and Influences Alef Algol APL BCPL B C C++ CLU CSP Java Limbo Modula Newsqueak
Oberon Occam Pascal Python Simula Smalltalk
I would claim that there has never been a set of language designers with broader or deeper language design expertise than these three. They had a rich knowledge of what came before and they knew just what to cherry pick.
Go’s Design Principles SECTION 2.3 Now I’d like to share with you the 4 design principles that guided the development of the Go language.
Evolution not revolution Most ideas come from previous ideas Principle 1. The idea is that most ideas are not new at all.
Evolution not revolution New languages should consolidate, not invent features
Waiting for Good design No is temporary, Yes is forever Principle 2. There are many instances of this throughout Go’s history. The general idea is that when you are designing a language there is no “undo”. If you say “No” today, you can always say “Yes” tomorrow, but if you say “Yes” today, you are stuck with it forever.
Joshua Bloch Joshua Bloch: A conversation about design – 2002 When in doubt, leave it out Author of Effective Java
Consensus driven design As simple as possible, but no simpler Principle 3.
When the three of us got started, it was pure research. … We started off with the idea that all three of us had to be talked into every feature in the language, so there was no extraneous garbage put into the language for any reason.
Ken Thompson Interview – 2011 Ken brought this practice in from Bell Labs
There are two ways of constructing a software design. One way is to make it so simple that there are obviously no deficiencies. And the other way is to make it so complicated that there are no obvious deficiencies. C.A.R. Hoare The emperor’s old clothes – 1981
Go Doesn’t have Header files Classes Inheritance Constructors Pointer Arithmetic Uninitialized values
Annotations Templates Exceptions Globals Void … By design Go’s approach is minimalist. With a goal of simplicity we tried to create a language with as small a feature set as we could while keeping it productive and useful. A lot of features common in other languages were left out of Go including…
Rapid Iteration Expect & enable massive changes The last principle is that of Rapid iteration. That when you are in the design phase of a language you will need to make frequent and sometimes dramatic changes. Go forward with that expectation and build your process around it.
Russ Cox Ian Lance Taylor Within a few months of that 45 minute whiteboard session the team had grown to 5 with the addition of Russ Cox and Ian Lance Taylor.
At this point all the Go source that existed was in a single repo. Whenever they made a language change, they would fix all 1,000 or so Go files in existence at once and commit them back to the repo.
The mechanism they used worked quite well, but they knew it wouldn’t scale for long. Russ would open all the Go files that existed in the world and make the changes with Rob, Ken & Robert looking over his shoulder.
It was a lot of regexp search-and-replace, and then nodding that it looked right.
Over a year of refining Months of discussion, planning, prototyping, experimenting and whiteboarding Added one feature at a time Built two implementations (go, gccgo)
Nearly 5 years of refinement 2007 2009 2012 Started at Google as a 20% project 2 years later, Go is open sourced After nearly 5 years of development, Go 1.0 is released and attention shifts to using Go Open Source Birth 1.0
We spent 5 years developing Go
SECTION 2.4
Concurrent
Simple
Powerful
Fast
Expressive
Readable
Type-safe
Compiled
Compiled very quickly
Dynamic feeling
Go built for the world
Go was well received especially by the open source community who first embraced Go with notable projects including Docker, Kubernetes and Hugo.
JetBrains 2018 Developer Survey https://www.jetbrains.com/research/devecosystem-2018/
Go’s adoption has been steady since the 1.0 release. Last year JetBrains awarded Go the title of most promising programming language.
Go Users Worldwide 2018
1.5 – 2 Million
Gophers span the world
3 Features of Go SECTION THREE
I’d like to now turn our attention to three specific features of Go which I think you will find interesting to explore a bit deeper.
Object-Oriented Programming SECTION 3.1 The first is Go’s approach to object orientation.
C++/Java style OO is brittle This may not be a popular opinion, but it was our experience that when you write programs as inheritance-heavy subclass systems, it forces you to make decisions early that are very hard to change later.
If any part of a system depends on the internals of another part, then complexity increases as the square of the size of the system
Dan Ingalls Object Oriented Programming — 1989
This should look familiar to anyone who has programmed in a semi-recent OO language. The relationship lines are everywhere. http://www.ejb3.org/jar_file_reverse/jar_file_reverse.html
Inheritance And we see this exponential complexity in inheritance
Rich Hickey Clojure for Java Programmers – 2012 It is my opinion that object oriented programming, as delivered by Java, etc., is not a good default way to structure your program.
OO existed long before C++/Java However, before C++ and Java redefined Object Orientation there was…
Simulation Begin
   Class FittingRoom;
   Begin
      Ref (Head) door;
      Boolean inUse;
      Procedure request;
      Begin
         If inUse Then
         Begin
            Wait (door);
            door.First.Out;
         End;
         inUse := True;
      End;
      Procedure leave;
      Begin
         inUse := False;
         Activate door.First;
      End;
      door :- New Head;
   End;

   Procedure report (message); Text message;
   Begin
      OutFix (Time, 2, 0);
      OutText (": " & message);
      OutImage;
   End;

   Process Class Person (pname); Text pname;
   Begin
      While True Do
      Begin
         Hold (Normal (12, 4, u));
         report (pname & " is requesting the fitting room");
         fittingRoom1.request;
         report (pname & " has entered the fitting room");
         Hold (Normal (3, 1, u));
         fittingRoom1.leave;
         report (pname & " has left the fitting room");
      End;
   End;

   Integer u;
   Ref (FittingRoom) fittingRoom1;

   fittingRoom1 :- New FittingRoom;
   Activate New Person ("Sam");
   Activate New Person ("Sally");
   Activate New Person ("Andy");
   Hold (100);
End;
1965 Simula Simula took Algol and added to it objects, classes, inheritance and subclasses
It is considered the first object-oriented programming language and was influential in the development of Smalltalk and all following OO languages.
Code from – https://en.wikipedia.org/wiki/Simula
[Simula] change[d] from the procedural view, … they flipped it around to … the object-oriented one which is that within every type of object you have all the procedures that work on it.
Dan Ingalls Object Oriented Programming — 1989 Dan Ingalls, the implementer of Smalltalk, said…
Object subclass: #Philosophers instanceVariableNames: 'forks philosophers randy eating' classVariableNames: '' poolDictionaries: '' category: 'Examples-Processes'!
!Philosophers class methodsFor: 'dining'!
new self shouldNotImplement !
new: quantity ^super new initialize: quantity ! !
!Philosophers methodsFor: 'dining'!
dine self dine: 15 !
dine: seconds (Delay forSeconds: seconds) wait. philosophers do: [ :each | each terminate ]. self initialize: self size !
leftFork: n ^forks at: n !
rightFork: n ^n = self size ifTrue: [ forks at: 1 ] ifFalse: [ forks at: n + 1 ] !
initialize: n eating := Semaphore new. n - 1 timesRepeat: [ eating signal ].
randy := Random new. forks := (1 to: n) collect: [ :each | Semaphore forMutualExclusion ].
Smalltalk 1980 Next came Smalltalk where everything is an object and objects are only communicated with via the sending of messages
Code from : https://raw.githubusercontent.com/gnu-smalltalk/smalltalk/master/examples/Dinner.st
Alan Kay A to Z of programming languages: Smalltalk-80 – 2010 I did make up this term [object oriented] and it was a bad choice because it under-emphasized the more important idea of message sending.
Smalltalk OO is about message sending 1980 An oversimplification, but a not inaccurate one
Go’s OO is patterned after Smalltalk
Go isn’t pure OO like Smalltalk, but the fundamentals are there.
Go’s OO Methods & Interfaces There are only two mechanisms in Go that provide the object oriented functionality.
Methods (on any type) It took the creators a while to realize the only way to do this was to focus only on methods (not on structure) and to permit methods to be defined on any type.
Go Methods

type Phone []int

func (p Phone) String() string {
	text := ""
	for _, digit := range p {
		text += string('0' + digit)
	}
	return text
}
Interfaces Interface types are the mechanism to support an object-oriented programming style.
Interfaces give you a form of dynamic dispatch
Go Interfaces type Stringer interface { String() string }
func Print(s Stringer) { fmt.Println(s.String()) }
Interfaces are implicit, not explicit. Interface and implementation are entirely independent of each other.
Go Methods & Interfaces
func main() {
	p := Phone{8, 0, 0, 5, 5, 5, 1, 3, 1, 3}
	Print(p)
}
When you try to break a complex problem down you want to try to break it down into as few parts as you can and you want them to be as independent as they can be.
Dan Ingalls Object Oriented Programming — 1989 Go’s approach of interfaces and methods is as independent as possible. Any type can satisfy any interface as long as the right methods are added. An interface can be defined before or after the types that satisfy the interface. It just works and it works well.
Go’s OO Methods provide message sending mechanism on any type
Interfaces provide reusability through dynamic dispatch polymorphism
Go is OO in a very real sense, as defined by Smalltalk, even though it doesn’t include classes, objects, or inheritance.
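Assembled from the slide fragments above into one runnable program (the assembly is mine; the pieces are the talk's Phone, Stringer, and Print examples):

```go
package main

import "fmt"

// Phone is an ordinary slice type; no class declaration is needed.
type Phone []int

// Adding a String method is all it takes for Phone to satisfy Stringer.
func (p Phone) String() string {
	text := ""
	for _, digit := range p {
		text += string(rune('0' + digit))
	}
	return text
}

// Stringer is satisfied implicitly by any type with a String() string method.
type Stringer interface {
	String() string
}

// Print dispatches dynamically through the interface value.
func Print(s Stringer) {
	fmt.Println(s.String())
}

func main() {
	p := Phone{8, 0, 0, 5, 5, 5, 1, 3, 1, 3}
	Print(p) // prints 8005551313
}
```

Note that Stringer could just as well be declared after Phone, even in another package, without Phone ever mentioning it — that is the independence the slides describe.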
Concurrency SECTION 3.2
Rob Pike Concurrency is not Parallelism – 2013 Concurrency is not parallelism. … Concurrency is about dealing with lots of things at once. Parallelism is about doing lots of things at once. Parallelism means running code on multiple CPUs at once.
Concurrency is about writing clear programs, so that it’s easy to explain how the program does many things at once.
If you have only one processor, your program can still be concurrent to make it easy to write, but it cannot run in parallel. But then if you do have multiple processors, there’s an obvious way to parallelize a concurrent program.
Joe Armstrong Programming Erlang — 2007
The world is parallel. If we want to write programs that behave as other objects in the real world, then these programs will have a concurrent structure. Joe Armstrong, the creator of Erlang says…
And even though we have shifted topics to concurrency, Joe is making a case here that a truly object oriented language must be concurrent.
Doug McIlroy http://facesofopensource.com/doug-mcilroy/
Like virtually all of Go’s features, our concurrency story begins a long time ago with Doug McIlroy
Unix Pipes 1964 Who back in 1964 came up with the idea that became Unix pipes.
1964 We should have some ways of coupling programs like garden hose–screw in another segment when it becomes necessary to massage data in another way. This is the way of IO also.
Paraphrase story.
And over a period from 1970 to 1972, I’d from time to time say, “How about making something like this?”, and I’d put up another proposal, another proposal, another proposal. And one day I came up with a syntax for the shell that went along with the piping, and Ken said, “I’m going to do it!” He was tired of hearing all this stuff… [and] he said, “I’m going to do it.” He didn’t do exactly what I had proposed for the pipe system call; he invented a slightly better one that finally got changed once more to what we have today. He put pipes into Unix [and he did it]… all in one night.
Quoting McIlroy: There’s a paper that’s hanging on Brian’s wall still, [which] he dredged out somewhere, where I talked about screwing together streams like garden hoses. So this idea had been banging around in my head for a long time.
At the same time that Thompson and Ritchie were on their blackboard, sketching out a file system, I was sketching out how to do data processing on this blackboard by connecting together cascades of processes and looking for a kind of prefix notation language for connecting processes together, and failing because it’s very easy to say “cat into grep into …”, or “who into cat into grep”, and so on; it’s very easy to say that, and it was clear from the start that that was something you’d like to say. But there are all these side parameters that these commands have; they don’t just have input and output arguments, but they have the options, and syntactically it was not clear how to stick the options into this chain of things written in prefix notation, cat of grep of who [i.e. cat(grep(who …))]. Syntactic blinders: didn’t see how to do it. So I had these very pretty programs written on the blackboard in a language that wasn’t strong enough to cope with reality. So we didn’t actually do it.
Tony Hoare By 1978, there were many proposed methods in use for communication and synchronization in the context of programming multiprocessors. Shared memory was the most common communication mechanism.
Tony Hoare published a paper which changed everything. It was decades ahead of its time. He called his paper, communicating sequential processes.
CSP 1978 Or as it’s better known, CSP.
CSP 1978 Processes: unit of execution Sequential: each runs as an ordinary single-thread program Communicating: how processes coordinate No sharing of memory No threads, no mutexes Hoare’s paper proposed a language with processes each running sequentially (or as an ordinary single threaded program) communicating with each other over unbuffered channels.
Hoare’s communicating processes are more general than typical Unix shell pipelines, since they can be connected in arbitrary patterns.
Prime Sieve 200 BC Hoare got the idea for this example from Doug McIlroy and included it in his paper.
In mathematics, the sieve of Eratosthenes is a simple, ancient algorithm for finding all prime numbers up to any given limit. Doug McIlroy came up with the concurrent version of the Prime Sieve.
CSP (from Hoare’s 1978 paper) 1978 [SIEVE(i:1..100):: p,mp:integer; SIEVE(i - 1)?p; print!p; mp := p; comment mp is a multiple of p; *[m:integer; SIEVE(i - 1)?m → [m > mp → mp := mp + p]; [m = mp → skip ||m < mp → SIEVE(i + 1)!m ] ] ||SIEVE(0)::print!2; n:integer; n := 3; [n < 10000 → SIEVE(1)!n; n := n + 2] ||SIEVE(101)::[n:integer;SIEVE(100)?n → print!n] ||print::[(i:0..101) n:integer; SIEVE(i)?n → …] ]
https://rosettacode.org/wiki/Sieve_of_Eratosthenes
CSP Erlang Go Occam Three branches of languages emerged from the CSP paper.
Occam in 1983 stayed close to the CSP paper (advised by Hoare). Erlang in the late ’80s focused on the functional side of CSP and used mailboxes to communicate between processes.
Rob Pike Go’s story begins with Rob pike… but not this Rob Pike…
Rob Pike This much younger one… back in 1989.
Newsqueak 1989 Rob worked on a successor to an even earlier toy language called Squeak while at Bell Labs
Newsqueak 1989 Research language Make concurrency in Squeak practical Syntactically like C Like CSP, used channels as rendezvous points for processes.
Rob Pike Newsqueak (1989) looked syntactically like C but was applicative and concurrent.
Idea: a research language to make the concurrency ideas of Squeak practical.
Newsqueak addresses the same problems but in a broader context: Squeak was for designing devices such as menus and scroll bars; Newsqueak is for writing entire applications, and in particular a window system.
Newsqueak’s communication mechanisms are as in CSP, with channels acting as rendezvous points for processes.
Fun for after the talk: https://swtch.com/~rsc/thread/squint.pdf.
Newsqueak — Prime Sieve pt. 1 1988

counter := prog(end: int, c: chan of int) {
	i: int;
	for(i = 2; i < end; i++)
		c <-= i;
};

filter := prog(prime: int, listen, send: chan of int) {
	i: int;
	for(;;)
		if((i = <-listen) % prime)
			send <-= i;
};
Newsqueak — Prime Sieve pt. 2 1988

sieve := prog(c: chan of int) {
	for(;;) {
		prime := <-c;
		print(prime, " ");
		newc := mk(chan of int);
		begin filter(prime, c, newc);
		c = newc;
	}
};
count := mk(chan of int);
begin counter(10000, count); sieve(count); Unlike in CSP and Squeak, Newsqueak treats communications channels as first-class objects: channels can be stored in variables, passed as arguments to functions, and sent across channels.
Also, <-c (receive) is an expression introduced here for the first time (it would later surface in Go).
Alef, Limbo 1990s
Alef & Limbo 1990s Used for writing real systems software Built on concurrency ideas in Newsqueak Alef wasn’t GC’ed and it was painful Limbo was GC’ed and it was wonderful
Both were languages Rob used for writing real systems software. Alef was done for Plan 9 (the Bell Labs successor to Unix); Limbo was done for Inferno, the successor to Plan 9. They convinced Rob and the others at Bell Labs that CSP was a great way to write concurrent programs like network servers.
Go 2008: Goroutines & Channels
Goroutines — https://play.golang.org/p/IXkwo_-ruTC

func main() {
	go printer(99)
	time.Sleep(1 * time.Second)
}

func printer(n int) {
	for i := 0; i < n; i++ {
		fmt.Println(i)
		time.Sleep(5 * time.Millisecond)
	}
}

A Goroutine is a function that executes independently, launched by a go statement.
It has its own call stack, which grows and shrinks as required. It’s very cheap. It’s practical to have thousands, even hundreds of thousands of goroutines.
It isn’t a thread. The Go runtime multiplexes goroutines onto threads as needed. There might be only one thread in a program with thousands of goroutines.
Ryan Dahl Interview with Ryan Dahl, Creator of Node.js – 2017 I like the programming model of Go. Using goroutines is so easy and fun… if you’re building a server, I can’t imagine using anything other than Go. Goroutines are what make concurrency in Go simple.
Channels

// Declaring and initializing
c := make(chan int)

// Sending a value on a channel
c <- 1

// Receiving a value from a channel
x = <-c

// Data flows the way the arrow points.
Next is channels which are going to look quite similar to the last few languages we just discussed.
Go — Prime Sieve Pt. 1 2009

func Generate(ch chan<- int) {
	for i := 2; ; i++ {
		ch <- i // Send 'i' to channel 'ch'.
	}
}

func Filter(src <-chan int, dst chan<- int, prime int) {
	for i := range src { // Loop over values received
		if i%prime != 0 {
			dst <- i // Send 'i' to channel 'dst'.
		}
	}
}
Go — Prime Sieve Pt. 2 2009 – https://play.golang.org/p/s88CRWdCrOz

func main() {
	src := make(chan int) // Create a new channel.
	go Generate(src)      // Launch Generate goroutine.
	for i := 0; i < 100; i++ { // Find 100 primes
		prime := <-src
		println(prime)
		dst := make(chan int)
		go Filter(src, dst, prime)
		src = dst
	}
}
Go — Prime Sieve Pt. 2 2009

func main() {
	src := make(chan int) // Create a new channel.
	go Generate(src)      // Launch Generate goroutine.
	for { // Find all primes
		prime := <-src
		println(prime)
		dst := make(chan int)
		go Filter(src, dst, prime)
		src = dst
	}
}
go fmt & go fix SECTION 3.3
go fmt
// Ext returns the file name extension used by path.
func Ext(path string) string {
for i:= len( path ) - 1; i >= 0 && path[i] != '/'; i-- {
if path[ i] == '.' {
return path[i: ] } }
return ""
}

Unformatted Go Code
// Ext returns the file name extension used by path.
func Ext(path string) string {
	for i := len(path) - 1; i >= 0 && path[i] != '/'; i-- {
		if path[i] == '.' {
			return path[i:]
		}
	}
	return ""
}

Formatted Go Code
go fmt Parses Go source into syntax trees Prints syntax trees back into source code Uses the std lib support for these actions Ended all style debates in Go before they even happened
// Read implements the io.Reader interface.
func (r *Reader) Read(b []byte) (n int, err error) {
	if r.i >= int64(len(r.s)) {
		return 0, io.EOF
	}
	r.prevRune = -1
	n = copy(b, r.s[r.i:])
	r.i += int64(n)
	return
}
No fingerprints left on code It doesn’t look like generated code, but in reality all Go is machine-generated code. An unforeseen and wonderful artifact of gofmt is that there are no fingerprints on Go code. Looking through all of the source code written on our team, it’s impossible to tell whose code is whose. It’s also impossible to tell which code is generated and which was written by hand, because all of it is generated, in a way that looks and feels natural.
Not what you’d think as a language feature
Planned from the beginning
Last thing we did before open sourcing Go
It turned out to be one of the best things we did
Now other languages are doing it (C++, Rust, etc)
Russ Cox Mailing List November 2009 Once you have gofmt, it becomes very easy to insert mechanical processing between parsing and printing. So we have all the hard parts of a program manipulation tool just sitting waiting to be used.
Which takes us to …
go fix
go fix Rewrite programs that use old APIs to use newer ones Much more intelligent than regex rewriting Allows API changes to be shipped along with code changes
os.Open(a, os.O_RDONLY, 0) os.Open(a, os.O_RDONLY, 0666) os.Open(a, os.O_RDWR, 0) os.Open(a, os.O_CREAT, 0666) os.Open(a, os.O_CREAT|os.O_TRUNC, 0664) os.Open(a, os.O_CREATE, 0666) os.Open(a, os.O_CREATE|os.O_TRUNC, 0664) os.Open(a, os.O_WRONLY|os.O_CREATE|os.O_TRUNC, 0666) os.Open(a, os.O_WRONLY|os.O_CREATE|os.O_APPEND, 0666) os.Open(a, os.O_WRONLY|os.O_CREATE|os.O_EXCL, 0666) os.Open(a, os.O_SURPRISE|os.O_CREATE, 0666) _ = os.O_CREAT
Originally there was just a multi-arg os.Open, which exposed the UNIX API to our users.
Go Fix let us take this bad API and convert it to one that is dramatically better.
os.Open(a) os.Open(a) os.OpenFile(a, os.O_RDWR, 0) os.Create(a) os.Create(a) os.Create(a) os.Create(a) os.Create(a) os.OpenFile(a, os.O_WRONLY|os.O_CREATE|os.O_APPEND, 0666) os.OpenFile(a, os.O_WRONLY|os.O_CREATE|os.O_EXCL, 0666) os.OpenFile(a, os.O_SURPRISE|os.O_CREATE, 0666) _ = os.O_CREATE
Go Fix let us take an API and convert it to a better one. No regex could have made these changes; they required a deep knowledge of the internal structure. It was easy with go fix.
Later we used it to make more significant changes to the reflect package.
Russ Cox Introducing Gofix – 2011 The recent reflect changes would have been unpalatable without automated conversion, and the reflect API badly needed to be redone. Gofix gives us the ability to fix mistakes or completely rethink package APIs without worrying about the cost of converting existing code.
We’ve only scratched the surface: Go upgrades can look for patterns that are problematic and fix them. There are lots more ways for the computer to help you write and maintain your programs.
Go 1.11 (Aug 18) introduced package versioning
Go 1.13 (Aug 19) modules become default Beyond this we are looking into incorporating go fix or similar into the package release process, meaning you can release a new version of a package with a new API and a corresponding go fix to automatically upgrade users to this new API. There are a lot of details still to figure out, but the foundation is there.
The Legacy of Go SECTION FIVE
Confucius
Study the past if you would define the future.
Martin Heidegger
The great thinker is one who can hear what is greatest in the work of other “greats” and who can transform it in an original manner.
What should a modern, practical programming language look like?
Go’s Many Ancestors and Influences Alef Algol APL B BCPL C C++ CLU CSP Java Limbo Modula Newsqueak Oberon Occam Pascal Python Simula Smalltalk
Through combining and simplifying the very best ideas from the past 60 years Go simultaneously feels new and original yet also like a familiar reflection of the past.
Thank You Landing Festival – Berlin – April 4, 2019 Steve Francia Google @spf13
Marie Antoinette
There is nothing new except what has been forgotten.
Abraham Lincoln
Books serve to show a man that those original thoughts of his aren’t very new after all.
What did the DoD think Fortran lacked?
Source: Retrocomputing Stack Exchange, https://retrocomputing.stackexchange.com/questions/15988/what-did-the-dod-think-fortran-lacked (2020-08-27)
The meeting that defined the requirements of the new language took place on May 28–29, 1959. Charles Phillips prepared a memo several months later summarizing the decisions made at that meeting. Its listing of requirements is reprinted on page 201 of the ACM’s History of Programming Languages.
a. Majority of group supported maximum use of simple English language; even though some participants suggested there might be advantage from using mathematical symbolism.
b. A minority suggested we steer away from problem-oriented language because English is not a panacea as it cannot be manipulated as algebraic expressions can.
c. The need is for a programming language that is easier to use, even if somewhat less powerful.
d. We need to broaden the base of those who can state problems to computers.
e. The [Common Business Language] should not be biased by present compiler problems.
The committee did not consider FORTRAN as an alternative. According to Jean E. Sammet, who was chair (She describes herself as the “chairman”) of two of the committees that developed COBOL and served on a third, the major inspirations were FLOW-MATIC (developed by Grace Hopper and others for Remington-Rand Univac), AIMACO (developed by the Air Materiel Command based on Hopper’s work, and described by Sammet as “a minor modification of FLOW-MATIC”) and COMTRAN (Commercial Translator, which at that time existed as a manual at IBM, and had never been implemented). Sammet claims that FACT, developed at Honeywell, had far less influence on COBOL than some people believed.
The entire chapter I link to has extensive notes that Sammet took at the time from the committee that developed COBOL, and the decisions it made.
She makes the particularly interesting admission on page 221:
I felt there was a strong anti-IBM bias in this committee from me, and from some (but certainly not all) of the others. Since I was not working for IBM at the time, I can freely (although not with pride) admit that in some cases suggestions or decisions were made on the basis of doing things differently from how IBM did it. For example, we felt that the verb for loop control should not be called DO because that was how FORTRAN did it.
Sammet lists among the ideas COBOL took from FLOW-MATIC, “It used full data names rather than short symbolic names (as in FORTRAN)” e.g. SOCIAL-SECUR instead of SOCSEC, and used English words as commands. Less cosmetically, it allowed fields to be packed into a data word. She says, “Note that Fortran assumes that every number is in a single machine word.” It separated data definitions from instructions, which she says became so commonplace that it’s difficult to appreciate what a conceptual breakthrough it was.
Among the ideas she lists as coming from COMTRAN are nested data structures, expressions, and conditionals. It was controversial at that time to allow mathematical formulas and even Boolean expressions, as some committee members believed those were only needed in a few edge cases.
She also states that IAL, which developed into ALGOL, had a significant influence, by convincing the committee not to follow its example, and instead allow in its source code only characters that actually exist.
FORTRAN was, at the time(*1), lacking almost everything, from string handling to all I/O besides reading numbers from cards or tape. Heck, not even integer size was guaranteed across machines.
No real way of structuring or flow control besides GOTO — even subroutines/functions were only integrated a year before with FORTRAN II. For the most part, FORTRAN is a symbolic assembler with a focus on mathematics that made it easy to write formulas, but not much else.
But real world computing is about data handling and I/O. This is especially true for a huge organization such as the US military, represented by its bureaucratic arm, the DoD. Being able to easily write down some complex calculation is nice, but useless in an environment where it's about managing inventory, ordering supplies, calculating pay, and having all of that delivered on time.
An army is like a huge corporation, not a scientific institute and the task at hand is data processing, not number crunching.
Data Processing is vastly different from number crunching — it's a completely different world. It's about the classic "read card, process item, write output" loop, the stuff that got automated with punch cards. It is the core reason why the /360 has survived until today as a successful architecture. Its instruction set is perfectly tailored to shovel data, being built to support these principles. The fact that IBM tried to make it an all around (360 degree) architecture by including FP and even trying to make it fit for process control didn't really matter on the long run — others were much better for either.
And the DoD did need data processing to fight the wars at the time (Korea just ended and Vietnam was coming up), and a language to support writing data processing programs in a machine independent way. That's why COBOL was developed as a result of the mentioned conference.
*1 - FORTRAN did improve a lot since back then, still, it didn't really change the basic nature.
A perspective of Algol 68
Source: Marcel van der Veer, https://jmvdveer.home.xs4all.nl/en.post.perspective-of-algol-68.html (2024-07-03)
The development of Algol played an important role in establishing computer science as an academic discipline. The former Mathematisch Centrum Amsterdam, now CWI (Centrum voor Wiskunde & Informatica), was a leading institute in the design of Algol 68, as it was in later years in the development of Python. To better understand the position of Algol 68 among today's plethora of programming languages, we should consider their development. First I will give a chronological overview discussing the period 1950-1960 in which a number of programming languages evolved, then the period 1980-1990 that saw consolidation of imperative languages and later the recombination and maturation of existing ideas in the 1990s, and finally current trends. After that, the particular history of Algol 68 is discussed, and its current position. You will of course be aware that this post is an Algol 68 implementer's perspective.
In the period 1950-1960 a number of programming languages evolved, the descendants of which are still widely used. The most notable are Fortran by Backus et al., Lisp by McCarthy et al., Cobol by Hopper et al. and Algol 60 by a committee of European and American academics including Backus. Algol 60 was particularly influential in the design of later languages since it introduced nested block structure, lexical scope, and a syntax in Backus-Naur form (BNF). Nearly all subsequent programming languages have used a variant of BNF to describe context-free syntax.
At the time of the development of Algol 68, programming languages were required to serve two purposes. They should provide concepts and statements allowing a precise formal description of computing processes and facilitate communication between programmers, and they should provide a tool to solve small to medium-sized problems without specialist help. The context of Algol 68's development is perhaps adequately illustrated by a quote [1] from Edsger Dijkstra: The intrinsic difficulty of the programming task has never been refuted … I vividly remember from the late 60s the tendency to blame the programming languages in use and to believe in all naivety that, once the proper way of communicating with the machines had been found, all programming ills would have been cured.
The early procedural programming languages served the purposes required of them. However, the evolving need to build complex interactive systems asked for decomposition of a problem into "natural" components, resulting in object oriented programming languages starting as early as the 1960s. The object oriented and procedural paradigms each have strengths and weaknesses and it is not always clear which paradigm is best suited to certain problems. In numerical and scientific computing for instance, the benefit of object oriented languages over procedural languages is controversial since in heavy number crunching, efficiency is a top priority. The latter is a likely reason that software packages coded in Fortran are operational today, despite the rise of for instance Python in this field.
The period 1960 to 1980 produced most of the major language paradigms now in use. Algol 68 was conceived as a successor to Algol 60. Its syntax and semantics became even more orthogonal and were defined by a Van Wijngaarden grammar, a formalism designed specifically for this purpose. Simula by Nygaard and Dahl was a superset of Algol 60 supporting object oriented programming, while Smalltalk by Kay, Ingalls and Kaehler, was a newly designed object oriented language. C, the Unix system programming language, was developed by Ritchie and Thompson at Bell Laboratories between 1969 and 1973. Prolog by Colmerauer, Roussel, and Kowalski was the first logic programming language. ML by Milner built a polymorphic type system on top of Lisp, pioneering statically typed functional programming languages. Each of these languages spawned a family of descendants, and most modern languages count at least one of them in their ancestry. Other important languages that were developed in this period include Pascal, Forth, Scheme and SQL.
During the 1980s imperative languages consolidated. Although major new paradigms for imperative languages did not appear, many researchers elaborated on existing ideas from the 1970s, for example object oriented programming, adapting them to new contexts, for example to distributed systems. C++ combined object oriented programming and system programming. The United States government standardised Ada as a system programming language for defense contractors. Mainly in Japan major efforts were spent investigating so-called fifth-generation programming languages that incorporated logic programming constructs. The functional languages community standardised ML and Lisp. Research in Miranda, a functional language with lazy evaluation, began in this decade. An important trend in 1980s language design was increased focus on programming large-scale systems through the use of modules, reflected in the development of Modula, Ada and ML. Some other notable languages from the 1980s are Objective C and Perl.
In the 1990s recombination and maturation of existing ideas continued. An important motivation in this period was productivity. Many rapid application development (RAD) languages emerged, which usually were descendants of older, typically object oriented, languages that were equipped with an IDE and garbage collection. These languages included Object Pascal, Visual Basic, and Java. Java in particular received much attention. More radical and innovative were new scripting languages. These did not directly descend from other languages and featured new syntax and liberal incorporation of features. Many consider these scripting languages as more productive than RAD languages, though others will forward that scripting languages may make small programs simpler but large programs are more difficult to write and maintain. Nevertheless, scripting languages came to be the most prominent ones used in connection with the internet. Some important languages that were developed in the 1990s are Haskell, Python and PHP.
Some current trends in programming languages are mechanisms for security and reliability verification, alternative mechanisms for modularity, component-oriented software development, constructs to support concurrent and distributed programming, metaprogramming, and integration with databases. This century has so far seen the introduction of for example C#, Visual Basic.NET and Go.
Algol, ALGOrithmic Language, is a family of imperative computer programming languages which greatly influenced many other languages and became the de facto way algorithms were described in textbooks and academic works for almost three decades. The two specifications relevant to this publication are Algol 60, revised in 1963, and Algol 68, revised in 1976. Algol 58, originally known as IAL (International Algebraic Language), was an early member of the Algol family soon superseded by Algol 60. Algol 58 introduced a compound statement which was restricted to flow of control only and did not relate to lexical scope as do Algol 60's blocks.
Ideally, a programming language supports systematic expression of algorithms by offering appropriate control structures and data structures, and a precise, consistent formal definition to avoid surprises and portability issues resulting from details that are implementation dependent. Members of the Algol family (Algol 60 and Algol 68, Simula, Pascal and also Ada, …) are considered reasonable approximations of such "ideal" languages, although all of them have strong points as well as disadvantages. Algol 68 offers appropriate means of abstraction and exemplary control structures that lead to a good understanding of structured programming. Its orthogonality results in economic use of language constructs making it a beautiful tool for programming.
The design of Algol was firmly rooted in the computing community, at the time a term for the small but growing group of computer professionals and scientists. It formed an international platform for discussing programming languages, compiler construction, program construction, et cetera, and thus Algol had an important part in establishing computer science as an academic discipline in its own right. Algol 60 was designed by and for numerical mathematicians; in its day it was the Lingua Franca of computer science. The language introduced block structure with lexical scope and a concise BNF definition that were appreciated by people with a background in mathematics, but it lacked compilers and industrial support which gave the advantage to languages as Fortran and Cobol. To promote Algol, its application range had to be extended. IFIP [2] Working Group 2.1 Algorithmic Languages and Calculi (WG 2.1), that to this day has continuing responsibility for Algol 60 and Algol 68, assumed the task of developing a successor to Algol 60.
In the early 1960s WG 2.1 discussed this successor and in 1965 descriptions of a language Algol X based on these discussions were invited. This resulted in various language proposals by Wirth, Seegmüller and Van Wijngaarden [3] and other significant contributions by Hoare and Naur. Van Wijngaarden's paper Orthogonal design and description of a formal language [4] featured a new technique for language design and definition and formed the basis for what would develop into Algol 68. Many features found in Algol 68 were first proposed in Algol Bulletin by the original authors of Algol 60 like Peter Naur, by new members of WG 2.1 like Tony Hoare and Niklaus Wirth, and by many others from the world-wide computing community.
Reference [5] gives a first hand account of the events leading to Algol 68. Algol 68 has had a large influence on the development of programming languages since it addressed many issues; for example orthogonality, a strong type system, procedures as types, memory management, treatment of arrays, a rigorous description of syntax, and parallel processing, but also ideas that caused debate over the years such as context-sensitive coercions and quite complicated input-output formatting. After various, apparently lively, meetings WG 2.1 had not reached unanimous consent. Eventually, Algol 68 was produced by those members wanting a new milestone in language design. Dijkstra, Hoare, Seegmüller and others wrote a brief "Minority Report", more a letter than a report actually, outlining their view on what Algol 60's successor should have been; many years later it was commented that no programming language developed since, would have satisfied that vision. Yet others, for example Wirth and Hoare, opted to shorten the development cycle by improving Algol 60, which eventually produced Algol W and later Pascal.
Algol 60 syntax is in BNF form whereas Algol 68 syntax is described by a two-level W-grammar (W for Van Wijngaarden) that can define a context-sensitive grammar. Simply put, in a W-grammar a high-level grammar produces a low-level grammar for all correct particular programs. The concept of context-sensitive grammar was introduced by Chomsky in the 1950s to describe the syntax of natural language where a word may or may not be appropriate in a certain position, depending on context. Analogously, Algol 68 syntax defines syntactic restrictions; for example demanding that applied-identifiers or operators be declared (but not necessarily before they are applied), or demanding that modes result in finite objects that require finite coercion, et cetera. To reject incorrect programs, a context-free syntax must be complemented with extra rules formulated in natural language. The rigorous Algol 68 specification had a few loose ends, while the C specification had a few hundred.
Probably because of the formal character of the Revised Report, which takes some study to comprehend, the misconception got hold that Algol 68 is a complex language. In fact it is rather lean, and the alleged obscurity of description is toned down by most who studied it [5]. Since it is argued that the specification of many contemporary languages, including that of modern C, is more complex than that of Algol 68 [6], one can ask whether the latter just looked complex at the time, fifty years ago.
Algol 68 was defined in a formal document, first published in January 1969, and later published in Acta Informatica and also printed in Sigplan Notices. A Revised Report was issued in 1976 [7]. Algol 68 was the first major language for which a full formal definition was made before it was implemented. Though known to be terse, the Revised Report does contain humour solis sacerdotibus. To quote Koster from [5]: The strict and sober syntax permits itself small puns, as well as a liberal use of portmanteau words. Transput is input or output. "Stowed" is the word for structured or rowed. Hipping is the coercion for the hop, skip and jump. MOID is MODE or void. All metanotions ending on ETY have an empty production. Just reading aloud certain lines of the syntax, slightly raising the voice for capitalized words, conveys a feeling of heroic and pagan fun (…) Such lines cannot be read or written with a straight face.
Algol 68 was designed for programmers, not for compiler writers, in a time when the field of compiler construction was not as advanced as it is today. Implementation efforts based on formal methods generally failed; Algol 68's context-sensitive grammar required some invention to parse [8]. At the time compilers usually were made available on mainframes by computing centres, which may explain why Algol 68 was popular in locations rather than areas, for instance Amsterdam, Berlin or Cambridge. It was relatively popular in the United Kingdom, where the ALGOL68R, ALGOL68RS and ALGOL68C compilers were developed. Commercial initiatives were relatively unsuccessful; for instance the FLACC compiler sold just twenty-two copies [9].
Industry did not adopt Algol 68, with a few exceptions. There may be as many opinions on this as there are experts you might ask. IBM would not support it (though CDC did, on demand of academia), it was not as fast as Fortran (but might have been), there was no community outside academia, implementations were lagging … All too often, the timing must be right, the environment must be open to acceptance and adequate marketing should be in place, for an innovation to have a chance. But this is my two cents.
Once university mainframes were phased out, the compilers on them were decommissioned. For scientific work, many migrated in the 1990's from mainframes to workstations that did not have Algol 68. In fact, me leaving academia in 1991 led me to write my own Algol 68 implementation so I could continue programming in it.
Algol 68 apparently influenced many who were exposed to it, and it may still do so. A few individuals wonder, even today, whether it would have a future. In my opinion, the world has moved on. Although the language was innovative in its days, development stopped so it reflects the era it was developed in. Probably there is a newer language like Python that better suits your modern application on modern infrastructure, and has a large community. Algol 68 will not catch up anymore and further development would probably convert it into an anachronism. On the other hand, Algol 68 appears very familiar to young people, which is a testament to the influence it had on how we think about programming languages.
Therefore the Algol 68 Genie project preserves Algol 68 out of educational as well as scientific-historical interest. Those interested in the language may study it and actually use it to understand the language and the influence it had, and experience what computer science and computing was like in its pioneering years, half a century ago. Considering the reactions to the project, I think it is fair to say that it succeeds in this purpose. Of course, people also use it for programming, for instance in numerical mathematics as I like to do. Despite its age, even today Algol 68 is a beautiful means for writing algorithms.
Notes
[1] Transcript from keynote delivered at the ACM 1984 South Central Regional Conference. Source: E. W. Dijkstra Archive, the manuscripts of Edsger W. Dijkstra.
[2] IFIP, the International Federation for Information Processing is an umbrella organisation for national information processing organisations. It was established in 1960 under the auspices of UNESCO.
[3] Adriaan van Wijngaarden (1916-1987) is considered by many to be the founding father of computer science in the Netherlands. He was co-founder of IFIP and one of the designers of Algol 60 and later Algol 68. As leader of the Algol 68 committee, he made a profound contribution to the field of programming language design, definition and description.
[4] A. van Wijngaarden, Orthogonal design and description of a formal language. Available online from CWI.
[5] Proceedings, Conference on the history of Algol 68 [1993]. Available online from CWI.
[6] K. Henney. Procedural Programming: Itâs Back? It Never Went Away. Available for viewing on YouTube.
[7] A. van Wijngaarden et al., Revised report on the algorithmic language Algol 68 [1976]. Available online from CWI, or in HTML as a previous blog post, or as part of Learning Algol 68 Genie .
[8] B. Mailloux, On the implementation of Algol 68 [1968]. Available online from CWI. Algol 68 Genie employs a multi-pass scheme to parse Algol 68. Extracting declarations before parsing as proposed by Mailloux, renders the W-grammar LALR.
[9] Source: Chris Thomson, formerly with Chion Corporation, on comp.lang.misc [1988].
Programming History: The Influence of Algol on Modern Programming Languages (Part 1)
Source: Mike McMillan, Medium, https://betterprogramming.pub/programming-history-the-influence-of-algol-on-modern-programming-languages-81a90107946a (2020-04-23)
Many of the features of currently popular programming languages, such as C, C++, Java, and JavaScript, were first implemented in the Algol programming language. Ask a programmer using a language such as C++, Java, or JavaScript which language influenced their current programming language and they will probably say C, since the syntax for that language is mostly the same syntax C uses. But not many programmers know what languages influenced the development of C.
The language that had the most influence on C, through a couple of intermediate steps, is Algol. In this and my next article, I’m going to discuss many features of the programming languages we use today and how and why they were first implemented in Algol.
If you’re interested in learning more about this history, you should consult the book I used in my research, “History of Programming Languages,” edited by Richard L. Wexelblat. This book compiles the sessions from the SIGPLAN History of Programming Languages conference. I used the chapters from the Fortran session and the Algol session in my research.
A Brief History of…
Source: Mike McMillan, "Programming History: The Influence of Algol on Modern Programming Languages (Part 2)", Medium, 2020-05-05. https://medium.com/swlh/programming-history-the-influence-of-algol-on-modern-programming-languages-part-2-49ae6d30c8c0
In an earlier article I presented a brief history of high-level, algebraic programming languages starting with Fortran 58 and 60, leading up to the initial development of the Algol language in the years 1958 through 1960. In this article I’m going to discuss several of the technical features of Algol that influenced subsequent high-level programming languages such as BCPL and ultimately C.
Data Types
High-level programming languages before Algol did not have data types. Variables simply sprang into existence when they were assigned a value, either an integer or a real number. Fortran did have a bit of data typing in that integer variables were to be named starting with the letters I, J, K, L, M, or N; any other variable was a real variable.
Algol began the practice of declaring the type of a variable as part of its declaration. This allowed the compiler to generate more efficient code, since it knew what type of number a variable would hold before any value was assigned to it.
However, as I discovered reading through a history of Algol, there wasn't a real theory of data types built into the language until Algol 68.
Here is an example of how data types are used in Algol 60. This program fragment reads some values from a data card(!):
Source: "Timeline of Computer History" (Software & Languages), Computer History Museum. https://www.computerhistory.org/timeline/software-languages/
Konrad Zuse begins work on Plankalkül (Plan Calculus), the first algorithmic programming language, with the goal of creating the theoretical preconditions for the solution of general problems. Seven years earlier, Zuse had developed and built the world's first binary digital computer, the Z1. He completed the first fully functional program-controlled electromechanical digital computer, the Z3, in 1941. Only the Z4, the most sophisticated of his creations, survived World War II.
John Backus completes Speedcode for IBM's first large-scale scientific computer, the IBM 701. Although using Speedcode demanded a significant amount of scarce memory, it greatly reduced the time required to write a program. In 1957, Backus became project leader of the IBM FORTRAN project, which became the most popular scientific programming language in history and is still in use today.
An IBM team led by John Backus develops FORTRAN, a powerful scientific computing language that uses English-like statements. Some programmers were skeptical that FORTRAN could be as efficient as hand coding, but that sentiment disappeared when FORTRAN proved it could generate efficient code. Over the ensuing decades, FORTRAN became the most often used language for scientific and technical computing. FORTRAN is still in use today.
SRI International designs ERMA (Electronic Recording Machine, Accounting), for Bank of America. At the time, accounts were posted manually, a method that would quickly be outstripped by the growth in check writing after World War II. The ERMA project digitized checking by creating a computer-readable font. A special scanner read account numbers preprinted on checks using magnetic ink character recognition. In just one hour, ERMA could process the number of accounts that would have taken a well-trained banker nearly 17 workdays to complete.
A team drawn from several computer manufacturers and the Pentagon develop COBOL, an acronym for Common Business-Oriented Language. Many of its specifications borrow heavily from the earlier FLOW-MATIC language. Designed for business use, early COBOL efforts aimed for easy readability of computer programs and as much machine independence as possible. Designers hoped a COBOL program would run on any computer for which a compiler existed with only minimal modifications.
Howard Bromberg, an impatient member of the committee in charge of creating COBOL, had this tombstone made out of fear that the language had no future. However, COBOL survives to this day. A study in 1997 estimated that over 200 billion lines of COBOL code was still in existence, accounting for 80% of all business software code.
The increasing number of users needing access to computers in the early 1960s leads to experiments in timesharing computer systems. Timesharing systems can support many users, sometimes hundreds, by sharing the computer with each user. CTSS was developed by the MIT Computation Center under the direction of Fernando Corbató and was based on a modified IBM 7090, then later 7094, mainframe computer. Programs created for CTSS included RUNOFF, an early text formatting utility, and an early inter-user messaging system that presaged email. CTSS operated until 1973.
Kenneth Iverson's book A Programming Language details a form of mathematical notation that he had developed in the late 1950s while an assistant professor at Harvard University. IBM hired Iverson and it was there that APL evolved into a practical programming language. APL was widely used in scientific, financial, and especially actuarial applications. Powerful functions and operators in APL are expressed with special characters, resulting in very concise programs.
ASCII, the American Standard Code for Information Interchange, permits machines from different manufacturers to exchange data. The ASCII code consisted of 128 unique strings of ones and zeros. Each sequence represented a letter of the English alphabet, an Arabic numeral, an assortment of punctuation marks and symbols, or a function such as a carriage return. Even the later 8-bit extensions of ASCII can only represent up to 256 symbols, and for this reason many other languages are better supported by Unicode, which has the ability to represent over 100,000 symbols.
Ivan Sutherland publishes Sketchpad, an interactive, real-time computer drawing system, as his MIT doctoral thesis. Using a light pen and Sketchpad, a designer could draw and manipulate geometric figures on a computer screen. Blossoming into the best known of the early drawing applications, Sketchpad influenced a generation of design and drafting programs. Although used mostly for engineering drawings, it had some artistic applications, including a famous drawing of Nefertiti that could be animated to a limited extent.
Thomas Kurtz and John Kemeny create BASIC (Beginner's All-purpose Symbolic Instruction Code), an easy-to-learn programming language, for their students at Dartmouth College who had no prior programming experience. Its use spread widely to schools all over the world. Over a decade later, most early personal computers were shipped with a version of BASIC embedded in their system, which opened up programming to an entirely new audience.
Seymour Papert designs LOGO as a computer language for children. Initially a drawing program, LOGO controlled the actions of a mechanical "turtle," which traced its path with pen on paper. Electronic turtles made their designs on a video display monitor.
Papert emphasized creative exploration over memorization of facts: "People give lip service to learning to learn, but if you look at curriculum in schools, most of it is about dates, fractions, and science facts; very little of it is about learning. I like to think of learning as an expertise that every one of us can acquire."
Edsger Dijkstra's "GO TO considered harmful" letter is published in Communications of the ACM, firing the first salvo in the structured programming wars. He called for abolishing the unrestricted GOTO statements used in higher-level languages, and argued that they complicated programming. The ACM considered the resulting acrimony sufficiently harmful that it established a policy of no longer printing articles taking such an assertive position against a coding practice.
CICS (Customer Information Control System), an IBM transaction processing system, is released. Before CICS was introduced, many industries used punched card batch processing for high-volume customer transactions. As it allowed online transaction processing, CICS was able to replace this method and greatly sped up the way that companies interacted with their customers.
It was first used in the public utility industry for access to customer information and transactions, but soon after its release it was quickly adopted by a wide spectrum of industries including banking, oil, insurance and even smaller companies. Although it was originally intended to only last a few years, CICS is still in use today.
AT&T Bell Labs programmers Kenneth Thompson and Dennis Ritchie develop the UNIX operating system on a spare DEC minicomputer. UNIX combined many of the timesharing and file management features offered by Multics, from which it took its name. (Multics, a project of the mid-1960s, represented one of the earliest efforts at creating a multi-user, multi-tasking operating system.) The UNIX operating system quickly secured a wide following, particularly among engineers and scientists, and today is the basis of much of our world's computing infrastructure.
The RS-232-C standard for communications is adopted by the Electronic Industries Association. The standard permits computers and peripheral devices to transmit information serially, that is, one bit at a time. RS-232-C compatible ports were widely used for equipment like printers and modems. Compared to more modern interfaces, serial connections had slow transmission speeds, were bulky and have been largely replaced by USB ports on new PCs and peripheral equipment.
The Pascal programming language, named after Blaise Pascal, a French physicist, mathematician and inventor turned philosopher, is introduced by Professor Niklaus Wirth. His aim with Pascal was to develop a programming language applicable to both commercial and scientific applications, and which could also be used to teach programming techniques to college students. It was closely based on ALGOL 60, which Wirth had also helped to develop.
Gary Kildall develops the first commercially successful operating system for microcomputers, CP/M. He and his wife established Intergalactic Digital Research (modestly dropping "Intergalactic" later) to market it. CP/M made it possible for one version of a program to run on a variety of computers built around eight-bit microprocessors. At one point Digital Research and Microsoft were approached by IBM about providing an operating system for its PC. Microsoft won the competition with its own operating system, called MS-DOS.
Rob Barnaby creates WordStar while at MicroPro International. Among the first popular word processing systems for personal computers, WordStar originally ran on the CP/M operating system, and later on DOS and Windows. In 1981, it had significant market share, in part because it came bundled with the Osborne 1 computer. WordStar retained a loyal following well after Microsoft Word surpassed it in sales.
Microsoft announces Word, originally called Multi-Tool Word. In a marketing blitz, Microsoft distributed 450,000 disks containing a demonstration version of its Word program in the November issue of PC World magazine, giving readers a chance to try the program for free. It competed with WordPerfect for market share as a word processing program, and it was not until Microsoft Word for Windows was introduced in 1989 that it became a global standard.
Richard Stallman, a programmer at MIT's Artificial Intelligence Lab, set out to develop a free alternative to the popular Unix operating system. This operating system, called GNU (for Gnu's Not Unix), was going to be free of charge but also allow users the freedom to change and share it. Stallman founded the Free Software Foundation (FSF) based on this philosophy in 1985.
While the GNU work did not immediately result in a full operating system, it provided the necessary tools for creating another Unix-type system known as Linux. The software developed as part of the GNU project continues to form a large part of Linux, which is why the FSF asks for it to be called GNU/Linux.
Matlab (Matrix Laboratory), a high-level programming language, is released. It was designed by Professor Cleve Moler of the University of New Mexico and was initially intended to help students use mathematical software libraries without requiring knowledge of the scientific programming language FORTRAN. Its roots began in the academic community, but it spread quickly to many other areas of technical computing and is widely used today.
Phil Moorby and Prabhu Goel of Gateway Design Automation create Verilog, a hardware description language that is used in the design of digital circuitry. Initially designed for Gateway's Verilog XL Design Logic Simulator, it was a vast improvement over methods being used by circuit designers at the time.
Gateway Design Automation was acquired in 1989 by Cadence Design, which released the Verilog Hardware Description Language (HDL) into the public domain the following year. Verilog is now one of two hardware description languages used in the world today to design complex digital systems.
The C++ programming language emerges as the dominant object-oriented language in the computer industry when Bjarne Stroustrup publishes the book The C++ Programming Language. Stroustrup, from AT&T Bell Labs, said his motivation stemmed from a desire to create a language that would allow for more complex programs and which combined the low-level features of BCPL with the high-level structures of Simula.
According to Stroustrup: "C++ is a general purpose programming language designed to make programming more enjoyable for the serious programmer."
Aldus announces its PageMaker program for use on Macintosh computers, launching the desktop publishing revolution. Two years later, Aldus released a version for the IBM PC. Developed by Paul Brainerd, PageMaker allowed users to combine graphics and text easily into professional quality documents.
Pagemaker was one of three components to the desktop publishing revolution. The other two were the invention of Postscript by Adobe and the LaserWriter laser printer from Apple. All three were necessary to create a desktop publishing environment.
Perl (Practical Extraction and Report Language) is written by Larry Wall. It was intended to facilitate report processing and could scan and extract information from text files and ultimately create reports generated from that information. It was designed for ease of use and quick programming and has found multiple applications in every branch of computing. It is very useful in making other programs work together and has been called "the duct tape of the Internet."
Apple engineer William Atkinson designs HyperCard, a software tool that simplifies development of in-house applications. In HyperCard, programmers built "stacks" of information with the concept of hypertext links between stacks of pages. As a stack author, a programmer employed various tools to create his own stacks, linked together as a sort of slide show. Apple distributed the program free with Macintosh computers until 1992. HyperCard influenced the creation of the Internet protocol HTTP and JavaScript.
Microsoft ships Windows 3.0. Compatible with DOS programs, the first successful version of Windows finally offered good enough performance to satisfy PC users. For the new version, Microsoft updated the interface and created a design that allowed PCs to support large graphical applications for the first time. It also allowed multiple programs to run simultaneously on its Intel 80386 microprocessor. Microsoft lined up a number of other applications ahead of time that ran under Windows 3.0, including versions of Microsoft Word and Microsoft Excel. As a result, PC users were exposed to the user-friendly concepts of the Apple Macintosh, making the IBM PC more popular.
Photoshop is released. Created by brothers John and Thomas Knoll, Photoshop was an image editing program and the most popular software program published by Adobe Systems. Thomas, while earning a PhD at the University of Michigan, had created an early version of the program in 1987, and John saw a practical use for it as a special effects staff member at Industrial Light & Magic. It was then used for image editing in the "pseudopod" scene in the movie The Abyss. When Adobe saw potential in the project they bought a license for distribution in 1989 and released the product on February 19, 1990.
Designed by Finnish university student Linus Torvalds, the Linux kernel is released to several Usenet newsgroups. Almost immediately, enthusiasts began developing and improving it, such as adding support for peripherals and improving its stability. In February 1992, Linux became free software or, as its developers preferred to say after 1998, "open source." Linux also incorporated some elements of the GNU operating system and is used today in devices ranging from smartphones to supercomputers.
Pretty Good Privacy, or PGP, a public-key encryption program, is introduced and is used for securing texts, emails and files. Its inventor, software engineer Phil Zimmermann, created it as a tool for people to protect themselves from intrusive governments, businesses, and institutions around the world. Zimmermann posted PGP on the Internet in 1991 where it was available as a free download. The United States government, concerned about the strength of PGP, which rivaled some of the best secret codes in use at the time, prosecuted Zimmermann but dropped its investigation in 1996.
FreeBSD, a complete Unix-like operating system, is launched. It was the most widely used open-source BSD (Berkeley Software Distribution) variant. After its initial release, the software was significantly re-engineered due to a lawsuit between Unix copyright holder Unix Systems Laboratories and the University of California, Berkeley. The lawsuit revolved around source code in Berkeley's 4.3BSD-Lite which was the basis of the FreeBSD operating system. FreeBSD incorporated features including networking, storage, security, portability and Linux compatibility.
Microsoft Windows NT is released. Work on the project began in the late 1980s in an effort spearheaded by a group of former Digital Equipment Corporation employees led by Dave Cutler. It was the first truly 32-bit version of Windows from Microsoft, which made it appealing to high-end engineering and scientific users that required better performance. A number of subsequent versions of Windows were based on NT technology.
On August 24th, 1995, Microsoft's Windows 95 operating system was launched with one of the most sweeping media campaigns in the history of computing. A $300 million promotional blitz included television commercials using the Rolling Stones' song "Start Me Up," as well as a 30-minute promotional video starring actors from the top-rated television comedy "Friends," made it one of the most anticipated product launches in American business history.
Windows 95 was a major improvement over the previous Windows 3.1, adding features such as "Plug and Play" for more easily installing new peripherals, as well as the Taskbar and Start button menu. More than one million units were sold in the first four days following the release, and within two years, Windows 95 was installed on more computers than any other operating system.
During the late 1990s, the impending Year 2000 (Y2K) bug fuels news reports that the onset of the year 2000 will cripple telecommunications, the financial sector and other vital infrastructure. The issue was rooted in the fact that date stamps in most previously written software used only two digits to represent year information. This meant that some computers might not be able to distinguish the year 1900 from the year 2000. Although there were some minor glitches on New Year's Day in 2000, no major problems occurred, in part due to a massive effort by business, government and industry to repair their code beforehand.
Mac OS X is released. It was a significant departure from the classic Mac OS as it was based on the Unix-like operating systems FreeBSD, NetBSD and NeXTSTEP/OpenStep. OS X introduced a more stable and reliable platform and multiple applications could more efficiently be run at the same time. The Mac OS X 10.6 ("Snow Leopard") update, released in 2009, completed the Mac's transition from a 32-bit to a 64-bit operating system.
The Windows XP operating system is released. Based on the Windows NT kernel, XP was considered more stable than previous versions of the operating system. XP was widely adopted by industry and persisted much longer than Microsoft planned. For example, in 2014, 95% of the world's automated teller machines ran XP. Microsoft support for XP ended on April 8, 2014.
Apple's iTunes is released. It was based on Bill Kincaid's SoundJam MP software, the rights to which Apple purchased. Initially, iTunes was only supported on the Mac operating system and functioned as a media player and media management tool. iTunes allowed users to record music from CDs, bring it into iTunes, mix it with other songs and then burn a custom CD. When the Apple iTunes music store was launched in 2003, it transformed music distribution and the entire music industry. Less than a week after its launch, over one million songs were downloaded. By 2013, over 25 billion songs had been downloaded from the iTunes store.
Hadoop is an open source software project built on techniques Google developed for extracting search results from large amounts of unstructured data, such as data found on the web. It was used by many large corporations where networked scalability, cost effectiveness and fault tolerance were critical to their business models. Companies such as Yahoo, American Airlines, IBM and Twitter all used Hadoop, and it could be scaled from a single server to thousands. With Hadoop different types of data could be seamlessly integrated and Hadoop could redirect work to another system if a node failed in the cluster.
Scratch is released to the public. A free programming language that focused on education, it was designed by a team led by Mitchel Resnick at the MIT Media Lab Lifelong Kindergarten Group. Intended to be used by educators, students and parents as a teaching language, it had a number of applications in educational settings. These included math, computer science, language arts and social studies. Its interface allowed novice users to stack and organize block commands to write programs. Scratch has millions of users worldwide and is available in more than 40 languages.
The Stuxnet virus is widely reported in the media due to attacks centered in Iran. The virus attempted to damage uranium enrichment centrifuges used in Iran’s nuclear development program by causing damaging speed variations. Although it was recognized that some centrifuges were rendered inoperable by the virus, the full extent of the damage remained unknown. Stuxnet brought attention to the fragile nature of global infrastructure in a networked world.
Adobe Creative Cloud is announced as a subscription and cloud-based model of distribution for its major software products. Adobe Acrobat, Illustrator, Dreamweaver, Photoshop, and others could be subscribed to either as a complete package or individually to suit user needs. This model also allowed Adobe to begin releasing continuous updates to their products, shortening the development cycle and the time needed to incorporate new features.
Instagram, an image-sharing and social networking application, is purchased by Facebook for nearly $1 billion. It was initially launched in October 2010 by founders Kevin Systrom and Mike Krieger and became an instant hit, with over 100 million active users by early 2013. Photos and videos (with a 15 second maximum length) could be shared among users, who could then annotate these images with specific hash tags to enable them to be easily shared among other social media platforms such as Twitter and Facebook. Instagram also allowed users to manipulate their photos with a variety of digital filters such as “Slumber,” “Kelvin,” “1977,” “Sierra,” and “Inkwell.”
An updated Microsoft Office 365 is announced. It was a subscription-based software product. Microsoft’s Word, Excel, OneNote, PowerPoint, Outlook, Access, and Publisher were all available in packages for a monthly or annual subscription. Also included with a subscription was 1 TB of cloud storage on Microsoft’s OneDrive (formerly SkyDrive). Home, personal, university, business, and enterprise subscription plans were made available for a wide range of users. Microsoft’s change to a subscription model was not unique: Apple, Adobe, IBM and many other large software and technology companies adopted this model as well.
The Apple Pay mobile payment system is introduced into Apple’s product ecosystem. Initially only available for the iPhone 6 and 6 Plus, Apple Watch, iPad Air 2, and iPad Mini 3, Apple Pay saw participation from many major banks and credit card companies. The device’s near field communications (NFC) interface, Passbook app, and Apple’s Touch ID system worked in tandem with point-of-sale systems in retail outlets to complete transactions. Apple Pay could also be used for online purchases.
HTML 5 is announced as the successor to HTML 4, which had become the standard for web markup languages in 1997. Markup languages describe how web pages will look and function. Work on HTML 5 had begun in 2004 under the auspices of the Web Hypertext Application Technology Working Group. It was simplified compared to its predecessors and was intended to be human-readable. HTML 5 also offered a number of improvements for multimedia, such as simplifying the embedding of content such as streaming video and games into web pages.
The Heartbleed bug is uncovered as a dangerous security flaw in the code base of the OpenSSL cryptographic software library. OpenSSL protected a significant portion of the world’s web servers, and nearly 20% of them were found to be vulnerable to attack from this particular security bug, which allowed hackers to eavesdrop on the communications of unsuspecting victims and steal sensitive information such as user names and passwords, emails, instant messages, and even confidential files and documents. Although it was a dangerous and widespread bug, installation of the “Fixed OpenSSL” library by service providers and users greatly reduced its effectiveness.
|
||||
4768
|
dbpedia
|
1
| 30
|
https://thehistoryofcomputing.net/website/algol
|
en
|
The History of Computing: ALGOL
|
http://assets.libsyn.com/content/73786964?height=250&width=250&overlay=true
|
http://assets.libsyn.com/content/73786964?height=250&width=250&overlay=true
|
[
"https://ssl-static.libsyn.com/p/assets/platform/websuite/deezer-logo.png",
"https://ssl-static.libsyn.com/p/assets/platform/websuite/stitcher.png",
"https://ssl-static.libsyn.com/p/assets/platform/websuite/radio-public.png",
"https://assets.libsyn.com/secure/show/197558/become-a-patron-button-1.png"
] |
[] |
[] |
[
""
] | null |
[] | null |
|
en
|
https://assets.libsyn.com/secure/content/47114459?height=16&width=16
|
https://thehistoryofcomputing.net/website/algol
|
Today we’re going to cover a computer programming language many might not have heard of, ALGOL.
ALGOL was written in 1958. It wasn’t like many of the other languages in that it was built by committee. The Association for Computing Machinery and the German Society of Applied Mathematics and Mechanics were floating around ideas for a universal computer programming language.
Members from the ACM were a who’s who of people influential in the transition from custom computers that were the size of small homes to mainframes. John Backus of IBM had written a programming language called Speedcoding and then Fortran. Joseph Wegstein had been involved in the development of COBOL. Alan Perlis had been involved in Whirlwind and was with the Carnegie Institute of Technology. Charles Katz had worked with Grace Hopper on UNIVAC and FLOW-MATIC.
The Germans were equally influential. Friedrich Bauer had brought us the stack method while at the Technical University of Munich. Hermann Bottenbruch from the Institute for Applied Mathematics had written a paper on constructing languages. Klaus Samelson had worked on a computer called PERM that was similar to the MIT Whirlwind project. He’d come into computing while studying eigenvalues.
Heinz Rutishauser had written a number of papers on programming techniques and had codeveloped the language Superplan while at the Swiss Federal Institute of Technology. This is where the meeting would be hosted.
They met from May 27th to June 2nd in 1958 and initially called the language they would develop IAL, or the International Algebraic Language, but would later expand the name to ALGOL, short for Algorithmic Language. They brought us code blocks, the concept that a pair of words or symbols, like begin and end, opens and closes a stanza of code. They introduced nested scoped functions. They wrote the whole language right there. You would declare a variable by simply saying integer, or set one with a := 1. You would set up a for and define the steps to perform until a limit, the root of what we would now call a for loop. You could read a variable in from a punch card. It had built-in sine and cosine. It was line based and fairly simple procedural programming by today’s standards. They defined how to handle special characters, and built boolean operators and floating point notation. It even had portable types.
By the end of 1958 they had a compiler that would run on the Z22 computer from Konrad Zuse. While it was some of Backus’ best work, it effectively competed with FORTRAN and never really gained traction at IBM. But it influenced almost everything that happened afterwards.
Languages were popping up all over the place, and in order to bring in more programmers, people wanted a formalized way to allow languages to flourish, but with a standardized notation system so algorithms could be published and shared and developers could follow along with the logic. One outcome of the ALGOL project was the Backus–Naur form, the first such standardization. It would be expanded by the Danish computer scientist Peter Naur for ALGOL 60, thus the name.
For ALGOL 60 they would meet in Paris, also adding John McCarthy, Julien Green, Bernard Vauquois, Adriaan van Wijngaarden, and Michael Woodger. The language got refined, yet a bit more complicated. FORTRAN and COBOL use continued to rage on, but academics loved ALGOL.
And the original implementation, now referred to as the ZMMD implementation, gave way to X1 ALGOL, Case ALGOL, ZAM in Poland, GOGOL, VALGOL, RegneCentralen ALGOL, Whetstone ALGOL for physics, Chinese ALGOL, ALGAMS, NU ALGOL out of Norway, ALGEK out of Russia, Dartmouth ALGOL, DG/L, USS 90 Algol, Elliot ALGOL, the ALGOL Translator, Kidsgrove Algol, JOVIAL, Burroughs ALGOL, Niklaus Wirth’s ALGOL W (which led to Pascal), and MALGOL; the last would be S-algol in 1979.
But it got overly complicated and overly formal. Individual developers wanted more flexibility here and there. Some wanted simpler languages. Some needed more complicated languages. ALGOL didn’t disappear as much as it evolved into other languages. Those were coming out fast and with a committee to approve changes to ALGOL, they were much slower to iterate.
You see, ALGOL profoundly shaped how we think of programming languages. That formalization was critical to paving the way for generations of developers who brought us future languages. ALGOL would end up being the parent of CPL and through CPL, BCPL, C, C++, and through that Objective-C. From ALGOL also sprang Simula and through Simula, Smalltalk. And Pascal and from there, Modula and Delphi. It was only used for a few years but it spawned so much of what developers use to build software today.
In fact, other languages evolved as anti-ALGOL derivatives, looking at how it did something and deciding to do it totally differently.
And so we owe this crew our thanks. They helped to legitimize a new discipline, a new career: computer programmer. They inspired. They coded. And in so doing, they helped bring us into the world of structured programming and set structures that allowed the next generation of great thinkers to go even further, directly influencing people like Adele Goldberg and Alan Kay.
And it’s okay that the name of this massive contribution is mostly lost to the annals of history. Because ultimately, the impact is not. So think about this - what can we do to help shape the world we live in? Whether it be through raw creation, iteration, standardization, or formalization - we all have a role to play in this world. I look forward to hearing more about yours as it evolves!
|
|||
4768
|
dbpedia
|
3
| 29
|
https://craftofcoding.wordpress.com/tag/algol/
|
en
|
Algol – The Craft of Coding
|
[
"https://craftofcoding.wordpress.com/wp-content/uploads/2021/03/algorithm35.png?w=572",
"https://craftofcoding.wordpress.com/wp-content/uploads/2021/02/wirthlanguages1-1.png?w=599",
"https://craftofcoding.wordpress.com/wp-content/uploads/2021/02/wirthlanguages2.png?w=620",
"https://s2.wp.com/i/logo/wpcom-gray-white.png",
"https://s2.wp.com/i/logo/wpcom-gray-white.png",
"https://pixel.wp.com/b.gif?v=noscript"
] |
[] |
[] |
[
""
] | null |
[] |
2021-04-06T20:03:37+00:00
|
Posts about Algol written by spqr
|
en
|
https://s1.wp.com/i/favicon.ico
|
The Craft of Coding
|
https://craftofcoding.wordpress.com/tag/algol/
|
The Sieve of Eratosthenes likely first appeared in computing in 1961 as Algorithm 35 in Communications of the ACM (p.151). It, like many other algorithms of the time, was shown in Algol-60.
To try and implement it as close as possible to the original syntax, given that there are few if any Algol-60 compilers out there, I chose to reproduce the algorithm in its entirety in Algol-68 using the Algol-68 Genie (a68g) specification. It wasn’t a lot of fun.
PROC sieve = (INT nmax, REF INT j) []INT:
( [nmax]INT primes;
  INT i;
  primes[1] := 1;
  primes[2] := 2;
  primes[3] := 3;
  j := 3;
  FOR n FROM 3 BY 2 TO nmax DO
    i := 3;
    l1: IF primes[i] <= sqrt(n) THEN GOTO a1 ELSE GOTO a2 FI;
    a1: IF n/primes[i] = n%primes[i] THEN GOTO b1 ELSE GOTO b2 FI;
    b2: i := i + 1;
        GOTO l1;
    a2: primes[j] := n;
        j := j + 1;
    b1: print("")
  OD;
  j := j - 1;
  primes
);

main:
( INT n, numpr;
  print("Upper limit to calculate sieve? ");
  read(n);
  [n]INT primes;
  primes := sieve(n,numpr);
  FOR i FROM 1 TO numpr DO
    print(whole(primes[i],5));
    IF i MOD 10 = 0 THEN print(new line) FI
  OD
)
It was an interesting experience, partially because the documentation for a68g is okay, but it is filled with the same obtuse stuff as the original reports for Algol-68. There is a lot to this language, and trying to find simple things, like how to pass back a variable from a subprogram, is trying. There are things I inherently like, like the ability to specify the size of the array to be created within the function (proc seems like a procedure, but behaves more like a function). And I like that assignment is :=, and equality is =. It’s also nice to use a language where it is easy to return an array from a function. Here is the program executed with the value of n set to 1000.
Upper limit to calculate sieve? 1000 +1 +2 +3 +5 +7 +11 +13 +17 +19 +23 +29 +31 +37 +41 +43 +47 +53 +59 +61 +67 +71 +73 +79 +83 +89 +97 +101 +103 +107 +109 +113 +127 +131 +137 +139 +149 +151 +157 +163 +167 +173 +179 +181 +191 +193 +197 +199 +211 +223 +227 +229 +233 +239 +241 +251 +257 +263 +269 +271 +277 +281 +283 +293 +307 +311 +313 +317 +331 +337 +347 +349 +353 +359 +367 +373 +379 +383 +389 +397 +401 +409 +419 +421 +431 +433 +439 +443 +449 +457 +461 +463 +467 +479 +487 +491 +499 +503 +509 +521 +523 +541 +547 +557 +563 +569 +571 +577 +587 +593 +599 +601 +607 +613 +617 +619 +631 +641 +643 +647 +653 +659 +661 +673 +677 +683 +691 +701 +709 +719 +727 +733 +739 +743 +751 +757 +761 +769 +773 +787 +797 +809 +811 +821 +823 +827 +829 +839 +853 +857 +859 +863 +877 +881 +883 +887 +907 +911 +919 +929 +937 +941 +947 +953 +967 +971 +977 +983 +991 +997
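For readers without an Algol 68 interpreter handy, here is a rough C translation of the same trial-division algorithm (my sketch, not part of the original post). Like the Algol version, it lists 1 ahead of the primes, a quirk kept from the 1961 algorithm; it also replaces the original's `p <= sqrt(n)` test with the equivalent integer test `p*p <= n`.

```c
/* Trial-division "sieve" in the spirit of Algorithm 35.
   Assumes nmax >= 3 and primes[] is large enough.
   Fills primes[] with 1, 2, 3 and every odd prime <= nmax,
   returning how many values were stored. */
int sieve(int nmax, int primes[])
{
    int count = 3;
    primes[0] = 1;
    primes[1] = 2;
    primes[2] = 3;
    for (int n = 5; n <= nmax; n += 2) {  /* odd candidates only */
        int is_prime = 1;
        for (int i = 2; primes[i] * primes[i] <= n; i++) {
            if (n % primes[i] == 0) {     /* divides evenly: composite */
                is_prime = 0;
                break;
            }
        }
        if (is_prime)
            primes[count++] = n;
    }
    return count;
}
```

Called as `sieve(1000, primes)` on a sufficiently large array, it yields the same list printed above.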
Language design in the 1960s was dominated by attempts to improve upon Algol 60. Hoare noted in 1973 [1] that ALGOL-60 was “… a language so far ahead of its time, that it was not only an improvement on its predecessors, but also on nearly all its successors”. Niklaus Wirth was a graduate student at the University of California, Berkeley when he started playing around with languages. He joined a research group which was engaged with the implementation and improvement of a dialect of ALGOL-58, NELIAC. He described the compiler as “an intricate mess“, and the process one of “1% science, and 99% sorcery” [1].
The first language Wirth designed leading to his dissertation was Euler, as Wirth himself put it “a trip with the bush knife through the jungle of language features and facilities” [1]. Euler had academic elegance, but no real practical value, however it did catch the attention of the IFIP Working Group, engaged in designing the successor to ALGOL-60. There seemed to be two camps here, one which wanted to push the boundaries of language design and another which wanted to extend ALGOL-60. Wirth belonged to the latter group. In 1965, three reports by Wirth, Seegmüller, and Wijngaarden described three different quasi-complete languages. The complexity of the design process, involving far too many people, eventually led Wirth to become disheartened with the design process, and he went off to develop his own version of ALGOL-60s successor. Designed with contributions from Tony Hoare, this language would become ALGOL-W.
Wirth’s first significant language was PL360, a byproduct of the ALGOL-W effort. The IBM 360 upon which ALGOL-W was implemented offered the choice of assembly language or Fortran compilers, neither of which was very attractive. PL360 [2] was a tool with which to implement ALGOL-W. ALGOL-W had a number of applications, but was deficient as a systems programming language. PL360 was to become more successful than ALGOL-W, largely because ALGOL-W was a complex language, and the target computers inadequate. In the fall of 1967, Wirth returned to Switzerland to begin work on the language that would become most closely associated with his language design efforts – Pascal. Wirth would go on to create a trinity of languages, none of which was all that closely related to the others, but all of which had one thing in common: they were “ALGOL-like” languages – Pascal, Modula-2 and Oberon.
Pascal – Based largely on Algol W, including the use of it’s while and case statements, and record structures. There were syntactic differences, however Algol 60 was almost a subset of Pascal.
Modula-2 – Wirth noted that Modula-2 “includes all aspects of Pascal, and extends them with the module concept”.
Oberon – Evolved from Modula-2 by very few additions, and several subtractions.
Note that “ALGOL-like” really implied: a procedural paradigm, a rigorously defined syntax, traditional mathematical notation (without the nonsense of symbols like ++), block structure providing identifier scope, the availability of recursion, and a strict, static data typing system. [3]
[1] Wirth, N., “From programming language design to computer construction”, CACM, 28(2), pp. 160-164 (1985)
[2] Wirth, N., “PL360, a programming language for the 360 computers”, Journal of the ACM, 15(1), pp.34-74 (1968)
[3] Wirth, N., “Modula-2 and Oberon”, in ACM Conf. on History of Programming Languages, pp.3-1-3-10 (2007)
The cool thing about some languages is that you don’t have to be a guru in order to interpret them.
Take Algol 60 for example. Algol wasn’t the best language in the world, but it had some nice features. For example you could use a if statement within an assignment to assign values. Here’s an example:
postage := if weight ≤ 20 then 40
      else if weight ≤ 125 then 80
      else if weight ≤ 500 then 120
      else 200;
Okay, some people will say “But C can do that too!” Yes it can, but certainly not as elegantly. Here is the C expression, using a ternary operator.
postage = (weight <= 20) ? 40
        : (weight <= 125) ? 80
        : (weight <= 500) ? 120
        : 200;
It basically embeds a ternary conditional each time the “else if” is invoked. It just isn’t very elegant to look at, let alone write, or understand.
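If the chained ternary feels too dense, the same table also reads cleanly as a function with early returns. A minimal sketch (mine, not from the post; the weight limits and rates are the post's illustrative numbers, not a real postal tariff):

```c
/* Postage lookup matching the Algol and C fragments above.
   Bands and prices are the blog's made-up example values. */
int postage(int weight)
{
    if (weight <= 20)  return 40;
    if (weight <= 125) return 80;
    if (weight <= 500) return 120;
    return 200;
}
```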
After the freewheeling fifties, programming languages entered a more bohemian time, with many languages appearing and those 1950s languages evolving as new programming ideologies developed. First out of the gate was ALGOL 60, which evolved from the fledgling Algol 58. It was common practice to “re-design” a language quite considerably, rather than make tweaks to an existing language. As such, languages like Algol 58 often fell out of favour as people adopted newer versions of a language. Algol was to spawn what would eventually become quite an issue in the computing community: augmentations, extensions, and derivations, leading to numerous dialects of a language. ALGOL 68 appeared just before its contemporaries C and Pascal; however, due to inherent complexities, it never really took off. (ALGOL 68 was used by European defense agencies, however the US decided to hedge its bets with Ada.) There was some controversy over the design of a successor to Algol 60, with Wirth and Hoare going on to develop their own successor, Algol-W. Algol 68 was a major revision to Algol 60, whereas Algol-W included more subtle changes. The 1960s also saw a number of updates to both Fortran and Cobol.
Languages had also begun to diversify into differing realms. In 1964, Kenneth E. Iverson introduced APL, (A Programming Language) a concise symbol-based language adept at dealing with arrays. Early versions of the language contained no control structures, and whilst Cobol may have strayed too far into “English-language” syntax, APL may have gone too far with its mathematical syntax. APL was popular with those doing computer graphics, but its use declined in the 1980s, due in part to the advent of languages such as MATLAB and GNU Octave. The 1960s also saw the second generation of programming languages evolve, those influenced by the likes of Fortran or Algol. In 1964 PL/I (Programming Language One) made its appearance, supposedly for use in data processing, numerical computation, scientific computing, and systems programming. However it was a more complex language than either Fortran or Cobol, from which it had evolved, and was not extremely successful. Algol also influenced the design of Simula, the first OO language which appeared in 1967. In an educational context, Logo appeared in 1967, famous for its use of “turtle graphics”, influenced by Lisp.
P.S. If you want to have some fun with a Logo interpreter, check this out.
|
||||
4768
|
dbpedia
|
0
| 0
|
https://en.wikipedia.org/wiki/ALGOL_58
|
en
|
Wikipedia
|
https://en.wikipedia.org/static/favicon/wikipedia.ico
|
https://en.wikipedia.org/static/favicon/wikipedia.ico
|
[
"https://en.wikipedia.org/static/images/icons/wikipedia.png",
"https://en.wikipedia.org/static/images/mobile/copyright/wikipedia-wordmark-en.svg",
"https://en.wikipedia.org/static/images/mobile/copyright/wikipedia-tagline-en.svg",
"https://upload.wikimedia.org/wikipedia/en/thumb/9/96/Symbol_category_class.svg/16px-Symbol_category_class.svg.png",
"https://upload.wikimedia.org/wikipedia/en/thumb/9/96/Symbol_category_class.svg/16px-Symbol_category_class.svg.png",
"https://login.wikimedia.org/wiki/Special:CentralAutoLogin/start?type=1x1",
"https://en.wikipedia.org/static/images/footer/wikimedia-button.svg",
"https://en.wikipedia.org/static/images/footer/poweredby_mediawiki.svg"
] |
[] |
[] |
[
""
] | null |
[
"Contributors to Wikimedia projects"
] |
2004-08-30T19:27:07+00:00
|
en
|
/static/apple-touch/wikipedia.png
|
https://en.wikipedia.org/wiki/ALGOL_58
|
ALGOL 58
Paradigm: procedural, imperative, structured
Family: ALGOL
Designed by: Friedrich L. Bauer, Hermann Bottenbruch, Heinz Rutishauser, Klaus Samelson, John Backus, Charles Katz, Alan Perlis, Joseph Henry Wegstein
First appeared: 1958
Typing discipline: static, strong
Scope: lexical
Influenced by: FORTRAN, IT, Plankalkül,[1] Superplan, Sequentielle Formelübersetzung
Influenced: most subsequent imperative languages (Algol-like)
ALGOL 58, originally named IAL, is one of the family of ALGOL computer programming languages. It was an early compromise design soon superseded by ALGOL 60. According to John Backus:
The Zurich ACM-GAMM Conference had two principal motives in proposing the IAL: (a) To provide a means of communicating numerical methods and other procedures between people, and (b) To provide a means of realizing a stated process on a variety of machines...[2]
ALGOL 58 introduced the fundamental notion of the compound statement, but it was restricted to control flow only, and it was not tied to identifier scope in the way that Algol 60's blocks were.
Name
Bauer attributes the name to Hermann Bottenbruch, who coined the term algorithmic language (algorithmische Sprache) in 1957, "at least in Germany".[3][4]
History
There were proposals for a universal language by the Association for Computing Machinery (ACM) and also by the German Gesellschaft für Angewandte Mathematik und Mechanik ("Society of Applied Mathematics and Mechanics") (GAMM). It was decided to organize a joint meeting to combine them. The meeting took place from May 27 to June 2, 1958, at ETH Zurich and was attended by the following people:
Friedrich L. Bauer, Hermann Bottenbruch, Heinz Rutishauser, and Klaus Samelson (from the GAMM)
John Backus, Charles Katz, Alan Perlis, and Joseph Henry Wegstein (from the ACM).
The language was originally proposed to be called IAL (International Algebraic Language) but according to Perlis,[5] this was rejected as an "'unspeakable' and pompous acronym". ALGOL was suggested instead, though not officially adopted until a year later. The publication following the meeting still used the name IAL.[6]
By the end of 1958 the ZMMD-group had built a working ALGOL 58 compiler for the Z22 computer. ZMMD was an abbreviation for Zürich (where Rutishauser worked), München (workplace of Bauer and Samelson), Mainz (location of the Z22 computer), Darmstadt (workplace of Bottenbruch).
ALGOL 58 saw some implementation effort at IBM, but the effort was in competition with FORTRAN, and soon abandoned. It was also implemented at Dartmouth College on an LGP-30, but that implementation soon evolved into ALGOL 60. An implementation for the Burroughs 220 called BALGOL evolved along its own lines as well, but retained much of ALGOL 58's original character.[7]
ALGOL 58's primary contribution was to later languages; it was used as a basis for JOVIAL,[4] MAD, NELIAC[4] and ALGO. It was also used during 1959 to publish algorithms in CACM, beginning a trend of using ALGOL notation in publication that continued for many years.
Time line of implementations of ALGOL 58 variants
Name | Year | Author | State | Description | Target CPU
ZMMD-implementation | 1958 | Friedrich L. Bauer, Heinz Rutishauser, Klaus Samelson, Hermann Bottenbruch | Germany | | Z22
NELIAC | 1958 | Naval Electronics Laboratory | USA | | AN/USQ-17
JOVIAL | 1960 | Jules Schwartz | USA | Was the DOD HOL prior to Ada | Various (see article)
BALGOL | 1960 | Joel Merner et al. | USA | | Burroughs Corporation B220
MAD | 1960 | University of Michigan | USA | | IBM 7090/7094 mainframe, then mid-1960s ported to Univac 1108
Dartmouth ALGOL 30 | 1962 | Thomas Eugene Kurtz et al. | USA | Evolved into ALGOL 60 | LGP-30
SUBALGOL | 1962 | Bob Braden, Lawrence M. Breed and Roger Moore, Stanford University | USA | BALGOL extension | IBM 7090
ALGO | ~ | Bendix Corporation | USA | | Bendix G-15
ALGOL 58's influence on ALGOL 60
IAL introduced the three-level concept of reference, publication and hardware language, and the concept of "word delimiters" having a separate representation from freely chosen identifiers (hence, no reserved words). ALGOL 60 kept this three-level concept.[8]
The distinction between assignment (:= representing a left-facing arrow) and the equality relation = was introduced in IAL and kept in ALGOL 60.
Both IAL and ALGOL 60 allow arrays with arbitrary lower and upper subscript bounds, and allow subscript bounds to be defined by integer expressions.
Both IAL and ALGOL 60 allow nesting of procedure declarations and the corresponding identifier scopes.
The IAL report described parameter substitution in much the same terms as the ALGOL 60 report, leaving open the possibility of call by name. It is unclear if this was realized at the time.
IAL allows numeric statement labels, that ALGOL 60 kept.
The possibility of including non-ALGOL code within a program was already hinted at, in the context of parameters to procedures.
Both IAL and ALGOL 60 have a switch designator, unrelated, however, to the switch statement in C and other languages.
In-line functions of the form f(x) := x / 2; were proposed in IAL but dropped in ALGOL 60.
IAL procedure declarations provide separate declaration lists for input and output parameters, a procedure can return multiple values; this mechanism was replaced in ALGOL 60 with the value declaration.
Variable declarations in IAL can be placed anywhere in the program and not necessarily at the beginning of a procedure. In contrast, the declarations within an ALGOL 60 block should occur before all execution statements.
The for-statement has the form for i:=base(increment)limit, directly resembling the loop of Rutishauser's programming language Superplan, replacing = with := and replacing its German keyword Für with the direct English translation for; ALGOL 60 replaced the parentheses with the word delimiters step and until, so that the previous statement would instead read i:=base step increment until limit.
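The step/until loop maps onto modern counted loops with one trap: ALGOL's limit is inclusive, whereas, for instance, Python's range() stop is exclusive. A sketch (algol_for is an illustrative name):

```python
# 'for i := base step increment until limit' has an INCLUSIVE limit;
# Python's range() stop is exclusive, so the bound is adjusted.

def algol_for(base, increment, limit):
    stop = limit + (1 if increment > 0 else -1)
    return list(range(base, stop, increment))

assert algol_for(1, 2, 9) == [1, 3, 5, 7, 9]    # i := 1 step 2 until 9
assert algol_for(9, -2, 1) == [9, 7, 5, 3, 1]
```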
The IAL if-statement does not have a then-clause or else-clause; it rather guards the succeeding statement. IAL provides an if either-statement that cleanly allows testing of multiple conditions. Both were replaced by ALGOL's if-then construct, with the introduction of the "dangling-else" ambiguity.
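The ambiguity is whether, in "if A then if B then S1 else S2", the else belongs to the inner or the outer if. Python's mandatory indentation forces the writer to pick one reading, which makes the two interpretations easy to contrast (function names here are illustrative):

```python
# The "dangling else" of 'if A then if B then S1 else S2': does the
# else bind to the inner or the outer if? Most ALGOL descendants bind
# it to the NEAREST if (the first reading below).

def nearest_if(a, b):      # reading 1: else pairs with the inner if
    if a:
        if b:
            return "S1"
        else:
            return "S2"
    return None

def outer_if(a, b):        # reading 2: else pairs with the outer if
    if a:
        if b:
            return "S1"
    else:
        return "S2"
    return None

assert nearest_if(True, False) == "S2"   # inner else taken
assert outer_if(True, False) is None     # outer else skipped
```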
IAL provides macro-substitution with the do-statement; this was dropped in ALGOL 60.
IAL allows one or more array subscripts to be omitted when passing arrays to procedures, and to provide any or all arguments to a procedure passed to another procedure.
IAL's infix boolean operators are all of the same precedence level. Exponents are indicated with paired up and down arrows, which removed any confusion about the correct interpretation of nested exponents; ALGOL 60 replaced the paired arrows with a single up-arrow whose function is equivalent to FORTRAN's **.
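With a single exponentiation operator, the nesting question the paired arrows avoided comes back and is settled by convention: FORTRAN's ** and its descendants group to the right. In Python, for instance:

```python
# Exponentiation is right-associative, so a ** b ** c means a ** (b ** c);
# IAL's paired up/down arrows made this grouping explicit instead.
assert 2 ** 3 ** 2 == 2 ** (3 ** 2) == 512
assert (2 ** 3) ** 2 == 64   # the other reading gives a different value
```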
The IAL report does not explicitly specify which standard functions were to be provided, making a vague reference to the "standard functions of analysis." The ALGOL 60 report has a more explicit list of standard functions.
After Fortran: Chapter 3: Generality and Hierarchy: ALGOL-60 (Scribd)
https://www.scribd.com/document/521912766/3-Algol
1) Algol-60 was created in 1960 by an international committee to establish a universal programming language as a successor to FORTRAN that was not dependent on any specific computing platform.
2) Algol-60 had a hierarchical, block-structured syntax defined using Backus-Naur Form (BNF) which allowed statements and other language constructs to be nested.
3) Key features included support for procedural abstraction, conditionals, loops, arrays and other basic programming constructs while avoiding features like input/output which were platform dependent.
fortran programming languages: Topics by WorldWideScience.org
https://worldwidescience.org/topicpages/f/fortran%2Bprogramming%2Blanguages.html
Comparison of and conversion between different implementations of the FORTRAN programming language
Science.gov (United States)
Treinish, L.
1980-01-01
A guideline for computer programmers who may need to exchange FORTRAN programs between several computers is presented. The characteristics of the FORTRAN language available on three different types of computers are outlined, and procedures and other considerations for the transfer of programs from one type of FORTRAN to another are discussed. In addition, the variance of these different FORTRANs from the FORTRAN 77 standard is discussed.
Scientific Programming in Fortran
Directory of Open Access Journals (Sweden)
W. Van Snyder
2007-01-01
Full Text Available The Fortran programming language was designed by John Backus and his colleagues at IBM to reduce the cost of programming scientific applications. IBM delivered the first compiler for its model 704 in 1957. IBM's competitors soon offered incompatible versions. ANSI (ASA at the time) developed a standard, largely based on IBM's Fortran IV, in 1966. Revisions of the standard were produced in 1977, 1990, 1995 and 2003. Development of a revision, scheduled for 2008, is under way. Unlike most other programming languages, Fortran is periodically revised to keep pace with developments in language and processor design, while revisions largely preserve compatibility with previous versions. Throughout, the focus on scientific programming, and especially on efficient generated programs, has been maintained.
A brief description and comparison of programming languages FORTRAN, ALGOL, COBOL, PL/1, and LISP 1.5 from a critical standpoint
Science.gov (United States)
Mathur, F. P.
1972-01-01
Several common higher level program languages are described. FORTRAN, ALGOL, COBOL, PL/1, and LISP 1.5 are summarized and compared. FORTRAN is the most widely used scientific programming language. ALGOL is a more powerful language for scientific programming. COBOL is used for most commercial programming applications. LISP 1.5 is primarily a list-processing language. PL/1 attempts to combine the desirable features of FORTRAN, ALGOL, and COBOL into a single language.
Object-Oriented Scientific Programming with Fortran 90
Science.gov (United States)
Norton, C.
1998-01-01
Fortran 90 is a modern language that introduces many important new features beneficial for scientific programming. We discuss our experiences in plasma particle simulation and unstructured adaptive mesh refinement on supercomputers, illustrating the features of Fortran 90 that support the object-oriented methodology.
Programs in Fortran language for reporting the results of the analyses by ICP emission spectroscopy
International Nuclear Information System (INIS)
Roca, M.
1985-01-01
Three programs, written in FORTRAN IV language, for reporting the results of the analyses by ICP emission spectroscopy from data stored in files on floppy disks have been developed. They are intended, respectively, for the analyses of: 1) waters, 2) granites and slates, and 3) different kinds of geological materials. (Author) 8 refs
Fortran
CERN Document Server
Marateck, Samuel L
1977-01-01
FORTRAN is written for students who have no prior knowledge of computers or programming. The book aims to teach students how to program using the FORTRAN language.The publication first elaborates on an introduction to computers and programming, introduction to FORTRAN, and calculations and the READ statement. Discussions focus on flow charts, rounding numbers, strings, executing the program, the WRITE and FORMAT statements, performing an addition, input and output devices, and algorithms. The text then takes a look at functions and the IF statement and the DO Loop, the IF-THEN-ELSE and the WHI
Programming in Fortran M
Energy Technology Data Exchange (ETDEWEB)
Foster, I.; Olson, R.; Tuecke, S.
1993-08-01
Fortran M is a small set of extensions to Fortran that supports a modular approach to the construction of sequential and parallel programs. Fortran M programs use channels to plug together processes which may be written in Fortran M or Fortran 77. Processes communicate by sending and receiving messages on channels. Channels and processes can be created dynamically, but programs remain deterministic unless specialized nondeterministic constructs are used. Fortran M programs can execute on a range of sequential, parallel, and networked computers. This report incorporates both a tutorial introduction to Fortran M and a users guide for the Fortran M compiler developed at Argonne National Laboratory. The Fortran M compiler, supporting software, and documentation are made available free of charge by Argonne National Laboratory, but are protected by a copyright which places certain restrictions on how they may be redistributed. See the software for details. The latest version of both the compiler and this manual can be obtained by anonymous ftp from Argonne National Laboratory in the directory pub/fortran-m at info.mcs.anl.gov.
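The channel model described above — processes plugged together by channels and communicating only by send and receive — can be sketched with Python threads; a Queue stands in for a Fortran M channel, and the None end-of-stream marker is our own convention, not Fortran M's:

```python
# Fortran M-style channel communication, sketched with Python threads.
# A Queue plays the channel; send = put, receive = get.
import threading
import queue

def producer(ch):
    for v in range(3):
        ch.put(v)            # send on the channel
    ch.put(None)             # close the channel (our convention)

def consumer(ch, out):
    while (v := ch.get()) is not None:   # receive from the channel
        out.append(v * 10)

ch, out = queue.Queue(), []
threads = [threading.Thread(target=producer, args=(ch,)),
           threading.Thread(target=consumer, args=(ch, out))]
for t in threads:
    t.start()
for t in threads:
    t.join()
assert out == [0, 10, 20]
```

Because the consumer only reacts to what arrives on the channel, the program stays deterministic, mirroring Fortran M's determinism guarantee.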
Automatic generation of Fortran programs for algebraic simulation models
International Nuclear Information System (INIS)
Schopf, W.; Rexer, G.; Ruehle, R.
1978-04-01
This report documents a generator program by which econometric simulation models formulated in an application-oriented language can be transformed automatically into a Fortran program. Thus the model designer is able to build up, test and modify models without the need of a Fortran programmer. The development of a computer model is therefore simplified and shortened appreciably; in chapters 1-3 of this report all rules are presented for the application of the generator to the model design. Algebraic models including exogenous and endogenous time series variables, lead and lag functions can be generated. In addition to these language elements, Fortran sequences can be applied to the formulation of models in the case of complex model interrelations. The generated model is automatically a module of the program system RSYST III and is therefore able to exchange input and output data with the central data bank of the system, and in connection with the method library modules it can be used to handle planning problems. (orig.) [de
C Versus Fortran-77 for Scientific Programming
Directory of Open Access Journals (Sweden)
Tom MacDonald
1992-01-01
Full Text Available The predominant programming language for numeric and scientific applications is Fortran-77 and supercomputers are primarily used to run large-scale numeric and scientific applications. Standard C* is not widely used for numerical and scientific programming, yet Standard C provides many desirable linguistic features not present in Fortran-77. Furthermore, the existence of a standard library and preprocessor eliminates the worst portability problems. A comparison of Standard C and Fortran-77 shows several key deficiencies in C that reduce its ability to adequately solve some numerical problems. Some of these problems have already been addressed by the C standard but others remain. Standard C with a few extensions and modifications could be suitable for all numerical applications and could become more popular in supercomputing environments.
Development of the static analyzer ANALYSIS/EX for FORTRAN programs
International Nuclear Information System (INIS)
Osanai, Seiji; Yokokawa, Mitsuo
1993-08-01
The static analyzer 'ANALYSIS' is a software tool for statically analyzing the tree structure and COMMON regions of a FORTRAN program. With the installation of the new FORTRAN compiler, FORTRAN77EX(V12), on the computer system at JAERI, a new version of ANALYSIS, 'ANALYSIS/EX', has been developed to enhance its analyzing functions. In addition to the conventional functions of ANALYSIS, ANALYSIS/EX is capable of analyzing FORTRAN programs written in the FORTRAN77EX(V12) language grammar, such as large-scale nuclear codes. The analysis of COMMON regions is also improved so as to obtain the relation between variables in COMMON regions in more detail. In this report, results of the improvements and enhanced functions of the static analyzer ANALYSIS/EX are presented. (author)
MORTRAN-2, FORTRAN Language Extension with User-Supplied Macros
International Nuclear Information System (INIS)
Cook, A. James; Shustek, L.J.
1980-01-01
1 - Description of problem or function: MORTRAN2 is a FORTRAN language extension that permits a relatively easy transition from FORTRAN to a more convenient and structured language. Its features include free-field format; alphanumeric statement labels; flexible comment convention; nested block structure; for-by-to, do, while, until, loop, if-then-else, if-else, exit, and next statements; multiple assignment statements; conditional compilation; and automatic listing indentation. The language is implemented by a macro-based pre-processor and is further extensible by user-defined macros. 2 - Method of solution: The MORTRAN2 pre-processor may be regarded as a compiler whose object code is ANSI Standard FORTRAN. The MORTRAN2 language is dynamically defined by macros which are input at each use of the pre-processor. 3 - Restrictions on the complexity of the problem: The pre-processor output must be accepted by a FORTRAN compiler
Aspects of FORTRAN in large-scale programming
International Nuclear Information System (INIS)
Metcalf, M.
1983-01-01
In these two lectures I examine the following three questions: i) Why did high-energy physicists begin to use FORTRAN? ii) Why do high-energy physicists continue to use FORTRAN? iii) Will high-energy physicists always use FORTRAN? In order to find answers to these questions, it is necessary to look at the history of the language, its present position, and its likely future, and also to consider its manner of use, the topic of portability, and the competition from other languages. Here we think especially of early competition from ALGOL, the more recent spread in the use of PASCAL, and the appearance of a completely new and ambitious language, ADA. (orig.)
Aspects of FORTRAN in large-scale programming
CERN Document Server
Metcalf, M
1983-01-01
In these two lectures I shall try to examine the following three questions: i) Why did high-energy physicists begin to use FORTRAN? ii) Why do high-energy physicists continue to use FORTRAN? iii) Will high-energy physicists always use FORTRAN? In order to find answers to these questions, it is necessary to look at the history of the language, its present position, and its likely future, and also to consider its manner of use, the topic of portability, and the competition from other languages. Here we think especially of early competition from ALGOL, the more recent spread in the use of PASCAL, and the appearance of a completely new and ambitious language, ADA.
DB90: A Fortran Callable Relational Database Routine for Scientific and Engineering Computer Programs
Science.gov (United States)
Wrenn, Gregory A.
2005-01-01
This report describes a database routine called DB90 which is intended for use with scientific and engineering computer programs. The software is written in the Fortran 90/95 programming language standard with file input and output routines written in the C programming language. These routines should be completely portable to any computing platform and operating system that has Fortran 90/95 and C compilers. DB90 allows a program to supply relation names and up to 5 integer key values to uniquely identify each record of each relation. This permits the user to select records or retrieve data in any desired order.
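DB90's access pattern — a relation name plus up to five integer keys uniquely identifying each record — can be sketched as a mapping keyed by tuples. A hypothetical Python sketch (KeyedStore and its methods are illustrative; DB90 itself is a Fortran 90/95 library):

```python
# DB90-style keyed record store: records are addressed by a relation
# name and up to 5 integer keys, so retrieval can happen in any order.

class KeyedStore:
    def __init__(self):
        self._data = {}

    def put(self, relation, keys, record):
        assert 1 <= len(keys) <= 5, "up to 5 integer keys"
        self._data[(relation, tuple(keys))] = record

    def get(self, relation, keys):
        return self._data[(relation, tuple(keys))]

db = KeyedStore()
db.put("loads", (3, 1), {"fx": 100.0})
assert db.get("loads", (3, 1))["fx"] == 100.0
```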
Application of Modern Fortran to Spacecraft Trajectory Design and Optimization
Science.gov (United States)
Williams, Jacob; Falck, Robert D.; Beekman, Izaak B.
2018-01-01
In this paper, applications of the modern Fortran programming language to the field of spacecraft trajectory optimization and design are examined. Modern object-oriented Fortran has many advantages for scientific programming, although many legacy Fortran aerospace codes have not been upgraded to use the newer standards (or have been rewritten in other languages perceived to be more modern). NASA's Copernicus spacecraft trajectory optimization program, originally a combination of Fortran 77 and Fortran 95, has attempted to keep up with modern standards and makes significant use of the new language features. Various algorithms and methods are presented from trajectory tools such as Copernicus, as well as modern Fortran open source libraries and other projects.
Modern Fortran in practice
NARCIS (Netherlands)
Markus, A.
2012-01-01
From its earliest days, the Fortran programming language has been designed with computing efficiency in mind. The latest standard, Fortran 2008, incorporates a host of modern features, including object-orientation, array operations, user-defined types, and provisions for parallel computing. This
Strategies and Experiences Using High Performance Fortran
National Research Council Canada - National Science Library
Shires, Dale
2001-01-01
.... High Performance Fortran (HPF) is a relatively new addition to the Fortran dialect. It is an attempt to provide an efficient high-level Fortran parallel programming language for the latest generation of … been debatable...
Programming language structures
CERN Document Server
Organick, Elliott Irving; Plummer, Robert P
1978-01-01
Programming Language Structures deals with the structures of programming languages and introduces the reader to five important programming languages: Algol, Fortran, Lisp, Snobol, and Pascal. The fundamental similarities and differences among these languages are discussed. A unifying framework is constructed that can be used to study the structure of other languages, such as Cobol, PL/I, and APL. Several of the tools and methodologies needed to construct large programs are also considered.Comprised of 10 chapters, this book begins with a summary of the relevant concepts and principles about al
Mixed language programming
International Nuclear Information System (INIS)
Burow, Burkhard D.
1996-01-01
Computing in the next millennium will be using software from this millennium. Programming languages evolve and new ones continue to be created. The use of legacy code demonstrates why some present and future applications may span programming languages. Even a completely new application may mix programming languages, if it allows its components to be more conveniently expressed. Given the need, mixed language programming should be easy and robust. By resolving a variety of difficulties, the well established cfortran.h package provides the desired convenient interface across the C and Fortran programming languages, as demonstrated using CERN's Book. (author)
The programming language EFL
Science.gov (United States)
Feldman, S. I.
1978-01-01
EFL is a comprehensive language designed to make it easy to write portable, understandable programs. It provides a rich set of data types and structures, a convenient operator set, and good control flow forms. The lexical form is easy to type and to read. Whenever possible, EFL uses the same forms that Ratfor does; in this sense EFL may be viewed as a superset of Ratfor. EFL is a well-defined language; this distinguishes it from most FORTRAN preprocessors which only add simple flow of control constructs to FORTRAN. The EFL compiler generates (possibly tailored) Standard FORTRAN as its output. EFL should catch and diagnose all syntax errors.
Exploiting first-class arrays in Fortran for accelerator programming
International Nuclear Information System (INIS)
Rasmussen, Craig E.; Weseloh, Wayne N.; Robey, Robert W.; Sottile, Matthew J.; Quinlan, Daniel; Overbey, Jeffrey
2010-01-01
Emerging architectures for high performance computing are often well suited to a data parallel programming model. This paper presents a simple programming methodology based on existing languages and compiler tools that allows programmers to take advantage of these systems. We will work with the array features of Fortran 90 to show how this infrequently exploited, standardized language feature is easily transformed to lower-level accelerator code. Our transformations are based on a mapping from Fortran 90 to C++ code with OpenCL extensions. The sheer complexity of programming for clusters of many- or multi-core processors with tens of millions of threads of execution makes the simplicity of the data parallel model attractive. Furthermore, the increasing complexity of today's applications (especially when convolved with the increasing complexity of the hardware) and the need for portability across hardware architectures make a higher-level and simpler programming model like data parallel attractive. The goal of this work has been to exploit source-to-source transformations that allow programmers to develop and maintain programs at a high level of abstraction, without coding to a specific hardware architecture. Furthermore these transformations allow multiple hardware architectures to be targeted without changing the high-level source. It also removes the necessity for application programmers to understand details of the accelerator architecture or to know OpenCL.
The comparison and selection of programming languages for high energy physics applications
International Nuclear Information System (INIS)
White, B.; Stanford Linear Accelerator Center, CA
1989-01-01
In this paper a comparison is presented of programming languages in the context of high energy physics software applications. The evaluation method used was specifically devised to address the issues of particular importance to HEP applications, not just the technical features of the languages considered. The candidate languages evaluated were Ada, C, FORTRAN 77, FORTRAN 8x, Pascal and PL/I. Some conclusions are drawn and recommendations made regarding the role of FORTRAN and other programming languages in the current and future development of HEP software. (orig.)
Analysis of computer programming languages
International Nuclear Information System (INIS)
Risset, Claude Alain
1967-01-01
This research thesis aims to identify methods of syntax analysis which can be used for computer programming languages, while putting aside the computer devices which influence the choice of the programming language and of the methods of analysis and compilation. In a first part, the author proposes attempts at formalization of Chomsky grammar languages. In a second part, he studies analytical grammars, and then studies a compiler or analytic grammar for the Fortran language.
Fortran programs for the time-dependent Gross-Pitaevskii equation in a fully anisotropic trap
Science.gov (United States)
Muruganandam, P.; Adhikari, S. K.
2009-10-01
…, etc.: 122 907
No. of bytes in distributed program, including test data, etc.: 609 662
Distribution format: tar.gz
Programming language: FORTRAN 77 and Fortran 90/95
Computer: PC
Operating system: Linux, Unix
RAM: 1 GByte (i, iv, v), 2 GByte (ii, vi, vii, x, xi), 4 GByte (iii, viii, xii), 8 GByte (ix)
Classification: 2.9, 4.3, 4.12
Nature of problem: These programs are designed to solve the time-dependent Gross-Pitaevskii nonlinear partial differential equation in one-, two- or three-space dimensions with a harmonic, circularly-symmetric, spherically-symmetric, axially-symmetric or anisotropic trap. The Gross-Pitaevskii equation describes the properties of a dilute trapped Bose-Einstein condensate.
Solution method: The time-dependent Gross-Pitaevskii equation is solved by the split-step Crank-Nicolson method by discretizing in space and time. The discretized equation is then solved by propagation, in either imaginary or real time, over small time steps. The method yields the solution of stationary and/or non-stationary problems.
Additional comments: This package consists of 12 programs, see "Program title", above. FORTRAN77 versions are provided for each of the 12 and, in addition, Fortran 90/95 versions are included for ii, iii, vi, viii, ix, xii. For the particular purpose of each program please see below.
Running time: Minutes on a medium PC (i, iv, v, vii, x, xi), a few hours on a medium PC (ii, vi, viii, xii), days on a medium PC (iii, ix).
Program summary (1)
Title of program: imagtime1d.F
Title of electronic file: imagtime1d.tar.gz
Catalogue identifier:
Program summary URL:
Program obtainable from: CPC Program Library, Queen's University of Belfast, N. Ireland
Distribution format: tar.gz
Computers: PC/Linux, workstation/UNIX
Maximum RAM memory: 1 GByte
Programming language used: Fortran 77
Typical running time: Minutes on a medium PC
Unusual features: None
Nature of physical problem: This program is designed to solve the time-dependent Gross
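The Crank-Nicolson half of the split-step scheme reduces, in each spatial direction, to a tridiagonal linear solve at every time step. A pure-Python sketch of the Thomas algorithm such a step relies on (illustrative only; the published CPC programs themselves are Fortran 77/90):

```python
# Thomas algorithm: O(n) solver for a tridiagonal system, the kernel of
# each Crank-Nicolson sweep. a = sub-, b = main-, c = super-diagonal,
# d = right-hand side; a[0] and c[-1] are unused.

def thomas(a, b, c, d):
    n = len(b)
    cp, dp = [0.0] * n, [0.0] * n
    cp[0], dp[0] = c[0] / b[0], d[0] / b[0]
    for i in range(1, n):                       # forward elimination
        m = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / m if i < n - 1 else 0.0
        dp[i] = (d[i] - a[i] * dp[i - 1]) / m
    x = [0.0] * n
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):              # back substitution
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

# 2x2 check: [[2, 1], [1, 2]] x = [3, 3]  ->  x = [1, 1]
assert thomas([0, 1], [2, 2], [1, 0], [3, 3]) == [1.0, 1.0]
```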
Structured Design Language for Computer Programs
Science.gov (United States)
Pace, Walter H., Jr.
1986-01-01
Box language used at all stages of program development. Developed to provide improved productivity in designing, coding, and maintaining computer programs. BOX system written in FORTRAN 77 for batch execution.
Classical Fortran programming for engineering and scientific applications
CERN Document Server
Kupferschmid, Michael
2009-01-01
Contents: Introduction; Why Study Programming?; The Evolution of FORTRAN; Why Study FORTRAN?; Classical FORTRAN; About This Book; Advice to Instructors; About the Author; Acknowledgments; Disclaimers; Hello, World!; Case Study: A First FORTRAN Program; Compiling the Program; Running a Program in UNIX; Omissions; Expressions and Assignment Statements; Constants; Variables and Variable Names; Arithmetic Operators; Function References; Expressions; A
Programs in Fortran language for reporting the results of the analyses by ICP emission spectroscopy; Programas en lenguaje Fortran para la informacion de los resultados de los analisis efectuados mediante Espectroscopia Optica de emision con fuente de plasma
Energy Technology Data Exchange (ETDEWEB)
Roca, M
1985-07-01
Three programs, written in FORTRAN IV language, for reporting the results of the analyses by ICP emission spectroscopy from data stored in files on floppy disks have been developed. They are intended, respectively, for the analyses of: 1) waters, 2) granites and slates, and 3) different kinds of geological materials. (Author) 8 refs.
High Performance Object-Oriented Scientific Programming in Fortran 90
Science.gov (United States)
Norton, Charles D.; Decyk, Viktor K.; Szymanski, Boleslaw K.
1997-01-01
We illustrate how Fortran 90 supports object-oriented concepts by example of plasma particle computations on the IBM SP. Our experience shows that Fortran 90 and object-oriented methodology give high performance while providing a bridge from Fortran 77 legacy codes to modern programming principles. All of our object-oriented Fortran 90 codes execute more quickly than the equivalent C++ versions, yet the abstraction modelling capabilities used for scientific programming are comparably powerful.
Formula Translation in Blitz++, NumPy and Modern Fortran: A Case Study of the Language Choice Tradeoffs
Directory of Open Access Journals (Sweden)
Sylwester Arabas
2014-01-01
Full Text Available Three object-oriented implementations of a prototype solver of the advection equation are introduced. The presented programs are based on Blitz++ (C++), NumPy (Python) and Fortran's built-in array containers. The solvers constitute implementations of the Multidimensional Positive-Definite Advective Transport Algorithm (MPDATA). The introduced codes serve as examples of how the application of object-oriented programming (OOP) techniques and new language constructs from C++11 and Fortran 2008 allows the mathematical notation used in the literature to be reproduced within the program code. A discussion of the tradeoffs of the programming language choice is presented. The main angles of comparison are code brevity and syntax clarity (and hence maintainability and auditability) as well as performance. All performance tests are carried out using free and open-source compilers. In the case of Python, a significant performance gain is observed when switching from the standard interpreter (CPython) to the PyPy implementation of Python. The entire source code of all three implementations is embedded in the text and is licensed under the terms of the GNU GPL license.
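MPDATA's first pass is plain donor-cell (upwind) advection; a periodic 1-D sketch in pure Python (the antidiffusive corrective passes that give MPDATA its accuracy are omitted here, and the function name is illustrative):

```python
# One donor-cell (upwind) step on a periodic grid, for velocity u > 0.
# c = u*dt/dx is the Courant number; stability requires 0 <= c <= 1.

def upwind_step(psi, c):
    n = len(psi)
    return [psi[i] - c * (psi[i] - psi[(i - 1) % n]) for i in range(n)]

sig = [0.0, 1.0, 0.0, 0.0]
assert upwind_step(sig, 1.0) == [0.0, 0.0, 1.0, 0.0]   # c = 1: exact shift
```

For c < 1 the scheme is positive-definite but diffusive, which is exactly the error MPDATA's later iterations compensate.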
The comparison and selection of programming languages for high energy physics applications
International Nuclear Information System (INIS)
White, B.
1991-06-01
This paper discusses the issues surrounding the comparison and selection of a programming language to be used in high energy physics software applications. The evaluation method used was specifically devised to address the issues of particular importance to high energy physics (HEP) applications, not just the technical features of the languages considered. The method assumes a knowledge of the requirements of current HEP applications, the data-processing environments expected to support these applications and relevant non-technical issues. The languages evaluated were Ada, C, FORTRAN 77, FORTRAN 90 (formerly 8X), Pascal and PL/1. Particular emphasis is placed upon the past, present and anticipated future role of FORTRAN in HEP software applications. Upon examination of the technical and practical issues, conclusions are reached and some recommendations are made regarding the role of FORTRAN and other programming languages in the current and future development of HEP software. 54 refs
VFC: The Vienna Fortran Compiler
Directory of Open Access Journals (Sweden)
Siegfried Benkner
1999-01-01
Full Text Available High Performance Fortran (HPF) offers an attractive high-level language interface for programming scalable parallel architectures, providing the user with directives for the specification of data distribution and delegating to the compiler the task of generating an explicitly parallel program. Available HPF compilers can handle regular codes quite efficiently, but dramatic performance losses may be encountered for applications which are based on highly irregular, dynamically changing data structures and access patterns. In this paper we introduce the Vienna Fortran Compiler (VFC), a new source-to-source parallelization system for HPF+, an optimized version of HPF, which addresses the requirements of irregular applications. In addition to extended data distribution and work distribution mechanisms, HPF+ provides the user with language features for specifying certain information that decisively influences a program's performance. This comprises data locality assertions, non-local access specifications and the possibility of reusing runtime-generated communication schedules of irregular loops. Performance measurements of kernels from advanced applications demonstrate that with a high-level data parallel language such as HPF+ a performance close to hand-written message-passing programs can be achieved even for highly irregular codes.
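HPF's BLOCK distribution assigns contiguous chunks of an array to processors, and the compiler derives from it which processor owns (and therefore computes) each element. A sketch of that ownership computation (illustrative, not VFC's actual runtime):

```python
# Owner of element i in an n-element, BLOCK-distributed array over
# nproc processors: contiguous blocks of ceil(n / nproc) elements.

def block_owner(i, n, nproc):
    block = -(-n // nproc)        # ceil(n / nproc) using floor division
    return i // block

assert [block_owner(i, 8, 4) for i in range(8)] == [0, 0, 1, 1, 2, 2, 3, 3]
```

The irregular applications the abstract describes are exactly those where a static formula like this no longer captures the access pattern, which is why HPF+ adds runtime-generated communication schedules.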
Fortran 90 for scientists and engineers
CERN Document Server
Hahn, Brian
1994-01-01
The introduction of the Fortran 90 standard is the first significant change in the Fortran language in over 20 years. this book is designed for anyone wanting to learn Fortran for the first time or or a programmer who needs to upgrade from Fortran 77 to Fortran 90.Employing a practical, problem-based approach this book provides a comprehensive introduction to the language. More experienced programmers will find it a useful update to the new standard and will benefit from the emphasis on science and engineering applications.
Programming Language Software For Graphics Applications
Science.gov (United States)
Beckman, Brian C.
1993-01-01
New approach reduces repetitive development of features common to different applications. High-level programming language and interactive environment with access to graphical hardware and software created by adding graphical commands and other constructs to standardized, general-purpose programming language, "Scheme". Designed for use in developing other software incorporating interactive computer-graphics capabilities into application programs. Provides alternative to programming entire applications in C or FORTRAN, specifically ameliorating design and implementation of complex control and data structures typifying applications with interactive graphics. Enables experimental programming and rapid development of prototype software, and yields high-level programs serving as executable versions of software-design documentation.
ARBUS: A FORTRAN tool for generating tree structure diagrams
International Nuclear Information System (INIS)
Ferrero, C.; Zanger, M.
1992-02-01
The FORTRAN77 stand-alone code ARBUS has been designed to aid the user by providing a tree structure diagram generating utility for computer programs written in the FORTRAN language. This report is intended to describe the main purpose and features of ARBUS and to highlight some additional applications of the code by means of practical test cases. (orig.)
Language constructs for modular parallel programs
Energy Technology Data Exchange (ETDEWEB)
Foster, I.
1996-03-01
We describe programming language constructs that facilitate the application of modular design techniques in parallel programming. These constructs allow us to isolate resource management and processor scheduling decisions from the specification of individual modules, which can themselves encapsulate design decisions concerned with concurrency, communication, process mapping, and data distribution. This approach permits development of libraries of reusable parallel program components and the reuse of these components in different contexts. In particular, alternative mapping strategies can be explored without modifying other aspects of program logic. We describe how these constructs are incorporated in two practical parallel programming languages, PCN and Fortran M. Compilers have been developed for both languages, allowing experimentation in substantial applications.
Visualization of Distributed Data Structures for High Performance Fortran-Like Languages
Directory of Open Access Journals (Sweden)
Rainer Koppler
1997-01-01
This article motivates the usage of graphics and visualization for efficient utilization of High Performance Fortran's (HPF's) data distribution facilities. It proposes a graphical toolkit consisting of exploratory and estimation tools which allow the programmer to navigate through complex distributions and to obtain graphical ratings with respect to load distribution and communication. The toolkit has been implemented in a mapping design and visualization tool which is coupled with a compilation system for the HPF predecessor Vienna Fortran. Since this language covers a superset of HPF's facilities, the tool may also be used for visualization of HPF data structures.
IFF, Full-Screen Input Menu Generator for FORTRAN Program
International Nuclear Information System (INIS)
Seidl, Albert
1991-01-01
1 - Description of program or function: The IFF-package contains input modules for use within FORTRAN programs. This package enables the programmer to easily include interactive menu-directed data input (module VTMEN1) and command-word processing (module INPCOM) into a FORTRAN program. 2 - Method of solution: No mathematical operations are performed. 3 - Restrictions on the complexity of the problem: Certain restrictions of use may arise from the dimensioning of arrays. Field lengths are defined via PARAMETER-statements
SVM Support in the Vienna Fortran Compilation System
OpenAIRE
Brezany, Peter; Gerndt, Michael; Sipkova, Viera
1994-01-01
Vienna Fortran, a machine-independent language extension to Fortran which allows the user to write programs for distributed-memory systems using global addresses, provides the forall-loop construct for specifying irregular computations that do not cause inter-iteration dependences. Compilers for distributed-memory systems generate code that is based on runtime analysis techniques and is only efficient if, in addition, aggressive compile-time optimizations are applied. Since these optimization...
NLEdit: A generic graphical user interface for Fortran programs
Science.gov (United States)
Curlett, Brian P.
1994-01-01
NLEdit is a generic graphical user interface for the preprocessing of Fortran namelist input files. The interface consists of a menu system, a message window, a help system, and data entry forms. A form is generated for each namelist. The form has an input field for each namelist variable along with a one-line description of that variable. Detailed help information, default values, and minimum and maximum allowable values can all be displayed via menu picks. Inputs are processed through a scientific calculator program that allows complex equations to be used instead of simple numeric inputs. A custom user interface is generated simply by entering information about the namelist input variables into an ASCII file. There is no need to learn a new graphics system or programming language. NLEdit can be used as a stand-alone program or as part of a larger graphical user interface. Although NLEdit is intended for files using namelist format, it can be easily modified to handle other file formats.
MAPLIB, Thermodynamics Materials Property Generator for FORTRAN Program
International Nuclear Information System (INIS)
Schumann, U.; Zimmerer, W. and others
1978-01-01
1 - Nature of physical problem solved: MAPLIB is a program system which is able to incorporate the values of the properties of any material in a form suitable for use in other computer programs. The data are implemented in FORTRAN functions. A utility program is provided to assist in library management. 2 - Method of solution: MAPLIB consists of the following parts: 1) Conventions for the data format. 2) Some integrated data. 3) A data access system (FORTRAN subroutine). 4) An utility program for updating and documentation of the actual library content. The central part is a set of FORTRAN functions, e.g. WL H2O v(t,p) (heat conduction of water vapor as a function of temperature and pressure), which compute the required data and which can be called by the user program. The data content of MAPLIB has been delivered by many persons. There was no systematic evaluation of the material. It is the responsibility of every user to check the data for physical accuracy. MAPLIB only serves as a library system for manipulation and storing of such data. 3 - Restrictions on the complexity of the problem: a) See responsibility as explained above. b) Up to 1000 data functions could be implemented. c) If too many data functions are included in MAPLIB, the storage requirements become excessive for application in users programs
Final Report, Center for Programming Models for Scalable Parallel Computing: Co-Array Fortran, Grant Number DE-FC02-01ER25505
Energy Technology Data Exchange (ETDEWEB)
Robert W. Numrich
2008-04-22
The major accomplishment of this project is the production of CafLib, an 'object-oriented' parallel numerical library written in Co-Array Fortran. CafLib contains distributed objects such as block vectors and block matrices along with procedures, attached to each object, that perform basic linear algebra operations such as matrix multiplication, matrix transpose and LU decomposition. It also contains constructors and destructors for each object that hide the details of data decomposition from the programmer, and it contains collective operations that allow the programmer to calculate global reductions, such as global sums, global minima and global maxima, as well as vector and matrix norms of several kinds. CafLib is designed to be extensible in such a way that programmers can define distributed grid and field objects, based on vector and matrix objects from the library, for finite difference algorithms to solve partial differential equations. A very important extra benefit that resulted from the project is the inclusion of the co-array programming model in the next Fortran standard, called Fortran 2008. It is the first parallel programming model ever included as a standard part of the language. Co-arrays will be a supported feature in all Fortran compilers, and the portability provided by standardization will encourage a large number of programmers to adopt it for new parallel application development. The combination of object-oriented programming in Fortran 2003 with co-arrays in Fortran 2008 provides a very powerful programming model for high-performance scientific computing. Additional benefits from the project, beyond the original goal, include a program providing access to the co-array model through the Cray compiler as a resource for teaching and research. Several academics, for the first time, included the co-array model as a topic in their courses on parallel computing. A separate collaborative project with LANL and PNNL showed how to
Standard Fortran
International Nuclear Information System (INIS)
Marshall, N.H.
1981-01-01
Because of its vast software investment in Fortran programs, the nuclear community has an inherent interest in the evolution of Fortran. This paper reviews the impact of the new Fortran 77 standard and discusses the projected changes which can be expected in the future
Programming a real code in a functional language (part 1)
Energy Technology Data Exchange (ETDEWEB)
Hendrickson, C.P.
1991-09-10
For some, functional languages hold the promise of allowing ease of programming massively parallel computers that imperative languages such as Fortran and C do not offer. At LLNL, we have initiated a project to write the physics of a major production code in Sisal, a functional language developed at LLNL in collaboration with researchers throughout the world. We are investigating the expressibility of Sisal, as well as its performance on a shared-memory multiprocessor, the Y-MP. An interesting aspect of the project is that Sisal modules can call Fortran modules, and are callable by them. This eliminates the rewriting of 80% of the production code that would not benefit from parallel execution. Preliminary results indicate that the restrictive nature of the language does not cause problems in expressing the algorithms we have chosen. Some interesting aspects of programming in a mixed functional-imperative environment have surfaced, but can be managed. 8 refs.
Alternatives to FORTRAN in control systems
International Nuclear Information System (INIS)
Howell, J.A.; Wright, R.M.
1985-01-01
Control system software has traditionally been written in assembly language, FORTRAN, or Basic. Today there exist several high-level languages with features that make them convenient and effective in control systems. These features include bit manipulation, user-defined data types, character manipulation, and high-level logical operations. Some of these languages are quite different from FORTRAN and yet are easy to read and use. We discuss several languages, their features that make them convenient for control systems, and give examples of their use. We focus particular attention on the language C, developed by Bell Laboratories.
Programming effort analysis of the ELLPACK language
Science.gov (United States)
Rice, J. R.
1978-01-01
ELLPACK is a problem statement language and system for elliptic partial differential equations which is implemented by a FORTRAN preprocessor. ELLPACK's principal purpose is as a tool for the performance evaluation of software. However, it is used here as an example with which to study the programming effort required for problem solving. It is obvious that problem statement languages can reduce programming effort tremendously; the goal is to quantify this somewhat. This is done by analyzing the lengths and effort (as measured by Halstead's software science technique) of various approaches to solving these problems.
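The abstract above quantifies programming effort with Halstead's "software science" measures. As a rough illustration of how those measures are computed (the operator/operand counts below are hypothetical, not taken from the ELLPACK study):

```python
import math

def halstead_effort(n1, n2, N1, N2):
    """Halstead software-science metrics from operator/operand counts.

    n1, n2: number of distinct operators and operands;
    N1, N2: total occurrences of operators and operands.
    """
    vocabulary = n1 + n2
    length = N1 + N2
    volume = length * math.log2(vocabulary)   # program "volume" in bits
    difficulty = (n1 / 2) * (N2 / n2)         # estimated abstraction difficulty
    effort = difficulty * volume              # Halstead effort E = D * V
    return volume, difficulty, effort

# Hypothetical counts for a tiny statement such as X = A + B:
# operators {=, +}, operands {X, A, B}, each appearing once.
v, d, e = halstead_effort(n1=2, n2=3, N1=2, N2=3)
```

For this toy statement the difficulty is 1, so the effort equals the volume, about 11.6; a problem-statement language like ELLPACK reduces effort mainly by shrinking the length and vocabulary the programmer must write.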
A comparative study of programming languages for next-generation astrodynamics systems
Science.gov (United States)
Eichhorn, Helge; Cano, Juan Luis; McLean, Frazer; Anderl, Reiner
2018-03-01
Due to the computationally intensive nature of astrodynamics tasks, astrodynamicists have relied on compiled programming languages such as Fortran for the development of astrodynamics software. Interpreted languages such as Python, on the other hand, offer higher flexibility and development speed thereby increasing the productivity of the programmer. While interpreted languages are generally slower than compiled languages, recent developments such as just-in-time (JIT) compilers or transpilers have been able to close this speed gap significantly. Another important factor for the usefulness of a programming language is its wider ecosystem which consists of the available open-source packages and development tools such as integrated development environments or debuggers. This study compares three compiled languages and three interpreted languages, which were selected based on their popularity within the scientific programming community and technical merit. The three compiled candidate languages are Fortran, C++, and Java. Python, Matlab, and Julia were selected as the interpreted candidate languages. All six languages are assessed and compared to each other based on their features, performance, and ease-of-use through the implementation of idiomatic solutions to classical astrodynamics problems. We show that compiled languages still provide the best performance for astrodynamics applications, but JIT-compiled dynamic languages have reached a competitive level of speed and offer an attractive compromise between numerical performance and programmer productivity.
Computer Language For Optimization Of Design
Science.gov (United States)
Scotti, Stephen J.; Lucas, Stephen H.
1991-01-01
SOL is computer language geared to solution of design problems. Includes mathematical modeling and logical capabilities of computer language like FORTRAN; also includes additional power of nonlinear mathematical programming methods at language level. SOL compiler takes SOL-language statements and generates equivalent FORTRAN code and system calls. Provides syntactic and semantic checking for recovery from errors and provides detailed reports containing cross-references to show where each variable used. Implemented on VAX/VMS computer systems. Requires VAX FORTRAN compiler to produce executable program.
Fortran interface layer of the framework for developing particle simulator FDPS
Science.gov (United States)
Namekata, Daisuke; Iwasawa, Masaki; Nitadori, Keigo; Tanikawa, Ataru; Muranushi, Takayuki; Wang, Long; Hosono, Natsuki; Nomura, Kentaro; Makino, Junichiro
2018-06-01
Numerical simulations based on particle methods have been widely used in various fields including astrophysics. To date, various versions of simulation software have been developed by individual researchers or research groups in each field, through a huge amount of time and effort, even though the numerical algorithms used are very similar. To improve the situation, we have developed a framework, called FDPS (Framework for Developing Particle Simulators), which enables researchers to develop massively parallel particle simulation codes for arbitrary particle methods easily. Until version 3.0, FDPS provided an API (application programming interface) for the C++ programming language only. This limitation comes from the fact that FDPS is developed using the template feature in C++, which is essential to support arbitrary data types of particle. However, there are many researchers who use Fortran to develop their codes. Thus, the previous versions of FDPS require such people to invest much time to learn C++. This is inefficient. To cope with this problem, we developed a Fortran interface layer in FDPS, which provides API for Fortran. In order to support arbitrary data types of particle in Fortran, we design the Fortran interface layer as follows. Based on a given derived data type in Fortran representing particle, a PYTHON script provided by us automatically generates a library that manipulates the C++ core part of FDPS. This library is seen as a Fortran module providing an API of FDPS from the Fortran side and uses C programs internally to interoperate Fortran with C++. In this way, we have overcome several technical issues when emulating a `template' in Fortran. Using the Fortran interface, users can develop all parts of their codes in Fortran. We show that the overhead of the Fortran interface part is sufficiently small and a code written in Fortran shows a performance practically identical to the one written in C++.
Introduction to modern Fortran for the Earth system sciences
CERN Document Server
Chirila, Dragos B
2014-01-01
This work provides a short "getting started" guide to Fortran 90/95. The main target audience consists of newcomers to the field of numerical computation within Earth system sciences (students, researchers or scientific programmers). Furthermore, readers accustomed to other programming languages may also benefit from this work, by discovering how some programming techniques they are familiar with map to Fortran 95. The main goal is to enable readers to quickly start using Fortran 95 for writing useful programs. It also introduces a gradual discussion of Input/Output facilities relevant for Earth system sciences, from the simplest ones to the more advanced netCDF library (which has become a de facto standard for handling the massive datasets used within Earth system sciences). While related works already treat these disciplines separately (each often providing much more information than needed by the beginning practitioner), the reader finds in this book a shorter guide which links them. Compared to other book...
The Fortran-P Translator: Towards Automatic Translation of Fortran 77 Programs for Massively Parallel Processors
Directory of Open Access Journals (Sweden)
Matthew O'keefe
1995-01-01
Massively parallel processors (MPPs) hold the promise of extremely high performance that, if realized, could be used to study problems of unprecedented size and complexity. One of the primary stumbling blocks to this promise has been the lack of tools to translate application codes to MPP form. In this article we show how application codes written in a subset of Fortran 77, called Fortran-P, can be translated to achieve good performance on several massively parallel machines. This subset can express codes that are self-similar, where the algorithm applied to the global data domain is also applied to each subdomain. We have found many codes that match the Fortran-P programming style and have converted them using our tools. We believe a self-similar coding style will accomplish what a vectorizable style has accomplished for vector machines by allowing the construction of robust, user-friendly, automatic translation systems that increase programmer productivity and generate fast, efficient code for MPPs.
OpenMP GNU and Intel Fortran programs for solving the time-dependent Gross-Pitaevskii equation
Science.gov (United States)
Young-S., Luis E.; Muruganandam, Paulsamy; Adhikari, Sadhan K.; Lončar, Vladimir; Vudragović, Dušan; Balaž, Antun
2017-11-01
We present Open Multi-Processing (OpenMP) version of Fortran 90 programs for solving the Gross-Pitaevskii (GP) equation for a Bose-Einstein condensate in one, two, and three spatial dimensions, optimized for use with GNU and Intel compilers. We use the split-step Crank-Nicolson algorithm for imaginary- and real-time propagation, which enables efficient calculation of stationary and non-stationary solutions, respectively. The present OpenMP programs are designed for computers with multi-core processors and optimized for compiling with both commercially-licensed Intel Fortran and popular free open-source GNU Fortran compiler. The programs are easy to use and are elaborated with helpful comments for the users. All input parameters are listed at the beginning of each program. Different output files provide physical quantities such as energy, chemical potential, root-mean-square sizes, densities, etc. We also present speedup test results for new versions of the programs. Program files doi:http://dx.doi.org/10.17632/y8zk3jgn84.2 Licensing provisions: Apache License 2.0 Programming language: OpenMP GNU and Intel Fortran 90. Computer: Any multi-core personal computer or workstation with the appropriate OpenMP-capable Fortran compiler installed. Number of processors used: All available CPU cores on the executing computer. Journal reference of previous version: Comput. Phys. Commun. 180 (2009) 1888; ibid.204 (2016) 209. Does the new version supersede the previous version?: Not completely. It does supersede previous Fortran programs from both references above, but not OpenMP C programs from Comput. Phys. Commun. 204 (2016) 209. Nature of problem: The present Open Multi-Processing (OpenMP) Fortran programs, optimized for use with commercially-licensed Intel Fortran and free open-source GNU Fortran compilers, solve the time-dependent nonlinear partial differential (GP) equation for a trapped Bose-Einstein condensate in one (1d), two (2d), and three (3d) spatial dimensions for
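The split-step Crank-Nicolson scheme these programs use can be illustrated, in the linear limit (nonlinearity g = 0) and in plain NumPy rather than Fortran, by imaginary-time propagation relaxing onto the ground state of a 1D harmonic trap. Grid size, box length, and time step below are illustrative choices, not parameters of the published codes:

```python
import numpy as np

# Imaginary-time Crank-Nicolson relaxation for the 1D GP equation in the
# linear limit (g = 0): d(psi)/d(tau) = -H psi, H = -0.5 d2/dx2 + x^2/2.
N, L = 256, 16.0
x = np.linspace(-L / 2, L / 2, N)
dx = x[1] - x[0]
dtau = 1e-3

# Finite-difference Hamiltonian (dense here for brevity; production codes
# exploit the tridiagonal structure instead).
main = 1.0 / dx**2 + 0.5 * x**2
off = -0.5 / dx**2 * np.ones(N - 1)
H = np.diag(main) + np.diag(off, 1) + np.diag(off, -1)

# Crank-Nicolson step: (1 + dtau*H/2) psi_new = (1 - dtau*H/2) psi_old.
A = np.eye(N) + 0.5 * dtau * H
B = np.eye(N) - 0.5 * dtau * H
M = np.linalg.solve(A, B)                 # one-step propagator

psi = np.exp(-x**2)                       # arbitrary even starting state
for _ in range(2000):
    psi = M @ psi
    psi /= np.sqrt(np.sum(psi**2) * dx)   # renormalize after each step

energy = psi @ (H @ psi) * dx             # approaches the exact E0 = 0.5
```

Imaginary-time propagation damps excited states faster than the ground state, so repeated application of the propagator plus renormalization converges to the stationary solution; real-time propagation uses the same matrices with dtau replaced by an imaginary step.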
On the tradeoffs of programming language choice for numerical modelling in geoscience. A case study comparing modern Fortran, C++/Blitz++ and Python/NumPy.
Science.gov (United States)
Jarecka, D.; Arabas, S.; Fijalkowski, M.; Gaynor, A.
2012-04-01
The language of choice for numerical modelling in geoscience has long been Fortran. A choice of a particular language and coding paradigm comes with different set of tradeoffs such as that between performance, ease of use (and ease of abuse), code clarity, maintainability and reusability, availability of open source compilers, debugging tools, adequate external libraries and parallelisation mechanisms. The availability of trained personnel and the scale and activeness of the developer community is of importance as well. We present a short comparison study aimed at identification and quantification of these tradeoffs for a particular example of an object oriented implementation of a parallel 2D-advection-equation solver in Python/NumPy, C++/Blitz++ and modern Fortran. The main angles of comparison will be complexity of implementation, performance of various compilers or interpreters and characterisation of the "added value" gained by a particular choice of the language. The choice of the numerical problem is dictated by the aim to make the comparison useful and meaningful to geoscientists. Python is chosen as a language that traditionally is associated with ease of use, elegant syntax but limited performance. C++ is chosen for its traditional association with high performance but even higher complexity and syntax obscurity. Fortran is included in the comparison for its widespread use in geoscience often attributed to its performance. We confront the validity of these traditional views. We point out how the usability of a particular language in geoscience depends on the characteristics of the language itself and the availability of pre-existing software libraries (e.g. NumPy, SciPy, PyNGL, PyNIO, MPI4Py for Python and Blitz++, Boost.Units, Boost.MPI for C++). Having in mind the limited complexity of the considered numerical problem, we present a tentative comparison of performance of the three implementations with different open source compilers including CPython and
Numerical methods of mathematical optimization with Algol and Fortran programs
CERN Document Server
Künzi, Hans P; Zehnder, C A; Rheinboldt, Werner
1971-01-01
Numerical Methods of Mathematical Optimization: With ALGOL and FORTRAN Programs reviews the theory and the practical application of the numerical methods of mathematical optimization. An ALGOL and a FORTRAN program was developed for each one of the algorithms described in the theoretical section. This should result in easy access to the application of the different optimization methods. Comprised of four chapters, this volume begins with a discussion on the theory of linear and nonlinear optimization, with the main stress on an easily understood, mathematically precise presentation. In addition
FORTRAN programs for transient eddy current calculations using a perturbation-polynomial expansion technique
International Nuclear Information System (INIS)
Carpenter, K.H.
1976-11-01
A description is given of FORTRAN programs for transient eddy current calculations in thin, non-magnetic conductors using a perturbation-polynomial expansion technique. Basic equations are presented as well as flow charts for the programs implementing them. The implementation is in two steps--a batch program to produce an intermediate data file and interactive programs to produce graphical output. FORTRAN source listings are included for all program elements, and sample inputs and outputs are given for the major programs
Multidimentional and Multi-Parameter Fortran-Based Curve Fitting ...
African Journals Online (AJOL)
This work briefly describes the mathematics behind the algorithm and elaborates how to implement it using the FORTRAN 95 programming language. The advantage of this algorithm, when extended to surfaces and complex functions, is that it gives researchers greater confidence in the fit. It also improves the ...
A FORTRAN program for a least-square fitting
International Nuclear Information System (INIS)
Yamazaki, Tetsuo
1978-01-01
A practical FORTRAN program for a least-squares fitting is presented. Although the method is quite usual, the program calculates not only the most satisfactory set of values of unknowns but also the plausible errors associated with them. As an example, a measured lateral absorbed-dose distribution in water for a narrow 25-MeV electron beam is fitted to a Gaussian distribution. (auth.)
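As a minimal illustration of recovering Gaussian parameters by least squares (an independent NumPy sketch using the standard log-linearization trick, not a port of the paper's FORTRAN routine; the sample values are synthetic):

```python
import numpy as np

# Synthetic noiseless Gaussian "measurements" with known parameters.
A_true, mu_true, sig_true = 10.0, 2.0, 1.5
x = np.linspace(-3.0, 7.0, 101)
y = A_true * np.exp(-(x - mu_true) ** 2 / (2 * sig_true**2))

# ln y is a parabola in x, so a degree-2 least-squares polynomial fit
# recovers the Gaussian parameters:
#   ln y = [ln A - mu^2/(2 s^2)] + (mu/s^2) x - x^2/(2 s^2)
c2, c1, c0 = np.polyfit(x, np.log(y), 2)
sigma = np.sqrt(-1.0 / (2.0 * c2))
mu = c1 * sigma**2
amplitude = np.exp(c0 + mu**2 / (2 * sigma**2))
```

With noisy data one would weight the fit or use a nonlinear least-squares routine, which also yields the parameter error estimates the abstract mentions.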
The Julia programming language: the future of scientific computing
Science.gov (United States)
Gibson, John
2017-11-01
Julia is an innovative new open-source programming language for high-level, high-performance numerical computing. Julia combines the general-purpose breadth and extensibility of Python, the ease-of-use and numeric focus of Matlab, the speed of C and Fortran, and the metaprogramming power of Lisp. Julia uses type inference and just-in-time compilation to compile high-level user code to machine code on the fly. A rich set of numeric types and extensive numerical libraries are built-in. As a result, Julia is competitive with Matlab for interactive graphical exploration and with C and Fortran for high-performance computing. This talk interactively demonstrates Julia's numerical features and benchmarks Julia against C, C++, Fortran, Matlab, and Python on a spectral time-stepping algorithm for a 1d nonlinear partial differential equation. The Julia code is nearly as compact as Matlab and nearly as fast as Fortran. This material is based upon work supported by the National Science Foundation under Grant No. 1554149.
PGHPF â An Optimizing High Performance Fortran Compiler for Distributed Memory Machines
Directory of Open Access Journals (Sweden)
Zeki Bozkus
1997-01-01
High Performance Fortran (HPF) is the first widely supported, efficient, and portable parallel programming language for shared and distributed memory systems. HPF is realized through a set of directive-based extensions to Fortran 90. It enables application developers and Fortran end-users to write compact, portable, and efficient software that will compile and execute on workstations, shared memory servers, clusters, traditional supercomputers, or massively parallel processors. This article describes a production-quality HPF compiler for a set of parallel machines. Compilation techniques such as data and computation distribution, communication generation, run-time support, and optimization issues are elaborated as the basis for an HPF compiler implementation on distributed memory machines. The performance of this compiler on benchmark programs demonstrates that high efficiency can be achieved executing HPF code on parallel architectures.
FORTRAN text correction with the CDC-1604-A console typewriter during reading a punched card program
International Nuclear Information System (INIS)
Kotorobaj, F.; Ruzhichka, Ya.; Stolyarskij, Yu.V.
1977-01-01
The paper describes FORTRAN text correction with the CDC 1604-A console typewriter during reading of a punched-card program. This method gives one more possibility of correcting a FORTRAN program during its input to the CDC 1604-A computer, essentially reducing the time necessary for punched-card correction compared with other methods. The possibility of inputting a desired number of punched cards one after another allows one to write small FORTRAN programs to computer core storage with simultaneous punching of the cards. The correction program has been written for the CDC 1604 COOP monitor.
An Introduction to Fortran Programming: An IPI Approach.
Science.gov (United States)
Fisher, D. D.; And Others
This text is designed to give individually paced instruction in Fortran programming. The text contains fifteen units. Unit titles include: Flowcharts, Input and Output, Loops, and Debugging. Also included is an extensive set of appendices. These were designed to contain a great deal of practical information necessary to the course. These appendices…
The Transition and Adoption to Modern Programming Concepts for Scientific Computing in Fortran
Directory of Open Access Journals (Sweden)
Charles D. Norton
2007-01-01
This paper describes our experiences in the early exploration of modern concepts introduced in Fortran90 for large-scale scientific programming. We review our early work in expressing object-oriented concepts based on the new Fortran90 constructs (foreign to most programmers at the time), our experimental work in applying them to various applications, the impact on the WG5/J3 standards committees to consider formalizing object-oriented constructs for later versions of Fortran, and work in exploring how other modern programming techniques such as Design Patterns can and have impacted our software development. Applications will be drawn from plasma particle simulation and finite element adaptive mesh refinement for solid earth crustal deformation modeling.
TOOLPACK1, Tools for Development and Maintenance of FORTRAN 77 Program
International Nuclear Information System (INIS)
Cowell, Wayne R.
1993-01-01
1 - Description of program or function: TOOLPACK1 consists of the following categories of software: (1) an integrated collection of tools intended to support the development and maintenance of FORTRAN 77 programs, in particular moderate-sized collections of mathematical software; (2) several user/Toolpack interfaces, one of which is selected for use at any particular installation; (3) three implementations of the tool/system interface, called TIE (Tool Interface to the Environment). The tools are written in FORTRAN 77 and are portable among TIE installations. The source contains symbolic constants as macro names and must be expanded with a suitable macro expander before being compiled and loaded. A portable macro expander is supplied in TOOLPACK1. The tools may be divided into three functional areas: general, documentation, and processing. One tool, the macro processor, can be used in any of these categories. ISTDC: a data comparison tool designed mainly for comparing files of numeric values and files with embedded text. ISTET: expands tabs. ISTFI: finds all the include files that a file needs. ISTGP: searches multiple files for occurrences of a regular expression. ISTHP: provides limited help information about tools. ISTMP: the macro processor, which may be used to pre-process a file; it provides macro replacement, inclusion, conditional replacement, and processing capabilities for complex file processing. ISTSP: a TIE-conforming version of the SPLIT utility to split up the concatenated files used on the tape. ISTSV: a save/restore utility to save and restore sub-trees of the Portable File Store (PFS). ISTTD: a text comparison tool. ISTVC: a simple text file version controller. ISTAL: a preprocessor that can be used to generate specific information from intermediate files created by other tools, including call-graphs, cross-reference listings, segment execution frequencies, and symbol information. ISTAL can also strip
Fortran code for generating random probability vectors, unitaries, and quantum states
Directory of Open Access Journals (Sweden)
Jonas Maziero
2016-03-01
The usefulness of generating random configurations is recognized in many areas of knowledge. Fortran was born for scientific computing and has been one of the main programming languages in this area ever since, and several ongoing projects aimed at its improvement indicate that it will keep this status in the decades to come. In this article, we describe Fortran codes produced, or organized, for the generation of the following random objects: numbers, probability vectors, unitary matrices, and quantum state vectors and density matrices. Some matrix functions are also included and may be of independent interest.
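A standard recipe for one of the objects listed in the abstract, a probability vector sampled uniformly from the simplex, can be sketched as follows (an illustrative Python version; the function name and the exponential-variate method are our assumptions, not necessarily the paper's Fortran routine):

```python
import math
import random

def random_probability_vector(n, rng=random):
    """Draw a probability vector uniformly from the (n-1)-simplex.

    Standard trick: n i.i.d. Exp(1) variates, normalized to unit sum.
    The 1.0 - rng.random() keeps the argument of log strictly positive.
    """
    draws = [-math.log(1.0 - rng.random()) for _ in range(n)]
    total = sum(draws)
    return [d / total for d in draws]
```

By construction the components are non-negative and sum to one.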
Mass: Fortran program for calculating mass-absorption coefficients
International Nuclear Information System (INIS)
Nielsen, Aa.; Svane Petersen, T.
1980-01-01
Determination of mass-absorption coefficients in the x-ray analysis of trace elements is an important and time-consuming part of the arithmetic calculation. In the course of time different methods have been used. The program MASS calculates the mass-absorption coefficients from a given major-element analysis at the x-ray wavelengths normally used in trace element determinations and lists the chemical analysis and the mass-absorption coefficients. The program is coded in FORTRAN IV, and is operational on the IBM 370/165 computer, on the UNIVAC 1110 and on the PDP 11/05. (author)
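The arithmetic behind such a calculation is the usual mixture rule, which can be sketched as follows (illustrative Python; MASS itself tabulates the elemental coefficients internally, and the names here are ours):

```python
def mixture_mass_absorption(weight_fractions, elemental_mu):
    """Mass-absorption coefficient of a mixture at one wavelength.

    Mixture rule: (mu/rho)_mix = sum_i w_i * (mu/rho)_i, with w_i the
    weight fraction of element i and (mu/rho)_i its elemental
    coefficient (cm^2/g) at the wavelength of interest.
    """
    total = sum(weight_fractions.values())  # tolerate un-normalized input
    return sum((w / total) * elemental_mu[el]
               for el, w in weight_fractions.items())
```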
hepawk - A language for scanning high energy physics events
International Nuclear Information System (INIS)
Ohl, T.
1992-01-01
We present the programming language hepawk, designed for convenient scanning of data structures arising in the simulation of high energy physics events. The interpreter for this language has been implemented in FORTRAN-77, therefore hepawk runs on any machine with a FORTRAN-77 compiler. (orig.)
Cloudy's Journey from FORTRAN to C, Why and How
Science.gov (United States)
Ferland, G. J.
Cloudy is a large-scale plasma simulation code that is widely used across the astronomical community as an aid in the interpretation of spectroscopic data. The cover of the ADAS VI book featured predictions of the code. The FORTRAN 77 source code has always been freely available on the Internet, contributing to its widespread use. The coming of PCs and Linux has fundamentally changed the computing environment. Modern Fortran compilers (F90 and F95) are not freely available. A common-use code must be written in either FORTRAN 77 or C to be Open Source/GNU/Linux friendly. F77 has serious drawbacks - modern language constructs cannot be used, students do not have skills in this language, and it does not contribute to their future employability. It became clear that the code would have to be ported to C to have a viable future. I describe the approach I used to convert Cloudy from FORTRAN 77 with MILSPEC extensions to ANSI/ISO 89 C. Cloudy is now openly available as a C code, and will evolve to C++ as gcc and standard C++ mature. Cloudy looks to a bright future with a modern language.
1957-2007: 50 Years of Higher Order Programming Languages
Directory of Open Access Journals (Sweden)
Alen Lovrenčić
2009-06-01
Fifty years ago one of the greatest breakthroughs in computer programming and in the history of computers happened: the appearance of FORTRAN, the first higher-order programming language. From that time until now hundreds of programming languages have been invented and different programming paradigms defined, all with the main goal of making computer programming easier and closer to as many people as possible. Many battles were fought among scientists as well as among developers around concepts of programming, programming languages and paradigms. It can be said that programming paradigms and programming languages were very often a trigger for many changes and improvements in computer science as well as in the computer industry. Definitely, computer programming is one of the cornerstones of computer science. Today there are many tools that help in the process of programming, but there are still programming tasks that can be solved only manually. Therefore, programming is still one of the most creative parts of interaction with computers. Programmers should choose a programming language in accordance with the task they have to solve, but very often they choose it according to their personal preferences, their beliefs and many other subjective reasons. Nevertheless, the market of programming languages can be merciless to languages, as history was merciless to some people, even whole nations. Programming languages and developers get born, live and die leaving more or less tracks and successors, and not always the best survives. The history of programming languages is closely connected to the history of computers and computer science itself. Every single thing from one of them has its reflections onto the other. This paper gives a short overview of the last fifty years of computer programming and computer programming languages, but also gives many ideas that influenced other aspects of computer science. Particularly, programming paradigms are described, their
Perbandingan Bubble Sort dengan Insertion Sort pada Bahasa Pemrograman C dan Fortran [Comparison of Bubble Sort with Insertion Sort in the C and Fortran Programming Languages]
Directory of Open Access Journals (Sweden)
Reina Reina
2013-12-01
Sorting is a basic algorithm studied by computer science students. Sorting algorithms are the basis of other algorithms such as searching and pattern matching. Bubble sort is a popular basic sorting algorithm because it is easy to implement. Besides bubble sort there is insertion sort, which is less popular because its algorithm is more difficult. This paper discusses the processing time of insertion sort and bubble sort on two kinds of data: randomized data, and data in descending order. The comparison of processing time has been done in two programming languages, C and FORTRAN. The result shows that bubble sort needs more time than insertion sort does.
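The two algorithms compared above can be sketched as follows (Python rather than the paper's C/FORTRAN; timing them on random versus descending inputs with `time.perf_counter` would reproduce the experiment):

```python
def bubble_sort(a):
    """Bubble sort: repeatedly swap adjacent out-of-order pairs."""
    a = list(a)
    n = len(a)
    for i in range(n - 1):
        swapped = False
        for j in range(n - 1 - i):
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]
                swapped = True
        if not swapped:          # already sorted: stop early
            break
    return a

def insertion_sort(a):
    """Insertion sort: grow a sorted prefix one element at a time."""
    a = list(a)
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        while j >= 0 and a[j] > key:
            a[j + 1] = a[j]      # shift larger elements right
            j -= 1
        a[j + 1] = key
    return a
```

Descending input is the worst case for both, which is why the paper singles it out.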
LFK, FORTRAN Application Performance Test
International Nuclear Information System (INIS)
McMahon, F.H.
1991-01-01
1 - Description of program or function: LFK, the Livermore FORTRAN Kernels, is a computer performance test that measures a realistic floating-point performance range for FORTRAN applications. Informally known as the Livermore Loops test, the LFK test may be used as a computer performance test, as a test of compiler accuracy (via checksums) and efficiency, or as a hardware endurance test. The LFK test, which focuses on FORTRAN as used in computational physics, measures the joint performance of the computer CPU, the compiler, and the computational structures in units of Mega-flops/sec or Mflops. A C language version of subroutine KERNEL is also included which executes 24 samples of C numerical computation. The 24 kernels are a hydrodynamics code fragment, a fragment from an incomplete Cholesky conjugate gradient code, the standard inner product function of linear algebra, a fragment from a banded linear equations routine, a segment of a tridiagonal elimination routine, an example of a general linear recurrence equation, an equation of state fragment, part of an alternating direction implicit integration code, an integrate predictor code, a difference predictor code, a first sum, a first difference, a fragment from a two-dimensional particle-in-cell code, a part of a one-dimensional particle-in-cell code, an example of how casually FORTRAN can be written, a Monte Carlo search loop, an example of an implicit conditional computation, a fragment of a two-dimensional explicit hydrodynamics code, a general linear recurrence equation, part of a discrete ordinates transport program, a simple matrix calculation, a segment of a Planck distribution procedure, a two-dimensional implicit hydrodynamics fragment, and determination of the location of the first minimum in an array. 2 - Method of solution: CPU performance rates depend strongly on the maturity of FORTRAN compiler machine code optimization. The LFK test-bed executes the set of 24 kernels three times, resetting the DO
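To give the flavor of such a kernel, here is a sketch in the spirit of the first loop listed, the hydrodynamics fragment; the statement form and the checksum idea follow the description above, but the exact constants and code are illustrative, not the benchmark source:

```python
def hydro_fragment(n, q, r, t, y, z):
    """LFK-style hydro fragment: x[k] = q + y[k]*(r*z[k+10] + t*z[k+11]).

    Returns the result vector and its checksum; the LFK test uses such
    checksums to cross-check compiler accuracy while timing the loop.
    """
    x = [q + y[k] * (r * z[k + 10] + t * z[k + 11]) for k in range(n)]
    return x, sum(x)
```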
A Fortran Program for Deep Space Sensor Analysis.
Science.gov (United States)
1984-12-14
used to help maintain currency of the deep space satellite catalog? Research question: Can a Fortran program be designed to evaluate the effectiveness ... Range (AFETR) Range Measurements Laboratory (RML) is located in Malabar, Florida. Like GEODSS, Malabar uses a 48-inch telescope with a ... phased out. This mode will evaluate the effect of the loss of the 3 Baker-Nunn sites to mode 3. Modes 5 through 8 are identical to
An environment for parallel structuring of Fortran programs
International Nuclear Information System (INIS)
Sridharan, K.; McShea, M.; Denton, C.; Eventoff, B.; Browne, J.C.; Newton, P.; Ellis, M.; Grossbard, D.; Wise, T.; Clemmer, D.
1990-01-01
The paper describes and illustrates an environment for interactive support of the detection and implementation of macro-level parallelism in Fortran programs. The approach couples algorithms for dependence analysis with both innovative techniques for complexity management and capabilities for the measurement and analysis of the parallel computation structures generated through use of the environment. The resulting environment is complementary to the more common approach of seeking local parallelism by loop unrolling, either by an automatic compiler or manually. (orig.)
Implementation of Neutronics Analysis Code using the Features of Object Oriented Programming via Fortran90/95
Energy Technology Data Exchange (ETDEWEB)
Han, Tae Young; Cho, Beom Jin [KEPCO Nuclear Fuel, Daejeon (Korea, Republic of)
2011-05-15
The object-oriented programming (OOP) concept became firmly established after the 1990s and was successfully incorporated into Fortran 90/95. The features of OOP, such as information hiding, encapsulation, modularity and inheritance, lead to code that satisfies the three R's: reusability, reliability and readability. Apart from modules, however, the major OOP concepts are rarely used in neutronics analysis codes, even when the code is written in Fortran 90/95. In this work, we show that the OOP concept can be employed to develop the neutronics analysis code ASTRA1D (Advanced Static and Transient Reactor Analyzer for 1-Dimension) via Fortran 90/95, and that these can be more efficient and reasonable programming methods
Innovative Language-Based & Object-Oriented Structured AMR Using Fortran 90 and OpenMP
Science.gov (United States)
Norton, C.; Balsara, D.
1999-01-01
Parallel adaptive mesh refinement (AMR) is an important numerical technique that leads to the efficient solution of many physical and engineering problems. In this paper, we describe how AMR programming can be performed in an object-oriented way using the modern aspects of Fortran 90 combined with the parallelization features of OpenMP.
User manual for two simple postscript output FORTRAN plotting routines
Science.gov (United States)
Nguyen, T. X.
1991-01-01
Graphics is one of the important tools in engineering analysis and design. However, plotting routines that generate output on high quality laser printers normally come in graphics packages, which tend to be expensive and system dependent. These factors become important for small computer systems or desktop computers, especially when only some form of a simple plotting routine is sufficient. With the PostScript language becoming popular, more and more PostScript laser printers are now available. Simple, versatile, low cost plotting routines that can generate output on high quality laser printers are needed, and standard FORTRAN plotting routines that produce output in the PostScript language seem logical. The purpose here is to explain two simple FORTRAN plotting routines that generate output in the PostScript language.
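The core of such a routine is just formatted text output. A minimal sketch of the idea in Python (the function name and layout are ours, not the report's routines):

```python
def polyline_postscript(points, linewidth=1.0):
    """Return a PostScript fragment that strokes a polyline.

    Coordinates are in points (1/72 inch), as PostScript expects.
    """
    x0, y0 = points[0]
    lines = ["newpath",
             "%g setlinewidth" % linewidth,
             "%g %g moveto" % (x0, y0)]
    lines += ["%g %g lineto" % (x, y) for x, y in points[1:]]
    lines.append("stroke")
    return "\n".join(lines)
```

Writing such fragments to a file, framed by a minimal document header, is all a laser printer needs.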
Computer applications in physics with FORTRAN, BASIC and C
CERN Document Server
Chandra, Suresh
2014-01-01
Because of the encouraging response to the first two editions of the book, and to take into account valuable suggestions from teachers as well as students, the text for Interpolation, Differentiation, Integration, Roots of an Equation, Solution of Simultaneous Equations, Eigenvalues and Eigenvectors of a Matrix, Solution of Differential Equations, Solution of Partial Differential Equations, Monte Carlo Method and Simulation, and Computation of some Functions has been improved throughout and presented in a more systematic manner using simple language. These techniques have vast applications in Science, Engineering and Technology. The C language is becoming popular in universities, colleges and engineering institutions. Besides the C language, programs are written in the FORTRAN and BASIC languages. Consequently, this book has a rather wide scope for its use. Each of the topics is developed in a systematic manner, thus making this book useful for graduate, postgraduate and engineering students. KEY FEATURES: Each topic is self exp...
Implementation of the Next Generation Attenuation (NGA) ground-motion prediction equations in Fortran and R
Science.gov (United States)
Kaklamanos, James; Boore, David M.; Thompson, Eric M.; Campbell, Kenneth W.
2010-01-01
This report presents two methods for implementing the earthquake ground-motion prediction equations released in 2008 as part of the Next Generation Attenuation of Ground Motions (NGA-West, or NGA) project coordinated by the Pacific Earthquake Engineering Research Center (PEER). These models were developed for predicting ground-motion parameters for shallow crustal earthquakes in active tectonic regions (such as California). Of the five ground-motion prediction equations (GMPEs) developed during the NGA project, four models are implemented: the GMPEs of Abrahamson and Silva (2008), Boore and Atkinson (2008), Campbell and Bozorgnia (2008), and Chiou and Youngs (2008a); these models are abbreviated as AS08, BA08, CB08, and CY08, respectively. Since site response is widely recognized as an important influence of ground motions, engineering applications typically require that such effects be modeled. The model of Idriss (2008) is not implemented in our programs because it does not explicitly include site response, whereas the other four models include site response and use the same variable to describe the site condition (VS30). We do not intend to discourage the use of the Idriss (2008) model, but we have chosen to implement the other four NGA models in our programs for those users who require ground-motion estimates for various site conditions. We have implemented the NGA models by using two separate programming languages: Fortran and R (R Development Core Team, 2010). Fortran, a compiled programming language, has been used in the scientific community for decades. R is an object-oriented language and environment for statistical computing that is gaining popularity in the statistical and scientific community. Derived from the S language and environment developed at Bell Laboratories, R is an open-source language that is freely available at http://www.r-project.org/ (last accessed 11 January 2011). In R, the functions for computing the NGA equations can be loaded as an
Implementing embedded artificial intelligence rules within algorithmic programming languages
Science.gov (United States)
Feyock, Stefan
1988-01-01
Most integrations of artificial intelligence (AI) capabilities with non-AI (usually FORTRAN-based) application programs require the latter to execute separately, running as a subprogram or, at best, as a coroutine of the AI system. In many cases, this organization is unacceptable; instead, the requirement is for an AI facility that runs in embedded mode; i.e., is called as a subprogram by the application program. The design and implementation of a Prolog-based AI capability that can be invoked in embedded mode are described. The significance of this system is twofold: provision of Prolog-based symbol-manipulation and deduction facilities makes a powerful symbolic reasoning mechanism available to application programs written in non-AI languages. The power of the deductive and non-procedural descriptive capabilities of Prolog, which allow the user to describe the problem to be solved rather than the solution, is to a large extent vitiated by the absence of the standard control structures provided by other languages. Embedding invocations of Prolog rule bases in programs written in non-AI languages makes it possible to put Prolog calls inside DO loops and similar control constructs. The resulting merger of non-AI and AI languages thus results in a symbiotic system in which the advantages of both programming systems are retained, and their deficiencies largely remedied.
SLACINPT - a FORTRAN program that generates boundary data for the SLAC gun code
International Nuclear Information System (INIS)
Michel, W.L.; Hepburn, J.D.
1982-03-01
The FORTRAN program SLACINPT was written to simplify the preparation of boundary data for the SLAC gun code. In SLACINPT, the boundary is described by a sequence of straight line or arc segments. From these, the program generates the individual boundary mesh point data, required as input by the SLAC gun code
Cross-language Babel structs - making scientific interfaces more efficient
International Nuclear Information System (INIS)
Prantl, Adrian; Epperly, Thomas G W; Ebner, Dietmar
2013-01-01
Babel is an open-source language interoperability framework tailored to the needs of high-performance scientific computing. As an integral element of the Common Component Architecture, it is employed in a wide range of scientific applications where it is used to connect components written in different programming languages. In this paper we describe how we extended Babel to support interoperable tuple data types (structs). Structs are a common idiom in (mono-lingual) scientific application programming interfaces (APIs); they are an efficient way to pass tuples of nonuniform data between functions, and are supported natively by most programming languages. Using our extended version of Babel, developers of scientific codes can now pass structs as arguments between functions implemented in any of the supported languages. In C, C++, Fortran 2003/2008 and Chapel, structs can be passed without the overhead of data marshaling or copying, providing language interoperability at minimal cost. Other supported languages are Fortran 77, Fortran 90/95, Java and Python. We will show how we designed a struct implementation that is interoperable with all of the supported languages and present benchmark data to compare the performance of all language bindings, highlighting the differences between languages that offer native struct support and an object-oriented interface with getter/setter methods. A case study shows how structs can help simplify the interfaces of scientific codes significantly. (paper)
Basic linear algebra subprograms for FORTRAN usage
Science.gov (United States)
Lawson, C. L.; Hanson, R. J.; Kincaid, D. R.; Krogh, F. T.
1977-01-01
A package of 38 low level subprograms for many of the basic operations of numerical linear algebra is presented. The package is intended to be used with FORTRAN. The operations in the package are dot products, elementary vector operations, Givens transformations, vector copy and swap, vector norms, vector scaling, and the indices of components of largest magnitude. The subprograms and a test driver are available in portable FORTRAN. Versions of the subprograms are also provided in assembly language for the IBM 360/67, the CDC 6600 and CDC 7600, and the Univac 1108.
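Three of the operations named above, the dot product, `y := a*x + y`, and the index of the component of largest magnitude, look like this when transliterated into Python (the Fortran originals additionally take increments/strides and use 1-based indices; this sketch omits both):

```python
def ddot(x, y):
    """Dot product, as in BLAS DDOT."""
    return sum(a * b for a, b in zip(x, y))

def daxpy(alpha, x, y):
    """y := alpha*x + y, as in BLAS DAXPY (returns a new list here)."""
    return [alpha * a + b for a, b in zip(x, y)]

def idamax(x):
    """Index of the component of largest magnitude, as in BLAS IDAMAX
    (0-based here, where the Fortran routine is 1-based)."""
    return max(range(len(x)), key=lambda i: abs(x[i]))
```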
specsim: A Fortran-77 program for conditional spectral simulation in 3D
Science.gov (United States)
Yao, Tingting
1998-12-01
A Fortran 77 program, specsim, is presented for conditional spectral simulation in 3D domains. The traditional Fourier integral method allows generating random fields with a given covariance spectrum. Conditioning to local data is achieved by an iterative identification of the conditional phase information. A flowchart of the program is given to illustrate the implementation procedures of the program. A 3D case study is presented to demonstrate application of the program. A comparison with the traditional sequential Gaussian simulation algorithm emphasizes the advantages and drawbacks of the proposed algorithm.
Multi-language Struct Support in Babel
Energy Technology Data Exchange (ETDEWEB)
Ebner, D; Prantl, A; Epperly, T W
2011-03-22
Babel is an open-source language interoperability framework tailored to the needs of high-performance scientific computing. As an integral element of the Common Component Architecture (CCA) it is used in a wide range of research projects. In this paper we describe how we extended Babel to support interoperable tuple data types (structs). Structs are a common idiom in scientific APIs; they are an efficient way to pass tuples of nonuniform data between functions, and are supported natively by most programming languages. Using our extended version of Babel, developers of scientific code can now pass structs as arguments between functions implemented in any of the supported languages. In C, C++ and Fortran 2003, structs can be passed without the overhead of data marshaling or copying, providing language interoperability at minimal cost. Other supported languages are Fortran 77, Fortran 90, Java and Python. We will show how we designed a struct implementation that is interoperable with all of the supported languages and present benchmark data to compare the performance of all language bindings, highlighting the differences between languages that offer native struct support and an object-oriented interface with getter/setter methods.
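The efficiency argument, that a native struct is just a fixed memory layout which can be passed by reference without marshaling, can be illustrated with Python's standard-library `ctypes` (illustrative only; Babel generates its own bindings):

```python
import ctypes

class Point(ctypes.Structure):
    """A C-compatible struct: fields laid out as a C compiler would lay
    them out, so a pointer to it could be handed to C or to Fortran
    2003 bind(c) code with no copying or marshaling."""
    _fields_ = [("x", ctypes.c_double), ("y", ctypes.c_double)]

p = Point(1.5, 2.5)
raw = bytes(p)   # the raw in-memory bytes of the record
```

An object-oriented interface with getter/setter methods, by contrast, pays a call per field access, which is the performance gap the benchmarks above measure.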
Use of the PASKAL' language for programming in experiment automation systems
International Nuclear Information System (INIS)
Ostrovnoj, A.I.
1985-01-01
A complex of standard solutions intended for realization of the main functions is suggested; execution of these solutions is provided by any system for experiment automation. They include: recording and accumulation of experimental data; visualization and preliminary processing of incoming data; interaction with the operator and system control; and data filing. It is advisable to use standard software, to represent data processing algorithms as parallel processes, and to apply the PASCAL language for programming. Programming using CAMAC equipment is provided by a complex of procedures similar to the set of subprograms in the FORTRAN language. Utilization of a simple data file in accumulation and processing programs ensures unified representation of experimental data and uniform access to them on behalf of a large number of programs operating in both on-line and off-line regimes. The suggested approach is realized in developing systems based on the SM-3, SM-4 and MERA-60 computers with the RAFOS operating system
Portable parallel programming in a Fortran environment
International Nuclear Information System (INIS)
May, E.N.
1989-01-01
Experience using the Argonne-developed PARMACs macro package to implement a portable parallel programming environment is described. Fortran programs with intrinsic parallelism of coarse and medium granularity are easily converted to parallel programs which are portable among a number of commercially available parallel processors in the class of shared-memory bus-based and local-memory network based MIMD processors. The parallelism is implemented using standard UNIX (tm) tools and a small number of easily understood synchronization concepts (monitors and message-passing techniques) to construct and coordinate multiple cooperating processes on one or many processors. Benchmark results are presented for parallel computers such as the Alliant FX/8, the Encore MultiMax, the Sequent Balance, the Intel iPSC/2 Hypercube and a network of Sun 3 workstations. These parallel machines are typical MIMD types with from 8 to 30 processors, each rated at from 1 to 10 MIPS processing power. The demonstration code used for this work is a Monte Carlo simulation of the response to photons of a ''nearly realistic'' lead, iron and plastic electromagnetic and hadronic calorimeter, using the EGS4 code system. 6 refs., 2 figs., 2 tabs
MULTITASKER, Multitasking Kernel for C and FORTRAN Under UNIX
International Nuclear Information System (INIS)
Brooks, E.D. III
1988-01-01
1 - Description of program or function: MULTITASKER implements a multitasking kernel for the C and FORTRAN programming languages that runs under UNIX. The kernel provides a multitasking environment which serves two purposes. The first is to provide an efficient portable environment for the development, debugging, and execution of production multiprocessor programs. The second is to provide a means of evaluating the performance of a multitasking program on model multiprocessor hardware. The performance evaluation features require no changes in the application program source and are implemented as a set of compile- and run-time options in the kernel. 2 - Method of solution: The FORTRAN interface to the kernel is identical in function to the CRI multitasking package provided for the Cray XMP. This provides a migration path to high speed (but small N) multiprocessors once the application has been coded and debugged. With use of the UNIX m4 macro preprocessor, source compatibility can be achieved between the UNIX code development system and the target Cray multiprocessor. The kernel also provides a means of evaluating a program's performance on model multiprocessors. Execution traces may be obtained which allow the user to determine kernel overhead, memory conflicts between various tasks, and the average concurrency being exploited. The kernel may also be made to switch tasks every cpu instruction with a random execution ordering. This allows the user to look for unprotected critical regions in the program. These features, implemented as a set of compile- and run-time options, cause extra execution overhead which is not present in the standard production version of the kernel
Object oriented programming in simulation of ions transport
International Nuclear Information System (INIS)
Zhang Wenyong; Wang Tongquan; Xiao Yabin; Dai Hongyi; Chen Yuzhong
2001-01-01
Using object-oriented programming (OOP) methods can make programs more reliable and easier to read, debug, maintain and upgrade. This paper compares FORTRAN 90, the language widely used in scientific computing, with C++, an object-oriented language, and concludes that although FORTRAN 90 has many deficiencies, it can be used for object-oriented programming. The OOP method was then used in programming a Monte Carlo simulation of ion transport, and the general process of OOP was given
FASTPLOT, Interface Routines to MS FORTRAN Graphics Library
International Nuclear Information System (INIS)
1999-01-01
1 - Description of program or function: FASTPLOT is a library of routines that can be used to interface with the Microsoft FORTRAN Graphics library (GRAPHICS.LIB). The FASTPLOT routines simplify the development of graphics applications and add capabilities such as histograms, Splines, symbols, and error bars. FASTPLOT also includes routines that can be used to create menus. 2 - Methods: FASTPLOT is a library of routines which must be linked with a user's FORTRAN programs that call any FASTPLOT routines. In addition, the user must link with the Microsoft FORTRAN Graphics library (GRAPHICS.LIB). 3 - Restrictions on the complexity of the problem: None noted
GRESS, FORTRAN Pre-compiler with Differentiation Enhancement
International Nuclear Information System (INIS)
1999-01-01
1 - Description of program or function: The GRESS FORTRAN pre-compiler (SYMG) and run-time library are used to enhance conventional FORTRAN-77 programs with analytic differentiation of arithmetic statements for automatic differentiation in either forward or reverse mode. GRESS 3.0 is functionally equivalent to GRESS 2.1. GRESS 2.1 is an improved and updated version of the previously released GRESS 1.1. Improvements in the implementation of the CHAIN option have resulted in a 70 to 85% reduction in execution time and up to a 50% reduction in memory required for forward chaining applications. 2 - Method of solution: GRESS uses a pre-compiler to analyze FORTRAN statements and determine the mathematical operations embodied in them. As each arithmetic assignment statement in a program is analyzed, SYMG generates the partial derivatives of the term on the left with respect to each floating-point variable on the right. The result of the pre-compilation step is a new FORTRAN program that can produce derivatives for any REAL (i.e., single or double precision) variable calculated by the model. Consequently, GRESS enhances FORTRAN programs or subprograms by adding the calculation of derivatives along with the original output. Derivatives from a GRESS-enhanced model can be used internally (e.g., iteration acceleration) or externally (e.g., sensitivity studies). By calling GRESS run-time routines, derivatives can be propagated through the code via the chain rule (referred to as the CHAIN option) or accumulated to create an adjoint matrix (referred to as the ADGEN option). A third option, GENSUB, makes it possible to process a subset of a program (i.e., a do loop, subroutine, function, a sequence of subroutines, or a whole program) for calculating derivatives of dependent variables with respect to independent variables. A code enhanced with the GENSUB option can use forward mode, reverse mode, or a hybrid of the two modes. 3 - Restrictions on the complexity of the problem: GRESS
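The chain-rule propagation behind the CHAIN option can be illustrated with dual numbers, where each value carries its derivative and every arithmetic operation applies the chain rule (a minimal forward-mode sketch; GRESS itself works by source-to-source rewriting of FORTRAN, not by operator overloading):

```python
class Dual:
    """Forward-mode automatic differentiation via dual numbers."""
    def __init__(self, val, der=0.0):
        self.val, self.der = val, der

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.der + other.der)
    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # Product rule: (uv)' = u'v + uv'
        return Dual(self.val * other.val,
                    self.der * other.val + self.val * other.der)
    __rmul__ = __mul__
```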
A FORTRAN program for an IBM PC compatible computer for calculating kinematical electron diffraction patterns
International Nuclear Information System (INIS)
Skjerpe, P.
1989-01-01
This report describes a computer program which is useful in transmission electron microscopy. The program is written in FORTRAN and calculates kinematical electron diffraction patterns in any zone axis from a given crystal structure. Quite large unit cells, containing up to 2250 atoms, can be handled by the program. The program runs on both the Hercules graphics card and the standard IBM CGA card
ASSIST - a package of Fortran routines for handling input under specified syntax rules and for management of data structures
International Nuclear Information System (INIS)
Sinclair, J.E.
1991-02-01
The ASSIST package (A Structured Storage and Input Syntax Tool) provides for Fortran programs a means for handling data structures more general than those provided by the Fortran language, and for obtaining input to the program from a file or terminal according to specified syntax rules. The syntax-controlled input can be interactive, with automatic generation of prompts, and dialogue to correct any input errors. The range of syntax rules possible is sufficient to handle lists of numbers and character strings, keywords, commands with optional clauses, and many kinds of variable-format constructions, such as algebraic expressions. ASSIST was developed for use in two large programs for the analysis of safety of radioactive waste disposal facilities, but it should prove useful for a wide variety of applications. (author)
FORTRAN program for calculating liquid-phase and gas-phase thermal diffusion column coefficients
International Nuclear Information System (INIS)
Rutherford, W.M.
1980-01-01
A computer program (COLCO) was developed for calculating thermal diffusion column coefficients from theory. The program, which is written in FORTRAN IV, can be used for both liquid-phase and gas-phase thermal diffusion columns. Column coefficients for the gas phase can be based on gas properties calculated from kinetic theory using tables of omega integrals or on tables of compiled physical properties as functions of temperature. Column coefficients for the liquid phase can be based on compiled physical property tables. Program listings, test data, sample output, and a user's manual are supplied as appendices
Comparison of PASCAL and FORTRAN for solving problems in the physical sciences
Science.gov (United States)
Watson, V. R.
1981-01-01
The paper compares PASCAL and FORTRAN for problem solving in the physical sciences, prompted by requests NASA has received to make PASCAL available on the Numerical Aerodynamic Simulator (scheduled to be operational in 1986). PASCAL's disadvantages include the lack of scientific utility procedures equivalent to the IBM scientific subroutine package or the IMSL package, which are available in FORTRAN. Its advantages include well-organized code that is easy to read and maintain, range checking to prevent errors, and a broad selection of data types. It is concluded that FORTRAN may be the better language, although ADA (patterned after PASCAL) may surpass FORTRAN due to its ability to add complex and vector math and to specify the precision and range of variables.
Objective Oriented Design of System Thermal Hydraulic Analysis Program and Verification of Feasibility
International Nuclear Information System (INIS)
Chung, Bub Dong; Jeong, Jae Jun; Hwang, Moon Kyu
2008-01-01
The system safety analysis codes, such as RELAP5, TRAC, and CATHARE, have been developed in Fortran during the past few decades. Refactoring of conventional codes has also been performed to improve code readability and maintenance. The TRACE, RELAP5-3D and MARS codes are examples of these activities. The codes were redesigned to have modular structures utilizing Fortran 90 features. However, the programming paradigm in software technology has changed to object-oriented programming (OOP), which is based on several techniques, including encapsulation, modularity, polymorphism, and inheritance. It was not commonly used in mainstream software application development until the early 1990s. Many modern programming languages now support OOP. Although the recent Fortran language also supports OOP, it is considered to have limited functions compared to modern software features. In this work, an object-oriented program for a system safety analysis code has been attempted utilizing modern C language features. The advantage of OOP is discussed after verification of design feasibility
Mixed-Language High-Performance Computing for Plasma Simulations
Directory of Open Access Journals (Sweden)
Quanming Lu
2003-01-01
Full Text Available Java is receiving increasing attention as the most popular platform for distributed computing. However, programmers are still reluctant to embrace Java as a tool for writing scientific and engineering applications due to its still noticeable performance drawbacks compared with other programming languages such as Fortran or C. In this paper, we present a hybrid Java/Fortran implementation of a parallel particle-in-cell (PIC) algorithm for plasma simulations. In our approach, the time-consuming components of this application are designed and implemented as Fortran subroutines, while less calculation-intensive components usually involved in building the user interface are written in Java. The two types of software modules have been glued together using the Java Native Interface (JNI). Our mixed-language PIC code was tested and its performance compared with pure Java and Fortran versions of the same algorithm on a Sun E6500 SMP system and a Linux cluster of Pentium III machines.
Optimization of Grillages Using Genetic Algorithms for Integrating Matlab and Fortran Environments
Directory of Open Access Journals (Sweden)
Darius Mačiūnas
2013-02-01
Full Text Available The purpose of the paper is to present technology applied for the global optimization of grillage-type pile foundations (further, grillages). The goal of optimization is to obtain the optimal layout of pile placement in the grillages. The problem can be categorized as a topology optimization problem. The objective function comprises the maximum reactive force emerging in a pile. The reactive force is minimized during the procedure of optimization, during which the variables enclose the positions of piles beneath connecting beams. Reactive forces in all piles are computed utilizing an original algorithm implemented in the Fortran programming language. The algorithm is integrated into the MatLab environment, where the optimization procedure is executed utilizing a genetic algorithm. The article also describes technology enabling the integration of the MatLab and Fortran environments. The authors seek to evaluate the quality of a solution to the problem by analyzing experimental results obtained applying the proposed technology.
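The abstract above describes a genetic algorithm driven from MatLab, with the objective (the maximum pile reactive force) evaluated by a Fortran routine. As a rough sketch of such an optimization loop, not the authors' code, a minimal real-coded genetic algorithm in Python might look like this; the quadratic objective is an invented stand-in for the reactive-force computation:

```python
import random

def genetic_minimize(objective, bounds, pop_size=30, generations=60, seed=1):
    """Minimal real-coded genetic algorithm: elitist selection,
    blend crossover, gaussian mutation. Illustrative only."""
    rng = random.Random(seed)
    lo, hi = bounds
    pop = [rng.uniform(lo, hi) for _ in range(pop_size)]
    for _ in range(generations):
        # keep the better half of the population (elitism)
        elite = sorted(pop, key=objective)[: pop_size // 2]
        children = []
        while len(children) < pop_size - len(elite):
            a, b = rng.sample(elite, 2)                    # two parents
            child = 0.5 * (a + b)                          # blend crossover
            child += rng.gauss(0.0, 0.1 * (hi - lo))       # gaussian mutation
            children.append(min(max(child, lo), hi))       # clamp to bounds
        pop = elite + children
    return min(pop, key=objective)

# Stand-in objective (assumption, not the paper's Fortran routine):
# a quadratic with its minimum at x = 2.0.
best = genetic_minimize(lambda x: (x - 2.0) ** 2, bounds=(-10.0, 10.0))
```

Real implementations differ in encoding and operators; the point is only the select/crossover/mutate loop the abstract refers to.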
Far-field Lorenz-Mie scattering in an absorbing host medium: Theoretical formalism and FORTRAN program
Science.gov (United States)
Mishchenko, Michael I.; Yang, Ping
2018-01-01
In this paper we make practical use of the recently developed first-principles approach to electromagnetic scattering by particles immersed in an unbounded absorbing host medium. Specifically, we introduce an actual computational tool for the calculation of pertinent far-field optical observables in the context of the classical Lorenz-Mie theory. The paper summarizes the relevant theoretical formalism, explains various aspects of the corresponding numerical algorithm, specifies the input and output parameters of a FORTRAN program available at https://www.giss.nasa.gov/staff/mmishchenko/Lorenz-Mie.html, and tabulates benchmark results useful for testing purposes. This public-domain FORTRAN program enables on
|
|||||||
4768
|
dbpedia
|
1
| 5
|
https://retrocomputing.stackexchange.com/questions/9732/what-was-the-first-language-to-offer-full-structured-programming-support
|
en
|
What was the first language to offer "full" structured programming support?
|
[
"https://lh6.googleusercontent.com/-74M83RBb6ys/AAAAAAAAAAI/AAAAAAAAABM/igPkit2LlSc/photo.jpg?sz=64",
"https://i.sstatic.net/36WSK.png?s=64",
"https://i.sstatic.net/J38sS.jpg?s=64",
"https://www.gravatar.com/avatar/645dbb4f7ba1ff9d7ff2de02f68fbe9c?s=64&d=identicon&r=PG",
"https://retrocomputing.stackexchange.com/posts/9732/ivc/acfa?prg=e3f308fd-859e-41c7-83ed-0e50153afa97"
] |
[] |
[] |
[
""
] | null |
[
"R. Schmitz"
] |
2019-04-18T12:49:48
|
In the same vein as this question about conditionals, I am interested in the first time a language provided functionality we assume as a given for modern programming languages. The question is when a
|
en
|
https://cdn.sstatic.net/Sites/retrocomputing/Img/favicon.ico?v=9b6f1e4fd156
|
Retrocomputing Stack Exchange
|
https://retrocomputing.stackexchange.com/questions/9732/what-was-the-first-language-to-offer-full-structured-programming-support
|
A couple of misconceptions in this one:
Recursion is just another kind of iteration. It was in fact the only iteration mechanism provided in early versions of Lisp. Any use of recursion can be transformed into standard looping iteration (although many transformations require use of an explicit stack as well). Recursion can be viewed as kind of a hack to leverage the program stack for your stack usage while iterating (in languages that use a program stack). Likewise, early Lisp demonstrated that it's possible (although not always convenient) to use recursion for all your iteration needs.
I have never before heard that support for recursion is required for a language to be considered a "structured language". Really all that is required is that the language be Turing complete without relying on "unstructured" branch statements (aka GOTOs).
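The claim above, that any recursion can become a loop with an explicit stack, can be sketched briefly; Python is used here purely for illustration, and the tree shape is an invented example:

```python
def tree_sum_recursive(node):
    """Sum a nested-tuple tree: a node is (value, list_of_children)."""
    value, children = node
    return value + sum(tree_sum_recursive(c) for c in children)

def tree_sum_iterative(node):
    """Same computation using an explicit stack instead of the call stack."""
    total, stack = 0, [node]
    while stack:
        value, children = stack.pop()
        total += value
        stack.extend(children)
    return total

# A small tree: 1 with children 2 and 3; 3 has child 4.
tree = (1, [(2, []), (3, [(4, [])])])
```

Both functions compute the same sum; the iterative one has simply made the bookkeeping the call stack was doing explicit.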
Now with all these caveats, most of the early "Structured Programming" developers were also at the same time involved in the development and use of ALGOL. This includes most prominently Dijkstra, who was the leading light in both the Structured Programming movement and developed the first ALGOL 60 compiler. Hoare and Dahl were also very involved in the development of both.
The other major languages that were extant at the time were COBOL and FORTRAN. Early FORTRAN in particular made extensive use of GOTOs, along with data overlays that are essentially the data equivalent of GOTOs, and thus a large amount of the development of ALGOL was in reaction to FORTRAN. So if you are looking for the early standard-bearer for the paradigm, ALGOL is probably your language.
This is a copy-paste of my answer to the other question. R. Schmitz said I could do it though.
If we are talking about a block structured conditional like
if condition
    some arbitrary sequence of statements, including perhaps nested ifs
else
    some other arbitrary sequence of statements, including perhaps nested ifs
rather than just if ... goto some label which is no different to a conditional branch, then two early contenders would be:
Lisp (1958), which has an if and a cond function. I think cond predates if. When I learned Lisp in the 1980's I'm fairly sure that if wasn't there. Lisp also had recursion, which means it had iteration by definition.
Algol 60 (1960) which has the structured if inherited by most modern imperative languages. Also while and for loops and recursion.
Early versions of FORTRAN and COBOL did not have structured if statements as far as I know or recursion.
|
||||
4768
|
dbpedia
|
1
| 10
|
http://groups.umd.umich.edu/cis/course.des/cis400/maxim/lectures/chp2.htm
|
en
|
Chapter 2
|
[] |
[] |
[] |
[
""
] | null |
[] | null | null |
Evolution of the Major Programming Languages
Zuse's Plankalkül
� Plankalkül is the name of a programming language written by a German scientist by the name of Konrad Zuse, between 1936 and 1945
� This language was never implemented
� Its name means program calculus
MINIMAL HARDWARE PROGRAMMING: PSEUDOCODE
� Deficiencies in machine code and assembly language code led to the development of somewhat higher-level languages
� Short Code – developed by John Mauchly in 1949
- Implemented with a pure interpreter
- Simplified the programming process; however, it was approximately 50 times slower than machine code
� Speedcoding – developed by John Backus in 1954
- Extended machine languages to include floating-point operations
- Backus claimed that problems that could take two weeks to program in machine code could be programmed in a few hours using Speedcoding
� The UNIVAC "compiling system" – developed by a team led by Grace Hopper between 1951 and 1953
- It made machine code source programs much shorter; however it was still quite primitive
THE IBM 704 AND FORTRAN
� Historical Background
- The first widely accepted high-level language was FORTRAN
- FORTRAN was developed in large measure for the capabilities of the IBM 704 system
- The IBM 704 system was first introduced in 1954
� Design Process
- Plans for FORTRAN were actually begun before the announcement of the IBM 704 system
- The report titled "The IBM Mathematical FORmula TRANslating System" described the first version of FORTRAN
- It stated that FORTRAN could provide the efficiency of hand-coded programs and the ease of programming the interpretive pseudocode systems
- It also stated that it would eliminate coding errors and the debugging errors
� FORTRAN I Overview
- All of FORTRAN I�s control statements were based on 704 instructions
- There are no data-typing statements; instead variables whose names began with I, J, K, L, M, and N are implicitly integer type, and all others were implicitly floating-point type
- FORTRAN's claim that the machine code produced by the compiler was as efficient as what could be produced by hand was nearly achieved
� FORTRAN II Overview
- Introduced in the spring of 1958
- Fixes a lot of the bugs in FORTRAN I and adds some significant features, such as, independent compilation of subroutines
� FORTRAN IV, FORTRAN 77, and FORTRAN 90
- FORTRAN III, although developed, was never a big success
- FORTRAN IV became one of the most widely distributed programming languages until 1978
- FORTRAN IV is a big improvement over FORTRAN II because of additions to it, such as, explicit type declaration for variables, a logical IF construct, and the capability of passing subprograms
- FORTRAN 77, introduced in 1977, retains most of the FORTRAN IV features and adds to it character string handling, logical loop control statements, and an IF with an optional ELSE clause
- FORTRAN 90, the latest version of FORTRAN, is dramatically different from FORTRAN 77
- It supplies a collection of built in functions for array operations
- Arrays can be dynamically allocated and deallocated on command
- New control statements, such as, CASE, EXIT, and CYCLE are added
- Recursion is now possible
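The FORTRAN I implicit-typing rule described above (names beginning with I, J, K, L, M, or N are integers; all others are floating point) is simple enough to state as a few lines of code. This is an illustrative sketch in Python, not part of any FORTRAN compiler:

```python
def fortran_i_implicit_type(name):
    """Return the implicit FORTRAN I type of a variable name:
    first letter I, J, K, L, M, or N => INTEGER; anything else => REAL."""
    first = name[0].upper()
    return "INTEGER" if first in "IJKLMN" else "REAL"
```

So a variable named KOUNT would be an integer while ALPHA would be floating point, with no declaration statement anywhere in the program.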
FUNCTIONAL PROGRAMMING: LISP
� The first functional programming language was invented to provide language features for list processing, which grew out of the applications in artificial intelligence
� The Beginnings of Artificial Intelligence and List Processing
- Interest in AI appears in the mid-1950s
- This interest comes from the study of linguistics, psychology, and mathematics
- Linguistics because of the interest in the processing of a natural language
- Psychology because of the interest in modeling human information storage and retrieval, along with other brain processes
- Mathematics because of the interest in mechanizing certain intelligent processes
- The concept of list processing was introduced in 1956 by Allen Newell, J.C. Shaw, and Herbert Simon
- A few languages are developed and implemented for list processing but none become really popular until the development of LISP
� LISP Design Process
- Developed by John McCarthy and Marvin Minsky at MIT, because of their dislike for the other languages developed for AI
� Language Overview
- There are only two data structures, atoms and lists
- Atoms are either symbols or numeric literals
- Lists are specified by delimiting their elements with parentheses and may be simple or nested lists
- A list is stored internally as a single-linked list structure
� Processes in Functional Programming
- All computation in a functional program is accomplished by applying functions to arguments
- Iterative processes can be specified with recursive function calls, which means no loops
� The Syntax of LISP
- Program code and data have exactly the same form
� Evaluation
- LISP is still the most widely used language for AI
- In the 1970s and 1980s many different dialects of LISP were developed, which led to the problem of portability
- In order to solve LISP's portability issue, COMMON LISP was developed as a standard version
� Two Descendants of LISP
- These two dialects are Scheme and COMMON LISP
- Scheme was developed in 1975
- It is relatively small in size and uses static scoping exclusively
- Scheme functions can be values of expressions and elements of lists; they can be assigned to variables, passed as parameters, and returned as values.� These features were not available in the early versions of LISP
- COMMON LISP was developed in 1984
- It takes the features of a few dialects and of Scheme and combines them into one language; while Scheme is a relatively small language, COMMON LISP is a relatively large language and uses dynamic as well as static scoping
� Related Languages
- Three related functional programming languages are ML, Miranda, and Haskell
- MetaLanguage, or ML, was designed in the 1980s by Robin Milner
- ML is a functional language, which supports imperative programming
- Miranda was designed by David Turner in the 1980s as well; it is based on ML, SASL, and KRC
- Haskell is based on Miranda and was designed by Hudak and Fasel in 1992
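The point above, that iteration in a functional program is expressed through recursive function application rather than loops, can be illustrated with a small sketch; Python stands in for LISP here, so this is an analogy rather than actual LISP code:

```python
def length(lst):
    """List length by recursion, LISP-style head/rest decomposition; no loop."""
    return 0 if not lst else 1 + length(lst[1:])

def my_map(fn, lst):
    """Apply fn to every element by recursion instead of iteration."""
    return [] if not lst else [fn(lst[0])] + my_map(fn, lst[1:])
```

Each call handles the first element and recurses on the rest of the list, which is exactly how early LISP expressed what imperative languages write as a loop.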
THE FIRST STEP TOWARD SOPHISTICATION: ALGOL 60
� Historical Background
- Efforts to design a universal language inspired the development of the programming language ALGOL 60
- In the late 1950s, all of the languages that had been developed thus far were for single architectures only
- Two user groups SHARE and USE showed an interest in forming a committee through the ACM organization, in order to study and recommend the design and implementation of a universal programming language
- Another group in Europe (GAMM) also was interested and in 1958 the ACM committee along with GAMM officially agreed to a joint language design project
- FORTRAN was not considered because of its sole ownership by IBM.
� Early Design Process
- The syntax of the language should be close to standard mathematical notation
- It should be possible to use the language for the description of computing processes in publications
- And programs in the new language must be mechanically translatable into machine language
� ALGOL 58 Overview
- The first language designed in this project was ALGOL 58
- ALGOL comes from the original name of the language, which is the International ALGOrithmic Language
- Many of the features in ALGOL 58 were derived from FORTRAN
- It generalized FORTRAN�s features and added several new constructs and concepts
� ALGOL 60 Design Process
- ALGOL 58 was furiously debated, resulting in modifications and additions to the language and the development of ALGOL 60
� ALGOL 60 Overview
- ALGOL 60 is result of the modifications made to ALGOL 58
- Introduces the concept of block structure, which allows parts of programs to be localized
- Passing parameters by means of pass by value and pass by name are introduced
- Procedures are allowed to be recursive
- Stack-dynamic arrays are now allowed
� ALGOL 60 Evaluation
- ALGOL 60 succeeded in becoming the only acceptable formal means of communicating algorithms
- All imperative languages formed after 1960 are either direct or indirect descendants to ALGOL 60
- It was the first language whose syntax was formally described
- Unfortunately, ALGOL 60 never achieved widespread use, because it was too flexible and it lacked input and output statements
COMPUTERIZING BUSINESS RECORDS: COBOL
� Although COBOL has been used more than any other programming language, it has had little effect on the design of any other languages, with the exception of PL/I.
� Historical Background
- Similar to ALGOL, COBOL was designed by a committee
- Three other languages for business applications that existed before COBOL were FLOW-MATIC, AIMACO, and COMTRAN
� COBOL Design Process
- The biggest concern regarding this new application language was that it be easy to use, even at the expense of being less powerful
- The language specifications for COBOL were published in 1960
� Evaluation
- COBOL originated a number of concepts, such as, constructs for macros, implementation of hierarchical data structures, allowed connotative names
- It was the first language whose use was mandated by the Department of Defense
- The poor performance of the early compilers made COBOL expensive to use, fortunately with the advent of better compiler designs and the mandate of the Defense Department COBOL became very popular
THE BEGINNINGS OF TIME SHARING: BASIC
� Similar to COBOL, BASIC has had widespread use, however it has gotten little respect
� Design Process
- BASIC was designed at Dartmouth College in the 1960s by John Kemeny and Thomas Kurtz
- The goals of the BASIC system were to make it easy for non-science students to learn, be pleasant and friendly, provide fast turnaround for homework, allow free and private access, and consider user time more important than computer time
- The combination of the second, third, and fourth goals mentioned above led to the timeshared aspect of BASIC
� Language Overview
- The original version was very small having only fourteen different statements, and was not interactive, for there was no way to get user input from the terminal
- Overall, it was very limited making it very easy to learn
� Evaluation
- BASIC was the first widely used method of remote access to a computer
- Most of its design came from FORTRAN with some minor influence by ALGOL 60
- Reasons for BASIC's success are probably the ease with which it can be learned and implemented
- Two contemporary versions of BASIC are QuickBasic and Visual BASIC
EVERYTHING FOR EVERYBODY: PL/I
� PL/I represents the first large-scale attempt to design a language that could be used for a broad spectrum of application areas
� Historical Background
- Developed as an IBM product
- Designed to replace FORTRAN, COBOL, LISP and the systems applications of assembly language
� Design Process
- The first version of PL/I was originally named FORTRAN VI because the initial design was just an extension of FORTRAN IV
- The design along with the name was quickly dropped and the new design was called NPL for new programming language
- The name was changed to PL/I to avoid confusing the name NPL with that of the National Physical Laboratory in England
� Language Overview
- PL/I included the best of ALGOL 60 (recursion and block structure), FORTRAN IV (separate compilation with communication through global data), and COBOL (data structures, input/output, and report generating facilities), along with a few new constructs
- PL/I was the first language to allow programs to create concurrently executing tasks, to detect and handle 23 different types of exceptions, to use procedures recursively, to include pointers as a data type, and to reference cross sections of arrays
� Evaluation
- The problem with PL/I is its complexity due to its large size
- It also suffered from what are now considered to be poorly designed constructs
- However, it did get significant use in both business and scientific applications
TWO EARLY DYNAMIC LANGUAGES: APL AND SNOBOL
� Neither of these languages is based on any previous language, and neither has had much effect on any later languages
� The only similarity between APL and SNOBOL is their dynamic typing and dynamic storage allocation
� Origins and Characteristics of APL
- Kenneth E Iverson designed APL in 1960
- It was intended to be a vehicle for describing computer architecture not as a programming language
- It has many powerful operators that allow arrays to be manipulated as if they were scalar variables; however, this creates a problem for implementers of the language
- Although it is not widely used, it is still in use today.
� Origins and Characteristics of SNOBOL
- Designed by D.J. Farber, R.E. Griswold, and F.P. Polensky in the early 1960s
- It was designed specifically for text processing.
- However, it is rarely used nowadays because it is slower than some other languages
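The one trait the two languages share, dynamic typing with dynamic storage allocation, can be demonstrated in any dynamically typed language; the Python snippet below is a stand-in for APL/SNOBOL semantics, not code in either language:

```python
# Under dynamic typing, the type belongs to the value a variable currently
# holds, not to the variable itself, and storage is allocated on the fly.
x = 42
types_seen = [type(x).__name__]

x = "now a string"        # same name rebinds to a new type and new storage
types_seen.append(type(x).__name__)

x = [1, 2, 3]             # and again, to a heap-allocated list
types_seen.append(type(x).__name__)
```

A statically typed language such as FORTRAN or ALGOL 60 would reject the second and third assignments at compile time.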
THE BEGINNINGS OF DATA ABSTRACTION: SIMULA 67
� Design Process
- Kristen Nygaard and Ole-Johan Dahl developed Simula I between 1962 and 1964
- It was designed exclusively for system simulation
� Language Overview
- SIMULA 67 is an extension of ALGOL 60
- Allowed for support of coroutines using the class construct, which is important because the idea of data abstraction starts here
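Coroutines of the kind SIMULA 67's class construct supports let routines suspend and resume one another; a rough modern analogue (illustrative only, not SIMULA code) is a Python generator driven by a consumer:

```python
def producer(n):
    """Yields values one at a time, suspending itself between yields."""
    for i in range(n):
        yield i * 10

def consumer(gen):
    """Resumes the producer coroutine-style, collecting what it yields."""
    collected = []
    for item in gen:       # each iteration transfers control to the producer
        collected.append(item)
    return collected

result = consumer(producer(3))
```

Control alternates between the two routines, each picking up where it left off, which is the essence of the coroutine idea SIMULA introduced for simulation.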
ORTHOGONAL DESIGN: ALGOL 68
� Design Process
- Introduces the design criteria of orthogonality, which results in features, such as, the inclusion of user-defined data types and implicit heap-dynamic arrays
� Evaluation
- Introduces a significant number of features not previously used
- Although it may be contrasted with PL/I, PL/I gained a far greater acceptance than ALGOL 68
SOME IMPORTANT DESCENDANTS OF THE ALGOLS
� All imperative languages including the object-oriented languages, designed since 1960 owe some of their design to ALGOL 60
� Pascal
- Designed by Niklaus Wirth
- Highly used for teaching programming; because of this it lacks some features, such as, separate compilation and passing an array of variable length
� C
- Originally designed for systems programming by Dennis Ritchie in 1972
- Because of its adequate control structures and data structuring facilities C is adaptive to many application areas
- C is both liked and disliked for its lack of complete type checking
- C tends to be a very flexible language
� Modula-2
- Designed by Niklaus Wirth in 1976 (also designed Pascal)
- Distinguishing features include modules, which provide support for abstract data types, procedures as types, and low-level facilities for systems programming and coroutines
� Modula-3
- Based on Modula-2 and Modula-2+
- To Modula-2 it adds classes and objects for support of object-oriented programming, exception handling, garbage collection, and support for concurrency
� Oberon
- Loosely based on Modula-2, Oberon was written by Niklaus Wirth
- Although many features were added to Modula-2 to come up with Oberon many were subtracted also.
- The main feature added is type extension, which supports object oriented programming
- Among those removed are variant records, opaque types, enumeration types, sub-range types, the CARDINAL type, non-integer array indexes, the with statement, and the for statement
� Delphi
- A hybrid language, similar to C++ in that it was created by adding object-oriented support, along with other things, to an existing imperative language
- It is derived from Pascal, therefore making it much more elegant and safe than C++
PROGRAMMING BASED ON LOGIC: PROLOG
� The use of a formal logic notation to communicate computational processes to a computer
� Design Processes
- Alain Colmerauer, Phillip Roussel, and Robert Kowalski developed the design of Prolog
- Prolog uses a method for specifying predicate calculus propositions and a restricted form of resolution
� Language Overview
- Prolog's programs consist of a collection of statements or facts
- The most common use of Prolog is as a kind of database
� Evaluation
- There are two major reasons why logic programming has not become widespread
- The first reason is that it has been proven to be highly inefficient
- The second reason is that it has been shown to be effective for only a few areas of application, such as certain kinds of database management and some areas of AI
HISTORY�S LARGEST DESIGN EFFORT: ADA
� Ada was originally developed for the Department of Defense
� Historical Background
- The Army, Navy, and Air Force all proposed the development of a high-level language for embedded systems in 1974 as an attempt to standardize their own embedded systems
� Design Process
- The committee assigned to this task was responsible for identifying requirements for a new Department of Defense high-level language, evaluating the existing languages to determine whether there was a viable candidate, and recommend adoption or implementation of a minimal set of programming languages
- The resulting language went through multiple phases and was named Ada
� Language Overview
- There are four major features of the Ada language
- Packages provide the means for encapsulation of data objects
- It includes extensive facilities for exception handling
- Allows program units to be generic
- It provides for concurrent execution of special program units and named tasks
� Evaluation
- The Ada language contains most of the concepts of software engineering and language design
- The development of a compiler was a difficult task; not until almost four years after the design of the Ada language was completed did an Ada compiler appear
OBJECT-ORIENTED PROGRAMMING: SMALLTALK
� Design Process
- The concepts that led to the development of Smalltalk came out of the Ph.D. dissertation work of Alan Kay
� Language Overview
- The program units of Smalltalk are objects
- All computing in Smalltalk is done in the same fashion: passing messages to objects in order to invoke one of the methods associated with that object
COMBINING IMPERATIVE AND OBJECT-ORIENTED FEATURES: C++
� C++ uses C as a base for which to support many of the concepts introduced by Smalltalk
� Design Process
- Modifications to C included the addition of function parameter type checking and conversion, and classes
- Later the additions included inline functions, default parameters, and overloading of assignment operators
- With the above additions to C came the name C with Classes
- Not until the inclusion of virtual functions, was the name changed to C++
� Language Overview
- C++ provides a collection of predefined classes, along with the possibility of user-defined classes
- Operators in C++ may be overloaded, meaning that the user may define new meanings for existing operators
- Dynamic Binding is provided by virtual class functions
- Both functions and classes can be templated, which means that they can be parameterized
- It also includes exception handling
� Evaluation
- C++ has become extremely popular
- On the downside, C++ is extremely large and suffers some drawbacks similar to PL/I
� A Related Language: Eiffel
- Like C++ Eiffel combines both imperative and object-oriented features
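Operator overloading as described for C++ above, where the user gives new meanings to existing operators for a class type, can be sketched in Python, which exposes the same idea through special methods (an analogy, not C++ syntax):

```python
class Vec2:
    """A 2-D vector with '+' and '==' overloaded for the type."""
    def __init__(self, x, y):
        self.x, self.y = x, y

    def __add__(self, other):      # gives '+' a meaning for Vec2 operands
        return Vec2(self.x + other.x, self.y + other.y)

    def __eq__(self, other):       # gives '==' a meaning for Vec2 operands
        return (self.x, self.y) == (other.x, other.y)

# The built-in '+' now applies to a user-defined type.
v = Vec2(1, 2) + Vec2(3, 4)
```

In C++ the same effect is achieved by defining `operator+` and `operator==` for the class.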
PROGRAMMING THE WORLD WIDE WEB: JAVA
� Based on C++ however, numerous constructs have been removed, some changed, and a few others added
� The resulting language provides the power and flexibility of C++, however, it is smaller, simpler, and safer
� Design Process
- Java was designed for an application for which there appeared to be no satisfactory existing language
- This started out as a programming language for embedded consumer electronics
- Later it proved to be a useful tool in programming for the web
� Language Overview
- Java employs both types and classes
- It does not have pointers, although its reference types provide some of the capabilities of pointers
- Uses a primitive Boolean type mainly for the control expressions of its control statements
- There are no record, union, or enumeration types
- Supports only single inheritance
- Provides garbage collection
� Evaluation
- Designers of Java have eliminated many of the features that are considered unsafe in C++
- Java is still considered a complex language and its lack of multiple inheritance is said to lead to some peculiar designs
- The use of Java has increased much faster than the use of any other language
Chapter 2 Evolution of the Major Programming Languages (Review Questions)
Source: https://chii90.wordpress.com/2013/03/09/chapter-2-evolution-of-the-major-programming-languages-review-questions/ (March 9, 2013)
1. In what year was Plankalkül designed? In what year was that design published?
= Plankalkül was designed by Konrad Zuse in 1945, but the design was not published until 1972.
2. Mention an interesting feature of Zuse’s programs.
= An interesting feature of Zuse’s programs was the inclusion of mathematical expressions showing the current relationships between program variables. These expressions stated what would be true during execution at the points in the code where they appeared.
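Those embedded relationship expressions are close in spirit to modern assertions. A hypothetical sketch in Python (the `integer_sqrt` function is our own example, not Zuse's):

```python
# A modern analogue of Zuse's embedded relationships: an assertion
# states what must be true of the program's variables at the point in
# the code where it appears.

def integer_sqrt(n):
    """Return the largest r with r*r <= n, checking the invariant."""
    r = 0
    while (r + 1) * (r + 1) <= n:
        r += 1
        assert r * r <= n  # relationship that holds here during execution
    return r

print(integer_sqrt(10))  # 3
```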
3. What does Plankalkül mean?
= Konrad Zuse’s language Plankalkül means program calculus because it is used to solve a wide variety of problems by writing algorithms in the language.
4. Speedcoding was invented to overcome two significant shortcomings of the computer hardware of the early 1950s. What were they?
= The lack of floating-point operations and the lack of automatic incrementing of address registers; Speedcoding supplied both through pseudoinstructions.
5. What is the number of bits in a single word of the UNIVAC I’s memory? How are the bits grouped?
= The words of the UNIVAC I’s memory had 72 bits, grouped as 12 six-bit bytes.
6. What hardware capability that first appeared in the IBM 704 computer strongly affected the evolution of programming languages? Explain why?
= Indexing and floating-point instructions in hardware. Before the 704, floating-point operations had to be simulated in software, and that simulation was so time-consuming that the overhead of interpretation was relatively insignificant; once the 704 provided floating point and indexing in hardware, interpretation became too costly, and many programmers preferred the efficiency of hand-coded machine (or assembly) language. This encouraged the development of compiled high-level languages.
7. Who developed Speedcoding system for the IBM 701?
= The Speedcoding system was developed by John Backus for the IBM 701. The Speedcoding interpreter effectively converted the 701 to a virtual three-address floating-point calculator.
8. Who developed Short Code? Why is Short Code called automatic programming?
= Short Code was developed by John Mauchly in 1949 for the BINAC computer, one of the first successful stored-program electronic computers. It is called automatic programming because it was never translated to machine code; it was implemented with a pure interpreter, which greatly simplified the programming process.
9. Under what environmental consideration was FORTRAN developed? Which is the first version of FORTRAN?
= FORTRAN was developed under the following environmental considerations: (1) computers had small memories and were slow and relatively unreliable; (2) the primary use of computers was for scientific computations; (3) there were no existing efficient and effective ways to program computers; (4) because of the high cost of computers compared to the cost of programmers, speed of the generated object code was the primary goal of the first FORTRAN compilers. The first version was FORTRAN 0 (1954), which was never implemented.
10. What was the most significant feature added to FORTRAN I to get FORTRAN II?
= The most significant feature added to FORTRAN I to get FORTRAN II was the independent compilation of subroutines.
11. What control flow statements were added to FORTRAN IV to get FORTRAN 77?
= Logical loop control statements and an IF with an optional ELSE clause.
12. Which version of FORTRAN was the first to have any sort of dynamic variables?
= The first version to have any sort of dynamic variables was FORTRAN 90, which has dynamic arrays, records, pointers, a multiple-selection statement, and modules.
13. Which version of FORTRAN was the first to have character string handling?
= The first version to have character string handling was FORTRAN 77.
14. Why were linguists interested in artificial intelligence in the late 1950s?
= Linguists were interested because they were concerned with natural language processing.
15. What are the different data types and structures in Common LISP?
= The different data types and structures in Common LISP are records, arrays, complex numbers, and character strings. It also has a form of packages for modularizing collections of functions and data and for providing access control.
16. In what way are Scheme and Common LISP opposites of each other?
= Scheme is small and simple and uses static scoping exclusively, whereas Common LISP is large and complex and supports dynamic as well as static scoping; they are opposites in size and complexity.
17. What dialect of LISP is used for introductory programming courses at some universities?
= Scheme
18. What two professional organizations together designed ALGOL 60?
= They are GAMM (a German acronym for Society for Applied Mathematics and Mechanics) and ACM (Association for Computing Machinery).
19. What was the goal for developing C?
= C was designed primarily as a systems implementation language, in particular for writing the Unix operating system and its utilities.
20. What were the significant modifications to ALGOL 58 to produce ALGOL 60?
= Block structure was introduced (a way of delimiting the scopes of variables and control statements); parameters could be passed by value or by name; recursive procedures were allowed; and stack-dynamic arrays were allowed.
21. What language was designed to describe the syntax of ALGOL 60?
= BNF (Backus-Naur Form), introduced by John Backus, was designed to describe the syntax of ALGOL 60.
22. On what language was COBOL based?
= COBOL was based on FLOW-MATIC language. FLOW-MATIC was being implemented in 1957.
23. In what year did the COBOL design process begin?
= the COBOL design process began in 1959 but it was first named CBL (Common Business Language).
24. What data structure that appeared in COBOL originated with Plankalkül?
= Hierarchical data structures (records).
25. What organization was most responsible for the early success of COBOL (in term of extent of use)?
= DoD (Department of Defense) because it was published by the Government Printing Office in April 1960.
26. Which data type does the original BASIC language support?
= The original Dartmouth BASIC supported only numeric and string data types. There was no integer type. All numeric variables were floating-point. Strings were dynamic length. Arrays of both numbers and strings were supported, as well as matrices (two dimensional arrays).
27. Why was BASIC an important language in the early 1980s?
= It was an important language because it provided a simple way of building graphical user interfaces (GUIs).
28. PL/I was designed to replace what two languages?
= It was designed to replace COBOL and FORTRAN IV.
29. For what new line of computers was PL/I designed?
= It was designed for the IBM System/360 line of computers.
30. What features of SIMULA 67 are now important parts of some object-oriented languages?
= Data abstraction and the class construct. SIMULA 67 is an extension of ALGOL 60, taking both block structure and the control statements from that language. To provide support for coroutines in SIMULA 67, the class construct was developed. This was an important development because the concept of data abstraction began with it, and data abstraction provides the foundation for object-oriented programming.
31. What innovation of data structuring was introduced in ALGOL 68 but is often credited to Pascal?
= User defined data types. User defined data types are valuable because they allow the user to design data abstractions that fit particular problems very closely.
32. What design criterion was used extensively in ALGOL 68?
= Orthogonality. ALGOL 68 includes a significant number of features that had not been previously used. Its use of orthogonality, which some may argue was overdone, was nevertheless revolutionary.
33. What language introduced the case statement?
= ALGOL-W, the Wirth and Hoare version of ALGOL 60. It was implemented at Stanford University and was used primarily as an instructional vehicle, at only a few universities. The primary contributions of ALGOL-W were the value-result method of passing parameters and the case statement for multiple selection.
34. What operators in C were modeled on similar operators in ALGOL 68?
= ALGOL 68’s influence appears in C’s for and switch statements, in its assigning operators (such as += and *=), and in its treatment of pointers.
35. What are two characteristics of C that make it less safe than Pascal?
= Its pointers (with unrestricted pointer arithmetic) and its lack of complete type checking (for example, of parameters and array index ranges).
36. Why is Prolog called a nonprocedural language?
= Because its programs do not specify how a result is to be computed; they state facts and rules from which the system infers results.
37. What are the two kinds of statements that populate a Prolog database?
= The database of Prolog program consists of two kinds of statements: facts and rules.
38. What is the primary application area for which Ada was designed?
= Embedded system. The Ada language was developed for the Department of Defense (DoD), so the state of their computing environment was instrumental in determining its form. By 1974, over half of the applications of computers in DoD were embedded systems. An embedded system is one in which the computer hardware is embedded in the device it controls or for which it provides services.
39. What are the concurrent program units of Ada called?
= Concurrent execution of special program units, named tasks, using the rendezvous mechanism.
40. What C++ construct provides support for abstract data types?
= The class construct, which provides the means for encapsulating data objects, specifications for data types, and procedures.
42. What three concepts are the basis for object-oriented programming?
= classes, objects, and methods.
43. Why does C++ include the features of C that are known to be unsafe?
= A second important goal was that there should be little or no performance penalty relative to C. A third goal of C with Classes was that it could be used for every application for which C could be used, so virtually none of the features of C would be removed, not even those considered to be unsafe.
44. From what language does Objective-C borrow its syntax for method calls?
= Smalltalk
45. What programming paradigm that nearly all recently designed languages support is not supported by Go?
= Traditional object-oriented programming; Go has no form of inheritance.
46. What is the primary application for Objective-C?
= MacOS/iOS – iPhone
47. What language designer worked on both C and Go?
= Ken Thompson
48. What do the Ada and COBOL languages have in common?
= Both were popularized by the Department of Defense.
49. What was the first application for Java?
= A programming language for embedded consumer electronic devices.
50. What characteristic of Java is most evident in Java Script?
= Similar syntax.
51. How does the typing system of PHP and JavaScript differ from that of Java?
= PHP and JavaScript use dynamic typing, and their strings and arrays are dynamic, whereas Java is statically typed.
52. What array structure is included in C# but not in C, C++ or Java?
= rectangular arrays.
53. What two languages was the original version of Perl meant to replace?
= ksh and awk. Perl language was originally a combination of ksh and awk.
54. For what application area is JavaScript most widely used?
= JavaScript is most widely used for Web programming.
55. What is the relationship between Java Script and PHP , in term of their use?
= They both complement each other. PHP or any server-side language is mandatory for client-server interaction while Javascript is optional and not required. You could easily build a site using PHP & HTML but all the pretty things and effects which make your site attractive are provided by JavaScript along with CSS.
56. PHP’s primary data structure is a combination of what two data structures from other languages?
= PHP’s arrays are a combination of JavaScript’s arrays and Perl’s hashes.
57. What data types does Java support?
= The data types in Java are divided into two categories: Primitive data types (integer data types and floating data types) and reference data types (array, class, interface).
58. What characteristic does C++ share with .NET?
= Through Managed C++ (C++/CLI), C++ provides access to the functionality of the .NET Framework.
59. What characteristic of Ruby’s arithmetic operators makes them unique among those of other languages?
= In Ruby, the arithmetic operators are actually methods of the numeric objects, so they can be redefined by the programmer.
60. How does Java provide storage deallocation?
= Java does not allow explicit deallocation of storage. There is no explicit deallocation operator (such as “delete”). Since storage cannot be explicitly deallocated, you cannot have a dangling reference in Java.
61. Is Lua normally compiled, purely interpreted or impurely interpreted?
= purely interpreted.
63. What deficiency of the switch statement of C is addressed with the changes made by C# to that statement?
= In C, control implicitly falls through from one selectable segment of code to the next unless the segment ends with an explicit branch, which has caused innumerable programming errors; C# requires each segment to end with an explicit branch.
64. What is the primary platform on which C# is used?
= .NET. The purpose of C# is to provide a language for component-based software development, specifically for such development in the .NET Framework. In this environment, components from a variety of languages can be easily combined to form systems.
65. What are the inputs to an XSLT processor?
= An XML data document and XSLT document.
66. What is the output of an XSLT processor?
= XML document
67. What element of the JSTL is related to a subprogram?
= Servlet, JSP.
68. To what is a JSP document converted by a JSP processor?
= A servlet
69. Where are .jsp files executed?
= On the web server.
Notes on ALGOL
Source: https://unacademy.com/content/bank-exam/study-material/computer-knowledge/algol/ (Unacademy, May 19, 2022)
ALGOL is a computer programming language that was created in 1958–60 by an international committee of the Association for Computing Machinery (ACM) chaired by Alan J. Perlis of Carnegie Mellon University for publishing algorithms and performing calculations. ALGOL, like LISP, included recursive subprograms: procedures that could solve a problem by reducing it to a smaller problem of the same type. ALGOL stands for algorithmic language, and the language was previously known as IAL. It is a family of portable programming languages for scientific computation that influenced many other languages. For nearly 30 years, the ACM used ALGOL as the standard notation for presenting algorithms in textbooks and academic work.
Features of ALGOL:
BNF Notation – Backus-Naur Form is a formal means for describing the syntax of a computer language. Think of it as short-hand for computer languages.
Statement Blocks – This is a syntactic means for grouping a series of programming statements together. Think of this like putting a group of related items in a box, and treating the box as a unit.
Parameter Passing – Specifically, there are two methods for passing information to and from a subprogram, pass by value, and pass by reference. Think of these as a way to protect information entering a subprogram from modification, one sends a copy, the other sends the actual data.
Dynamic Arrays – These are a list of values where the size of the list can be determined at run-time. Think of these as flexible arrays, arrays that adjust to the need.
Reserved Words – These are predefined words that have special meaning to a computer language. Think of these like your name. It represents you and the things you can do.
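As a toy illustration of the BNF notation mentioned above, consider the grammar `<expr> ::= <term> { "+" <term> }`, `<term> ::= digit`, together with a hand-written recursive-descent evaluator for it. Both the grammar and the code below are our own hypothetical sketch, not part of any ALGOL specification.

```python
# Recursive-descent evaluator for the toy BNF grammar:
#   <expr> ::= <term> { "+" <term> }
#   <term> ::= digit
# Each grammar rule becomes one function.

def evaluate(s):
    pos = 0

    def term():
        nonlocal pos
        d = int(s[pos])   # <term> ::= a single digit
        pos += 1
        return d

    def expr():
        nonlocal pos
        value = term()
        while pos < len(s) and s[pos] == "+":
            pos += 1      # consume the "+"
            value += term()
        return value

    return expr()

print(evaluate("1+2+3"))  # 6
```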
Major ALGOL specifications:
ALGOL 58
ALGOL 58, formerly known as IAL, is a member of the ALGOL programming language family. ALGOL 60 quickly surpassed it as an early compromise design.
The core concept of the compound statement was introduced in ALGOL 58, but it was limited to control flow only, and it was not tied to identifier scope in the way ALGOL 60’s blocks were.
According to Perlis, the language was first offered as IAL (International Algebraic Language), but this was rejected as an “‘unspeakable’ and pretentious acronym.”
IBM attempted to implement ALGOL 58; however, the development was soon abandoned because of rivalry with FORTRAN.
BALGOL, a version for the Burroughs 220, evolved along similar lines, but preserved much of ALGOL 58’s original personality.
The major contribution of ALGOL 58 was to later languages; it served as the foundation for JOVIAL, MAD, NELIAC, and ALGO.
ALGOL 58’s influence on ALGOL 60:
Both IAL and ALGOL 60 support arrays with arbitrary lower and higher subscript boundaries, as well as integer expressions to create subscript bounds.
Procedure declarations and their related identifier scopes can be nested in both IAL and ALGOL 60.
ALGOL 60 does not enable numeric statement labels, whereas IAL does.
A switch designator exists in both IAL and ALGOL 60, although it is unrelated to the switch statement in C and other languages.
The IAL if-statement does not have a then-clause or an else-clause; instead, it guards the next statement. IAL has an if-either statement that allows several conditions to be tested in a neat way.
Both were superseded by ALGOL 60’s if-then-else construct, with its ambiguous “dangling else.”
ALGOL 60
ALGOL 60 (short for Algorithmic Language 1960) is a computer programming language from the ALGOL family. It came after ALGOL 58, which introduced code blocks and the begin and end pairs for delimiting them, marking a significant step forward in the development of structured programming. The first language to provide nested function declarations with lexical scope was ALGOL 60. Many other programming languages arose from it, including CPL, Simula, BCPL, B, Pascal, and C. ALGOL 60 principles were used to program almost every computer of the time.
For parameter passing, ALGOL 60 provides two evaluation strategies: call-by-value and call-by-name. For each formal parameter, the procedure declaration stated which was to be used: value supplied for call-by-value and omitted for call-by-name. In contrast to call-by-reference, calling by name has certain implications.
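Call-by-name can be simulated by passing parameterless functions ("thunks") that re-evaluate the argument on every use, the idea behind ALGOL 60's Jensen's device. The sketch below is in Python, not ALGOL syntax, and the names are our own.

```python
# Simulating ALGOL 60 call-by-name with thunks: the expression
# parameter is re-evaluated each time it is used, so it "tracks"
# assignments to the loop variable i.

def by_name_sum(set_i, lo, hi, expr):
    """Sum expr for i = lo..hi, where expr is evaluated by name."""
    total = 0
    for v in range(lo, hi + 1):
        set_i(v)          # assign to the "name" parameter i
        total += expr()   # each use re-evaluates the expression
    return total

env = {"i": 0}
result = by_name_sum(lambda v: env.update(i=v),  # writer thunk for i
                     1, 3,
                     lambda: env["i"] ** 2)      # expression i*i, by name
print(result)  # 1 + 4 + 9 = 14
```

With call-by-value, the expression would be evaluated once before the call, so the summation effect would be impossible; this is exactly the distinction the procedure heading's value list controlled in ALGOL 60.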
ALGOL 68 implementations:
The Royal Radar Establishment in the United Kingdom presented ALGOL 68-R in July 1970 as the first implementation of the standard, based on the late-1968 draft Report. It was, however, only a subset of the language.
Despite its popularity on ICL computers, this version became a widely used language in military code, particularly in the United Kingdom.
CDC Netherlands released the first comprehensive implementation of the language in 1974 for the Control Data mainframe series.
There are three open-source ALGOL 68 implementations:
a68g (Algol 68 Genie), written by Marcel van der Veer and licensed under the GPLv3
algol68toc, an open-source port of ALGOL 68RS
an experimental ALGOL 68 front end for GCC, written by Jose E. Marchesi
Conclusion:
ALGOL was the first computer programming language to provide nested function declarations with lexical scope. Furthermore, it was the first programming language to pay close attention to formal language definition, and, through the Algol 60 Report, it introduced Backus–Naur form, a key formal grammar notation for language design.
Chapter 2
Source: http://groups.umd.umich.edu/cis/course.des/cis400/maxim/lectures/chp2.htm
Evolution of the Major Programming Languages
Zuse's Plankalkül
• Plankalkül is the name of a programming language written by the German scientist Konrad Zuse between 1936 and 1945
• This language was never implemented
• Its name means program calculus
MINIMAL HARDWARE PROGRAMMING: PSEUDOCODE
• Deficiencies in machine code and assembly language code led to the development of somewhat higher-level languages
• Short Code – developed by John Mauchly in 1949
- Implemented with a pure interpreter
- Simplified the programming process; however, it was approximately 50 times slower than machine code
• Speedcoding – developed by John Backus in 1954
- Extended machine languages to include floating-point operations
- Backus claimed that problems that could take two weeks to program in machine code could be programmed in a few hours using Speedcoding
• The UNIVAC "compiling" system – developed by a team led by Grace Hopper between 1951 and 1953
- It made machine code source programs much shorter; however, it was still quite primitive
THE IBM 704 AND FORTRAN
• Historical Background
- The first widely accepted high-level language was FORTRAN
- FORTRAN was developed in large measure for the capabilities of the IBM 704 system
- The IBM 704 system was first introduced in 1954
• Design Process
- Plans for FORTRAN were actually begun before the announcement of the IBM 704 system
- The report titled "The IBM Mathematical FORmula TRANslating System" described the first version of FORTRAN
- It stated that FORTRAN could provide the efficiency of hand-coded programs and the ease of programming of the interpretive pseudocode systems
- It also stated that it would eliminate coding errors and much of the debugging process
• FORTRAN I Overview
- All of FORTRAN I's control statements were based on 704 instructions
- There were no data-typing statements; instead, variables whose names began with I, J, K, L, M, or N were implicitly integer type, and all others were implicitly floating-point type
- FORTRAN's claim that the machine code produced by the compiler was as efficient as what could be produced by hand was nearly achieved
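FORTRAN I's implicit typing rule (names beginning with I through N are integers, everything else floating-point) is simple enough to state as code. A small sketch in Python, with a function name of our own choosing:

```python
# FORTRAN I implicit typing: a variable's type is determined entirely
# by the first letter of its name.

def implicit_type(name):
    """Return the FORTRAN I type implied by a variable name."""
    return "INTEGER" if name[0].upper() in "IJKLMN" else "REAL"

print(implicit_type("INDEX"))  # INTEGER
print(implicit_type("X"))      # REAL
```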
• FORTRAN II Overview
- Introduced in the spring of 1958
- Fixed a lot of the bugs in FORTRAN I and added some significant features, such as independent compilation of subroutines
• FORTRAN IV, FORTRAN 77, and FORTRAN 90
- FORTRAN III, although developed, was never a big success
- FORTRAN IV became one of the most widely distributed programming languages until 1978
- FORTRAN IV was a big improvement over FORTRAN II because of additions such as explicit type declarations for variables, a logical IF construct, and the capability of passing subprograms as parameters
- FORTRAN 77, introduced in 1977, retains most of the FORTRAN IV features and adds character string handling, logical loop control statements, and an IF with an optional ELSE clause
- FORTRAN 90, the latest version of FORTRAN, is dramatically different from FORTRAN 77
- It supplies a collection of built-in functions for array operations
- Arrays can be dynamically allocated and deallocated on command
- New control statements, such as CASE, EXIT, and CYCLE, are added
- Recursion is now possible
FUNCTIONAL PROGRAMMING: LISP
• The first functional programming language was invented to provide language features for list processing, the need for which grew out of applications in artificial intelligence
• The Beginnings of Artificial Intelligence and List Processing
- Interest in AI appeared in the mid-1950s
- This interest came from the study of linguistics, psychology, and mathematics
- Linguistics, because of the interest in processing natural language
- Psychology, because of the interest in modeling human information storage and retrieval, along with other brain processes
- Mathematics, because of the interest in mechanizing certain intelligent processes
- The concept of list processing was introduced in 1956 by Allen Newell, J. C. Shaw, and Herbert Simon
- A few languages were developed and implemented for list processing, but none became really popular until the development of LISP
• LISP Design Process
- Developed by John McCarthy and Marvin Minsky at MIT, because of their dislike for the other languages developed for AI
• Language Overview
- There are only two data structures: atoms and lists
- Atoms are either symbols or numeric literals
- Lists are specified by delimiting their elements with parentheses and may be simple or nested lists
- A list is stored internally as a singly linked list structure
• Processes in Functional Programming
- All computation in a functional program is accomplished by applying functions to arguments
- Iterative processes can be specified with recursive function calls, which means no loops
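The two ideas above (lists as linked pairs, and recursion instead of loops) can be sketched together. The sketch below is in Python, with nested tuples standing in for LISP's cons cells; the names are our own illustration.

```python
# LISP-style list processing: a list is either None (the empty list)
# or a pair (head, rest), and iteration is expressed as recursion.

def total(lst):
    if lst is None:           # base case: the empty list
        return 0
    head, rest = lst          # roughly, car and cdr
    return head + total(rest) # recurse on the rest of the list

nums = (1, (2, (3, None)))    # the list (1 2 3) as nested pairs
print(total(nums))  # 6
```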
• The Syntax of LISP
- Program code and data have exactly the same form
• Evaluation
- LISP is still the most widely used language for AI
- In the 1970s and 1980s many different dialects of LISP were developed, which led to the problem of portability
- In order to solve LISP's portability problem, COMMON LISP was developed as a standard version
• Two Descendants of LISP
- These two dialects are Scheme and COMMON LISP
- Scheme was developed in 1975
- It is relatively small in size and uses static scoping exclusively
- Scheme functions can be values of expressions and elements of lists; they can be assigned to variables, passed as parameters, and returned as values. These features were not available in the early versions of LISP
- COMMON LISP was developed in 1984
- It takes the features of a few dialects and of Scheme and combines them into one language; while Scheme is a relatively small language, COMMON LISP is a relatively large language and uses dynamic as well as static scoping
• Related Languages
- Three related functional programming languages are ML, Miranda, and Haskell
- MetaLanguage, or ML, was designed in the 1980s by Robin Milner
- ML is a functional language that supports imperative programming
- Miranda was designed by David Turner in the 1980s as well; it is based on ML, SASL, and KRC
- Haskell is based on Miranda and was designed by Hudak and Fasel in 1992
THE FIRST STEP TOWARD SOPHISTICATION: ALGOL 60
• Historical Background
- Efforts to design a universal language inspired the development of the programming language ALGOL 60
- In the late 1950s, all of the languages that had been developed thus far were for single architectures only
- Two user groups, SHARE and USE, showed an interest in forming a committee through the ACM organization in order to study and recommend the design and implementation of a universal programming language
- Another group in Europe (GAMM) was also interested, and in 1958 the ACM committee along with GAMM officially agreed to a joint language design project
- FORTRAN was not considered because of its sole ownership by IBM
• Early Design Process
- The syntax of the language should be close to standard mathematical notation
- It should be possible to use the language for the description of computing processes in publications
- Programs in the new language must be mechanically translatable into machine language
• ALGOL 58 Overview
- The first language designed in this project was ALGOL 58
- ALGOL comes from the original name of the language, which is the International ALGOrithmic Language
- Many of the features in ALGOL 58 were derived from FORTRAN
- It generalized FORTRAN's features and added several new constructs and concepts
• ALGOL 60 Design Process
- ALGOL 58 was furiously debated, resulting in modifications and additions to the language and the development of ALGOL 60
• ALGOL 60 Overview
- ALGOL 60 is the result of the modifications made to ALGOL 58
- It introduces the concept of block structure, which allows parts of programs to be localized
- Passing parameters by means of pass by value and pass by name is introduced
- Procedures are allowed to be recursive
- Stack-dynamic arrays are now allowed
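Block structure, lexical scope, and recursion, the three ALGOL 60 contributions listed above, can be sketched in a few lines. The sketch below is in Python, whose nested functions behave much like ALGOL 60's nested procedures; the names are our own.

```python
# Block structure and lexical scope: an inner procedure sees the
# variables of the enclosing block, and each recursive call gets its
# own activation (fresh locals), as ALGOL 60 first allowed.

def outer():
    limit = 3                 # local to the enclosing block

    def count_down(n):        # nested procedure, lexically scoped
        if n == 0:
            return []
        return [n] + count_down(n - 1)   # recursion, ALGOL 60 style

    return count_down(limit)  # the inner procedure reads `limit`

print(outer())  # [3, 2, 1]
```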
• ALGOL 60 Evaluation
- ALGOL 60 succeeded in becoming the only acceptable formal means of communicating algorithms
- All imperative languages designed after 1960 are either direct or indirect descendants of ALGOL 60
- It was the first language whose syntax was formally described
- Unfortunately, ALGOL 60 never achieved widespread use, because it was too flexible and it lacked input and output statements
COMPUTERIZING BUSINESS RECORDS: COBOL
• Although COBOL has been used more than any other programming language, it has had little effect on the design of any other languages, with the exception of PL/I
• Historical Background
- Similar to ALGOL, COBOL was designed by a committee
- Three other languages for business applications that existed before COBOL were FLOW-MATIC, AIMACO, and COMTRAN
• COBOL Design Process
- The biggest concern regarding this new application language was that it be easy to use, even at the expense of being less powerful
- The language specifications for COBOL were published in 1960
• Evaluation
- COBOL originated a number of concepts, such as constructs for macros, implementation of hierarchical data structures, and connotative names
- It was the first language whose use was mandated by the Department of Defense
- The poor performance of the early compilers made COBOL expensive to use; fortunately, with the advent of better compiler designs and the mandate of the Defense Department, COBOL became very popular
THE BEGINNINGS OF TIME SHARING: BASIC
• Similar to COBOL, BASIC has had widespread use; however, it has gotten little respect
• Design Process
- BASIC was designed at Dartmouth College in the 1960s by John Kemeny and Thomas Kurtz
- The goals of the BASIC system were to make it easy for non-science students to learn, be pleasant and friendly, provide fast turnaround for homework, allow free and private access, and consider user time more important than computer time
- The combination of the second, third, and fourth goals mentioned above led to the timeshared aspect of BASIC
• Language Overview
- The original version was very small, having only fourteen different statements, and was not interactive, for there was no way to get user input from the terminal
- Overall, it was very limited, making it very easy to learn
• Evaluation
- BASIC was the first widely used method of remote access to a computer
- Most of its design came from FORTRAN, with some minor influence by ALGOL 60
- Reasons for BASIC's success are probably the ease with which it can be learned and implemented
- Two contemporary versions of BASIC are QuickBASIC and Visual Basic
EVERYTHING FOR EVERYBODY: PL/I
• PL/I represents the first large-scale attempt to design a language that could be used for a broad spectrum of application areas
• Historical Background
- Developed as an IBM product
- Designed to replace FORTRAN, COBOL, LISP, and the systems applications of assembly language
• Design Process
- The first version of PL/I was originally named FORTRAN VI, because the initial design was just an extension of FORTRAN IV
- That design, along with the name, was quickly dropped, and the new design was called NPL, for New Programming Language
- The name was later changed to PL/I to avoid confusing NPL with the National Physical Laboratory in England
• Language Overview
- PL/I included the best of ALGOL 60 (recursion and block structure), FORTRAN IV (separate compilation with communication through global data), and COBOL (data structures, input/output, and report-generating facilities), along with a few new constructs
- PL/I was the first language to allow programs to create concurrently executing tasks, to detect and handle 23 different types of exceptions, to allow procedures to be used recursively, to include pointers as a data type, and to allow references to cross sections of arrays
• Evaluation
- The problem with PL/I is its complexity, due to its large size
- It also suffered from what are now considered to be poorly designed constructs
- However, it did see significant use in both business and scientific applications
TWO EARLY DYNAMIC LANGUAGES: APL AND SNOBOL
• Neither of these languages is based on any previous language, and neither has had much effect on later languages
• The only similarity between APL and SNOBOL is their dynamic typing and dynamic storage allocation
• Origins and Characteristics of APL
- Kenneth E. Iverson designed APL in 1960
- It was intended as a vehicle for describing computer architecture, not as a programming language
- It has many powerful operators that allow arrays to be manipulated as if they were scalar variables; however, this creates a problem for implementers of the language
- Although it is not widely used, it is still in use today
• Origins and Characteristics of SNOBOL
- Designed by D. J. Farber, R. E. Griswold, and I. P. Polonsky in the early 1960s
- It was designed specifically for text processing
- However, it is rarely used nowadays because it is slower than some other languages
THE BEGINNINGS OF DATA ABSTRACTION: SIMULA 67
• Design Process
- Kristen Nygaard and Ole-Johan Dahl developed SIMULA I between 1962 and 1964
- It was designed exclusively for system simulation
• Language Overview
- SIMULA 67 is an extension of ALGOL 60
- It supports coroutines through the class construct, which is important because the idea of data abstraction starts here
ORTHOGONAL DESIGN: ALGOL 68
• Design Process
- Introduces the design criterion of orthogonality, which results in features such as user-defined data types and implicit heap-dynamic arrays
• Evaluation
- Introduces a significant number of features not previously used
- Although it may be contrasted with PL/I, PL/I gained far greater acceptance than ALGOL 68
SOME IMPORTANT DESCENDANTS OF THE ALGOLS
• All imperative languages designed since 1960, including the object-oriented languages, owe some of their design to ALGOL 60
• Pascal
- Designed by Niklaus Wirth
- Widely used for teaching programming; because of this it lacks some features, such as separate compilation and the ability to pass arrays of variable length
• C
- Originally designed for systems programming by Dennis Ritchie in 1972
- Because of its adequate control structures and data-structuring facilities, C is adaptable to many application areas
- A reason for both liking and disliking C is its lack of complete type checking
- C tends to be a very flexible language
• Modula-2
- Designed by Niklaus Wirth (who also designed Pascal) in 1976
- Distinguishing features include modules, which provide support for abstract data types, procedures as types, low-level facilities for systems programming, and coroutines
• Modula-3
- Based on Modula-2 and Modula-2+
- To Modula-2 it adds classes and objects for support of object-oriented programming, exception handling, garbage collection, and support for concurrency
• Oberon
- Loosely based on Modula-2; also designed by Niklaus Wirth
- Although many features were added to Modula-2 to produce Oberon, many were also removed
- The main feature added is type extension, which supports object-oriented programming
- Among those removed are variant records, opaque types, enumeration types, subrange types, the CARDINAL type, non-integer array indexes, the with statement, and the for statement
• Delphi
- A hybrid language, similar to C++ in that it was created by adding object-oriented support, along with other things, to an existing imperative language
- It is derived from Pascal, making it much more elegant and safe than C++
PROGRAMMING BASED ON LOGIC: PROLOG
• The use of a formal logic notation to communicate computational processes to a computer
• Design Process
- Alain Colmerauer, Philippe Roussel, and Robert Kowalski developed the design of Prolog
- Prolog uses a method for specifying predicate calculus propositions and a restricted form of resolution
• Language Overview
- Prolog programs consist of collections of statements, or facts
- The most common use of Prolog is as a kind of database
• Evaluation
- There are two major reasons why logic programming has not become widespread
- First, it has proven to be highly inefficient
- Second, it has been shown to be effective in only a few application areas, such as certain kinds of database management and some areas of AI
HISTORY'S LARGEST DESIGN EFFORT: ADA
• Ada was originally developed for the Department of Defense
• Historical Background
- The Army, Navy, and Air Force all proposed the development of a high-level language for embedded systems in 1974, in an attempt to standardize their own embedded systems
• Design Process
- The committee assigned to this task was responsible for identifying requirements for a new Department of Defense high-level language, evaluating existing languages to determine whether there was a viable candidate, and recommending the adoption or implementation of a minimal set of programming languages
- The resulting language went through multiple phases and was named Ada
• Language Overview
- There are four major features of the Ada language:
- Packages provide the means for encapsulation of data objects
- It includes extensive facilities for exception handling
- It allows program units to be generic
- It provides for concurrent execution of special program units called tasks
• Evaluation
- The Ada language contains most of the concepts of software engineering and language design
- Developing a compiler was a difficult task; the first Ada compiler did not appear until almost four years after the design of the language was completed
OBJECT-ORIENTED PROGRAMMING: SMALLTALK
• Design Process
- The concepts that led to the development of Smalltalk came out of the Ph.D. dissertation work of Alan Kay
• Language Overview
- The program units of Smalltalk are objects
- All computing in Smalltalk is done the same way: passing messages to objects in order to invoke one of the methods associated with that object
COMBINING IMPERATIVE AND OBJECT-ORIENTED FEATURES: C++
• C++ uses C as a base on which to support many of the concepts introduced by Smalltalk
• Design Process
- Modifications to C included the addition of function parameter type checking and conversion, and classes
- Later additions included inline functions, default parameters, and overloading of assignment operators
- With these additions, the language was named C with Classes
- Not until the inclusion of virtual functions was the name changed to C++
• Language Overview
- C++ provides a collection of predefined classes, along with the possibility of user-defined classes
- Operators in C++ may be overloaded, meaning that the user may define new meanings for existing operators when they are applied to user-defined types
- Dynamic binding is provided by virtual class functions
- Both functions and classes can be templated, which means that they can be parameterized
- It also includes exception handling
• Evaluation
- C++ has become extremely popular
- On the downside, C++ is extremely large and suffers drawbacks similar to those of PL/I
• A Related Language: Eiffel
- Like C++, Eiffel combines imperative and object-oriented features
PROGRAMMING THE WORLD WIDE WEB: JAVA
• Based on C++; however, numerous constructs were removed, some changed, and a few others added
• The resulting language provides the power and flexibility of C++ but is smaller, simpler, and safer
• Design Process
- Java was designed for an application for which there appeared to be no satisfactory existing language
- It started out as a programming language for embedded consumer electronics
- Later it proved to be a useful tool for programming for the Web
• Language Overview
- Java employs both types and classes
- It does not have pointers, although its reference types provide some of the capabilities of pointers
- It uses a primitive Boolean type, mainly for the control expressions of its control statements
- There are no record, union, or enumeration types
- It supports only single inheritance
- It provides garbage collection
• Evaluation
- The designers of Java eliminated many of the features considered unsafe in C++
- Java is still considered a complex language, and its lack of multiple inheritance is said to lead to some peculiar designs
- The use of Java has increased much faster than that of any other language
The summary table below lists, for each language: year of development, major contribution to development, motivation for development, impact on readability, writability, and reliability, and source.

FORTRAN (1954)
- Major contribution: One of the features of FORTRAN I, and of all its successors except FORTRAN 90, that allows highly optimized compilers is that the types and storage for all variables are fixed before run time.
- Motivation: The environment in which FORTRAN was developed: (1) computers were still small, slow, and relatively unreliable; (2) the primary use of computers was for scientific computations; (3) there were no existing efficient ways to program computers; (4) because of the high cost of computers compared to the cost of programmers, speed of the generated object code was the primary goal of the first FORTRAN compilers.
- Readability: FORTRAN increased the readability of the major programming languages from the late 1940s to the mid-1950s, since it evolved toward simplicity and orthogonality, gaining new control statements and data structures (extended names up to six characters, independent compilation, type names that could be parameters, character string handling, declarations) and allowing subprograms.
- Writability: Increased as well, for the same reasons as readability; however, its syntax and expressivity are not very good.
- Reliability: High, since it gained parameter type checking (FORTRAN 90) and aliasing (FORTRAN 0 and I had user-defined programs), along with comparatively high readability and writability.
- Source: Sebesta, Robert W., Concepts of Programming Languages (5th ed., 2003)

Algol 58 (1958)
- Major contribution: Formalized the concept of data type; compound statements.
- Motivation: Good for scientific programming, with syntax close to mathematical notation.
- Readability: Programs hard to read; code difficult to follow.
- Writability: Parameters hard to write; structure of programs hard to write.
- Reliability: Poor; since reading and writing are hard, programs are hard to modify.
- Source: Sebesta, Robert W., Concepts of Programming Languages (5th ed., 2003)

LISP (1958)
- Major contribution: LISP was the first functional programming language. It greatly influenced artificial intelligence (AI) from its inception.
- Motivation: LISP was designed in 1958 by John McCarthy and Marvin Minsky while they were working on the MIT AI Project. One of the first goals of the project was to produce a list-processing system. The FORTRAN List Processing Language already existed; however, it did not support recursion, conditional expressions, dynamic storage allocation, or implicit deallocation. Dynamically allocated linked lists and implicit deallocation were requirements for symbolic differentiation. McCarthy's and Minsky's list-processing system was to be used to implement a program named the Advice Taker, and that application provided the incentive to develop the new list-processing language LISP.
- Readability: Since LISP is a functional language, assignment statements and variables are not necessary; functions are applied to arguments, and recursive function calls are used rather than loops. The simplicity of the language makes it less readable and harder to understand for someone not familiar with it, though its simplicity and orthogonality are impressive to some language researchers. The profuse use of parentheses can also decrease readability.
- Writability: The simplicity and orthogonality of LISP have a huge impact on writability, making it easier to write programs. The lack of assignments and variables also increases the writability of the language.
- Reliability: LISP is a fairly reliable functional programming language and is the most widely used language for artificial intelligence. Most of the past inefficiencies of LISP code have been eliminated, since code is now compiled rather than run on an interpreter.
- Source: Sebesta, Robert W., Concepts of Programming Languages (6th ed., 2004); http://users.erols.com/ziring/dopl.html
COBOL (1959)
- Major contribution: Business application programming language; hierarchical data structures; English-like, nonmathematical syntax.
- Motivation: COBOL was developed in 1959 to fulfill the needs of a growing base of business computer users. It was not built in an attempt to create a universal programming language, but rather a practical language for business applications. COBOL succeeded in that goal, to the point that there currently exist more lines of COBOL code than of any other programming language.
- Readability: COBOL was the first language to use English-style commands rather than a more mathematical syntax. This decision fell in line with the goal of making COBOL usable by a broader base of programmers, making it more readable and thus accessible to business professionals with less mathematical and computer science background. COBOL defines every variable in a special data division of the code, implementing a data dictionary that makes reading and understanding the code easier. COBOL was also the first language to implement hierarchical data structures, a first step toward the data abstraction that makes programs more readable by managing complexity.
- Writability: The English style of COBOL commands lets the programmer think more in terms of the problem being solved and less in terms of the idiosyncrasies of the programming language and machine implementation. The abstraction provided by hierarchical data structures makes code less complex and thus easier to write. Finally, because of its business-application focus, COBOL often trades power for usability; while a programmer may have a difficult time writing scientific applications in COBOL because of the lack of powerful features, the usability gained in business applications again lets the programmer focus on the problem, not the language.
- Reliability: Reliability, as a function of how well a program is understood and can be modified correctly, is increased by COBOL's gains in readability and writability. Based on its widespread use in business-critical applications, COBOL has clearly proven itself a reliable programming tool.
- Source: Sebesta, Robert W., Concepts of Programming Languages (6th ed., 2004); http://www.legacyj.com/cobol/cobol_history.html
Algol 60 (1960)
- Major contribution: Publishing algorithms; machine independence.
- Motivation: Modifications to Algol 58.
- Readability: Difficult to understand; BNF makes it hard to read.
- Writability: Lacked input and output statements.
- Reliability: Implementation inefficient; BNF-described programs hard to modify.
- Source: Sebesta, Robert W., Concepts of Programming Languages (5th ed., 2003)

BASIC (1963)
- Major contribution: The BASIC design came from FORTRAN and ALGOL 60; ALGOL 60 had only a minor influence, but with the help of these two languages BASIC was born. In 1963 Kemeny began work on the compiler for the first version of BASIC.
- Motivation: BASIC was designed in the spring of 1963 especially for liberal arts students; John Kemeny and Thomas Kurtz felt a language was needed for students without a computer science background.
- Readability: Since BASIC was based on FORTRAN and ALGOL 60, its readability already had solid ground.
- Writability: BASIC has been criticized for the poor structure of programs written in it, among other things. "By Sebesta's evaluation criteria, the language does indeed fare very poorly" (Sebesta, p. 67).
- Reliability: BASIC earned its reputation for reliability from its success and the ease with which it can be learned and implemented on any computer.
- Source: Sebesta, Robert W., Concepts of Programming Languages (6th ed., 2004)

PL/I (1965)
- Major contribution: PL/I represents the first large-scale attempt to design a language that could be used for a broad spectrum of application areas. First concurrency; first exception handling; switch-selectable recursion; first pointers; first array cross sections. In the 1970s it enjoyed significant use in both business and scientific applications.
- Motivation: By the early 1960s, the users of computers in industry had settled into two separate and quite different camps: scientific programmers used the floating-point data type and arrays extensively, while business applications needed the decimal and character string data types. In early 1963, IBM planners perceived the beginnings of a change in this situation. These perceptions quite naturally led to the concept of designing a single universal computer capable of doing both floating-point and decimal arithmetic, and therefore both scientific and business applications. Along with this came the idea of a programming language that could be used for both.
- Readability: Since many of the new features of PL/I were poorly designed, and the language was too large with too much redundancy, its readability is poor.
- Writability: Even though PL/I is not simple and orthogonal enough, it was (and still is) actually used for both scientific and business applications, so its writability is good.
- Reliability: Although PL/I was the first language with exception handling and has acceptable writability, its readability is poor and it was never as widely used as FORTRAN or COBOL, so its reliability is not good.
- Source: Sebesta, Robert W., Concepts of Programming Languages (5th ed., 2003)

SIMULA 67 (1967)
- Major contribution: SIMULA introduced important object-oriented programming concepts such as classes and objects, inheritance, and dynamic binding.
- Motivation: It was originally designed and implemented as a language for discrete event simulation.
- Readability: The basic concept of speaking in general and in more specific terms had been adopted as a way of expressing reality in the context of a programming language.
- Writability: SIMULA began with the idea of classes and objects to encapsulate procedures. This vastly improved writability, since all the procedures were in one location, removing superfluous information.
- Reliability: Classes increase reliability through object-oriented concepts, which were later developed further in Java.
- Source: Sebesta, Robert W., Concepts of Programming Languages (6th ed., 2004); http://java.sun.com/people/jag/SimulaHistory.html
Pascal (1971)
- Major contribution: The major contributions of Pascal were the value-result method of passing parameters and the case statement for multiple selections. Its largest contribution, though, was to the teaching of programming. In the early 1970s FORTRAN, PL/I, and ALGOL-W were the languages commonly taught to computer science students; by the mid-1970s Pascal had become the most widely used teaching language, and it remained the standard for almost two decades, until the early 1990s. Pascal has affected almost every block-structured language designed since its creation, and still does today.
- Motivation: Niklaus Wirth was a member of the International Federation for Information Processing (IFIP) Working Group 2.1, which was created to continue the development of ALGOL in the mid-1960s. In August 1965, Wirth and Tony Hoare contributed to that effort by presenting a proposal for additions and modifications to ALGOL 60. Their proposal was rejected because it was considered too small an advance over ALGOL 60. Later, ALGOL 68 was released, but Wirth and Hoare believed it should not have been, because of its complexity. So they set out to develop a better version of ALGOL, called ALGOL-W, and from ALGOL-W the programming language known as Pascal was born. Pascal was created to be a simple language that would be widely used and easily teachable.
- Readability: Pascal's impact on readability was huge because of its simplicity and expressivity. Programs were easy to read through, and it became the readability standard for teaching young programmers how to write simple programs.
- Writability: Pascal's impact on writability was also large: it was easy for students to write small programs and learn the language quickly in their classes. However, Pascal lacked support for many complex programming needs, such as writing subprograms that take as a parameter an array of variable length, and it had no separate compilation capability.
- Reliability: Although Pascal lacked the ability to do many things, it was still considered more stable than FORTRAN and C. Because it was such a small language compared to its predecessors, it provided a safety net that kept young programmers from making errors they could easily have made in other languages.
- Source: Sebesta, Robert W., Concepts of Programming Languages (6th ed., 2004); http://cgibin.erols.com/ziring/cgi-bin/cep/cep.pl?_key=Pascal
SNOBOL (1971)
- Major contribution: SNOBOL provided a language that could be used as if it were an object-oriented language, a logic programming language, a functional language, or a standard imperative language. This was made possible by changing the set of features used to write a program.
- Motivation: SNOBOL was the fourth and final incarnation of a series of special-purpose programming languages intended to make character string manipulation easier.
- Source: Sebesta, Robert W., Concepts of Programming Languages (6th ed., 2004); http://www.wikipedia.org/wiki/SNOBOL
C (1978)
- Major contribution: Ken Thompson wanted to create a comfortable computing environment constructed according to his own design, using whatever means necessary. By early 1973 the language and compiler were strong enough to rewrite the Unix kernel for the PDP-11.
- Motivation: C was developed because in the late 1960s three organizations, Bell Telephone Laboratories, MIT, and GE, were jointly developing Multics; Multics could not be completed in time and was becoming very expensive.
- Readability: The language shows considerable power to describe important concepts. It is easy to read, and different types of libraries are set up within the language to help the user.
- Writability: The relationship between arrays and pointers, and the way in which declaration syntax mimics expression syntax, are also among its most frequently criticized features, often serving as stumbling blocks for the beginner.
- Reliability: "C is quirky, flawed, and an enormous success. While accidents of history surely helped, it evidently satisfied a need for a system implementation language efficient enough to displace assembly language, yet sufficiently abstract and fluent to describe algorithms and interactions in a wide variety of environments."
- Source: Dennis Ritchie, http://cm.bell-labs.com/cm/cs/who/dmr/chist.html, 1993

PROLOG (1979)
- Major contribution: Prolog is a dynamic, high-level language with untyped variables, encouraging a natural level of abstraction and discouraging hard-wiring early assumptions into your code. Its major contribution has been that it allows Prolog programmers to be closer to the problem to be solved, more adaptable to changes in that problem, and easier to maintain and support.
- Motivation: Alain Colmerauer and Philippe Roussel at the University of Aix-Marseille, with the help of Robert Kowalski at the University of Edinburgh, developed the fundamental design of Prolog. Colmerauer and Roussel were interested in natural language processing, and Kowalski in automated theorem proving. The collaboration between them and the two universities continued until the mid-1970s. In 1972, their efforts led to the use of formal logic as the basis for a programming language: Kowalski's research provided the theoretical framework, while Colmerauer's resulted in the programming language. After that, both universities received little attention and conducted independent studies on Prolog until 1981, when the Japanese government launched a large research project called Fifth Generation Computing Systems (FGCS). Its primary objective was to develop intelligent machines, and Prolog was chosen as the basis for the study. This caused other researchers and governments to take notice, starting a decade of worldwide interest in artificial intelligence and logic programming.
- Readability: Readability is difficult because of many unresolved problems with the language. The main problem is that it is hard to follow the order of the program, and much of the programming proceeds without a set basis of rules; this makes it very difficult for another programmer to understand what he or she is reading without detailed accompanying notes. However, Prolog does offer unification, which allows variables to be instantiated to values to produce matches. Its impact on readability is that it led to many user-defined programming languages.
- Writability: Prolog has good writability and is considered the standard in logic and AI programming. It is a fairly simple language to program in because it is based on facts, rules, and goals, with structures composed mainly of atomic propositions and logic operators. Its impact on writability has been the advantages and disadvantages that unification introduced to programming, causing many future languages to copy its platform.
- Reliability: The reliability of Prolog is simply not good. It is inefficient in many ways, including infinite loops: the programmer must sometimes state control-flow information in programs, and it is very easy to write statements that cause infinite loops, resulting in total program failure. There are also the problems of the closed-world assumption and negation. In terms of reliability, Prolog has basically been a model of what not to do in creating a reliable programming language.
- Source: Sebesta, Robert W., Concepts of Programming Languages (6th ed., 2004); Roth, Allison, "You Can't Ignore Prolog," http://web.archive.org/web/20010708010543/www.practical-applications.co.uk/Prolog/alli.html
Smalltalk (1980)
- Major contribution: First widely used implementation of object-oriented programming; implemented the first graphical windowing human-interface system, both for program user interfaces and for integrated development tools.
- Motivation: Smalltalk was originally developed by Dr. Alan Kay to extend the ideas of object-oriented programming, first developed in limited form in Simula 67. Everything in Smalltalk is an object, and information is passed between objects by way of messages. The object-oriented framework of Smalltalk was developed concurrently with a plan to create a dynamic windowing human-interface system that would bring computing to a less technical class of user. This windowing system is also integral to Smalltalk's first implementation of an integrated software development environment; Smalltalk is thus not only a programming language but also a software development tool that created the model for all modern graphical software development environments.
- Readability: Reading Smalltalk code is more like reading a text representation of a semantic chart than reading computer code, due to the complete implementation of object-oriented concepts. Smalltalk is therefore very readable to a programmer with knowledge of object-oriented programming, but will make little sense otherwise. Its simple, expressive syntax uses human nouns and verbs, leading to easy readability based on knowledge of English semantics.
- Writability: In terms of writability, it must be remembered that Smalltalk is both a programming language and an integrated graphical software development environment. As the first language to implement such an environment, Smalltalk made large gains in writability across the overall programming process; such an environment is key to dealing with the many discrete segments of code that interact as objects. Individual segments of Smalltalk code have good writability as long as the programmer subscribes to the concepts of object orientation, and the use of English nouns and verbs enhances writability by drawing on existing semantic knowledge.
- Reliability: The powerful abstraction tool of object-oriented design itself helps to manage complexity, making Smalltalk programs easy to understand, write, and modify. As an interpreted system originally designed to have some of the simulation capabilities of its predecessor Simula, Smalltalk is an excellent tool for prototyping and small systems but may not be robust enough to be reliable in very large projects.
- Source: Sebesta, Robert W., Concepts of Programming Languages (6th ed., 2004); http://www.smalltalk.org/
Ada (1983)
- Major contribution: Ada was designed specifically for the use of the Department of Defense (DoD). It included many object-oriented programming features and was designed to promote reliability.
- Motivation: The requirements phase of Ada began in 1975, with the official version of the language finalized in 1983. Ada included a large number of features: it was for a time the only widely used language that included exception handling, and it was one of the first languages to offer full support for abstract data types. Ada was developed for use by the Department of Defense as a high-level language for embedded systems, and was based on the programming language Pascal.
- Readability: Ada's impact on readability was due in part to its use of data types, control statements, and structures; additionally, the use of special words greatly increases the readability of the language.
- Writability: The Ada language is so large and complex that this decreases its writability. However, many factors benefit writability as well, including abstraction, expressivity, and generic procedures: a compiler can generate a version of a generic procedure for a specific data type, reducing the amount of code to be written.
- Reliability: Exception handling increased the reliability of Ada; the benefits of exception handling in the languages PL/I and CLU were also included in its design. In 1995, Ada began to support object-oriented programming. The complexity of the language may reduce reliability, but its readability and writability have a positive impact on it.
- Source: Sebesta, Robert W., Concepts of Programming Languages (6th ed., 2004); http://users.erols.com/ziring/dopl.html
C++ 1984 C++ added parameter type checking, derived classes, and public/private access of class components. It evolved from C through modifications intended to improve C's imperative features and to add object-oriented programming. C++ has good compilers, and C programs can be compiled as C++ programs, but it is a large and complex language. Sebesta, Robert W, Concepts of Programming Languages (5th ed. 2003) JAVA 1993 Java provides a secure, cross-platform way for code to be executed. Java also added several features to existing websites, such as arbitrary graphics, arbitrary data types, less load on the server, and more user interaction. Java was designed mainly to enable people to develop programs that use the Internet and the World-Wide Web. Java prevents casting between arbitrary variables. Only casts between numeric variables and between sub- and superclasses of the same object are allowed. All conditional tests must return boolean values. Java supports both the "/* ... */" and "//" commenting conventions, but comments that begin with "/**" are special. These comments should only be used before a method or class declaration; they indicate that the comment should be included in automatically generated documentation for that declaration. Java provided a new and higher level of reliability by eliminating most of the assignment coercions that are done in C++, and by using an "index range" method of accessing arrays. Sebesta, Robert W, Concepts of Programming Languages, (6th ed., 2004)
http://www.shu.ac.uk/java/aboutthisbook/pref2.htm
http://www.ibiblio.org/javafaq/javafaq.html
JavaScript 1995 Brought full interactivity and sophisticated user interface and typography concepts to the formerly static Web. Tasked with making Navigator's newly added Java support more accessible to non-Java programmers, Brendan Eich eventually decided that a loosely-typed scripting language suited the environment and audience, namely the few thousand web designers and developers who needed to be able to tie into page elements (such as forms, or frames, or images) without a bytecode compiler or knowledge of object-oriented software design. The language he created was christened "LiveScript," to reflect its dynamic nature, but was quickly (before the end of the Navigator 2.0 beta cycle) renamed JavaScript JavaScript's capabilities can't be extended or overridden by developers. A language that apparently required no skill to use, lacked an IDE and a reliable cross-platform debugger, and could only be tested in the context of the actual browser in which the page will be viewed, combined with a few highly publicized security flaws and several books aimed at non-programmers, caused many to write off JavaScript as a "simple" language for beginners and overshadowed its amazing potential. Sebesta, Robert W, Concepts of Programming Languages, (6th ed., 2004)
http://www.oreillynet.com/pub/a/javascript/2001/04/06/js_history.html
PHP 1995 One of the biggest strengths of PHP 3.0 was its strong extensibility features. PHP began as a simple set of Perl scripts that Rasmus Lerdorf wrote for tracking accesses to his online resume. He named this set of scripts 'Personal Home Page Tools'. As more functionality was required, Rasmus wrote a much larger C implementation, which was able to communicate with databases and enabled users to develop simple dynamic Web applications. Because PHP is based on Perl, it has many of the problems associated with Perl. As in Perl, an instruction can be written in a variety of ways; this reduces the readability of the language. Because one can write an instruction in a variety of ways, the writability of the language increases. As the language's strongest attribute is its extensibility, this has forced PHP to be rather reliable. Sebesta, Robert W, Concepts of Programming Languages, (6th ed., 2004)
http://us2.php.net/history
Source: Chilton Computing, Compiler Compiler 1966 (http://www.chiltoncomputing.org.uk/acl/applications/cc/p007.htm)
Early Translator Writing Systems
Contents
1. Introduction
1.1 Early Autcodes
1.2 Fortran
1.3 Algol 58
1.4 Some Terminology
2. Early Translator Writing Systems
2.1 Syntax machine
2.2 PSYCO
2.3 BMCC
3. Second Generation Translator Writing Systems
3.1 COGENT
3.2 CGS
3.3 TMG
3.4 Meta II
4. Third Generation Translator Writing Systems
4.1 FSL
4.2 TGS
4.3 Tree-Meta
5. The Demise of Translator Writing Systems
6. References
1. Introduction
1.1 Early Autocodes
Prior to 1950, most computer programs were written in assembly code; that is a low-level programming language where a one-to-one relationship exists between instructions in the language and the computer's machine code instructions. By 1950, people were considering whether it would be possible for the machine to generate the assembly code rather than humans.
In 1950, William Schmitt implemented Short Code for the Univac 1. In 1952, Alick Glennie produced an autocode programming language for the Manchester Mark 1 computer (the compiler was 750 instructions). Laning and Zierler produced an autocode for the Whirlwind at about the same time. (Knuth lists 20 autocodes that predate Fortran and states that Glennie's autocode was the first one completed.)
1.2 Fortran
Work on a more ambitious autocode for the IBM 704 started around 1952. The proposal for Fortran I went to IBM Management in 1953, was specified by 1954 and compiler produced by 1957. The major deficiency of Fortran I was the absence of user-written subroutines and functions. These were added in Fortran II in 1958.
The need for a system to automatically write compilers had not really appeared so far as many computers still ran programs in assembler language or a single autocode. Implementing a single language on many different computers or implementing many languages on a single computer were still not major requirements.
1.3 Algol 58
By 1958, there was concern that programming languages would proliferate. There were already signs of this. In both the USA and Europe there was support towards defining a single programming language. A meeting was held in 1958 in Zurich which defined the language called IAL (International Algebraic Language) in the USA and Algol58 in Europe. This was followed quite closely by Algol60.
In the USA, languages such as JOVIAL, MAD and NELIAC were defined as subsets of IAL. In Europe, there was enthusiasm for Algol 60, which had a number of unfortunate features:
No reserved words. How an implementation defined goto was up to it. Thus, allez, go to, goto and many other variants appeared.
No input/output statements. This ensured that every single Algol 60 implementation was different. In the UK, there were three quite different input/output systems on the Elliott, English Electric and Ferranti computers.
Features that were either difficult or impossible to implement on a range of computers, due to size or speed.
The main result of the attempt at a single universal programming language was a proliferation of languages and dialects that meant it was impossible to implement the compilers needed by a specific customer without some automation of the production of compilers.
There was a need for a lexical analysis subsystem to resolve the problems associated with the variety of dialects and languages. It was also needed to cover up the deficiencies in the way some languages had been defined, notably Fortran.
For Fortran, syntax analysis is relatively straightforward once the lexical problems are resolved. Algol is significantly different. In the Algol 60 Report the language is defined in BNF, a method for defining legal programs in the language. It also had statements in English that also described the language. The block structure facilities in Algol 60 made it more difficult to do the syntax analysis.
The semantic description of Algol 60 is also quite difficult. Features were defined that were difficult to compile. Ambiguities existed. Meanwhile the architectures of computers were changing, with different storage systems, radically different instruction codes, and speed dependent on pipelining specific operations. In consequence, generating efficient code across a range of computers was also becoming difficult.
Some automation of the production of compilers was needed.
1.4 Some Terminology
In describing translator writing systems, it is useful to define a set of terms that can be used irrespective of the system being described.
Backus-Naur Form (BNF)
A set of rewrite rules that define a programming language. A rule has the form:
<address> ::= <name> , <street> , <zip>
The non-terminal address can be replaced by a non-terminal name followed by a comma followed by a non-terminal street followed by a comma followed by a non-terminal zip. Rewrite rules are applied starting from a single non-terminal, program say, until a string is obtained that contains no non-terminal symbols. This is a program in the language.
BNF language
A language defined using BNF
parse tree
Given a program in the BNF language, it is possible to describe all the productions applied to create the program as a tree starting from the non-terminal program. This is called a parse tree.
syntax-directed compilation
Any system that can take the definition of a BNF language and generates a parse tree for an arbitrary program in that language.
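The rewriting process in these definitions can be sketched operationally. In the following Python toy (the grammar and names are invented for illustration), rules are applied repeatedly until only terminal symbols remain:

```python
# A toy BNF grammar: each non-terminal maps to one rewrite rule.
# (A real grammar would offer alternatives; one rule per non-terminal
# keeps the derivation deterministic for this illustration.)
GRAMMAR = {
    "<address>": ["<name>", ",", "<street>", ",", "<zip>"],
    "<name>": ["J. Smith"],
    "<street>": ["1 High St"],
    "<zip>": ["OX11 0QX"],
}

def derive(symbols):
    """Repeatedly apply rewrite rules until no non-terminals remain."""
    while any(s in GRAMMAR for s in symbols):
        out = []
        for s in symbols:
            out.extend(GRAMMAR.get(s, [s]))  # replace a non-terminal, keep a terminal
        symbols = out
    return " ".join(symbols)
```

Starting from the single non-terminal `<address>`, `derive(["<address>"])` yields a string containing no non-terminal symbols, which is exactly the "program in the language" described above.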
Genealogy of translator writing systems (from the accompanying diagram):
Manchester Mark 1 Autocodes: Alick Glennie, Tony Brooker, 1954
Syntax Machine: A E Glennie, 1960
PSYCO (Princeton Syntax Compiler): E T Irons, 1960
Brooker-Morris Compiler-Compiler, 1960
SIG/PLAN Working Group 1 on Syntax Driven Compilers, 1962
Howard Metcalf, Fall 1962
Meta I: Val Schorre, Jan 1963
Lee Schmidt, March 1963
Meta II: Val Schorre et al, Spring 1963
Meta III: Schneider and Johnson, 1964
SRI Meta III: J F Rulifson, 1964
LISP-META: Book, 1965
Meta V: Oppenheim and Haggerty, 1966
Meta PI: O'Neil, 1968
Tree-Meta: J F Rulifson, 1968
FSL: J A Feldman, 1962/3
VITAL: L F Mondshein, 1967
Cabal: CMU, 1967
PQCC: W Wulf, CMU, 1980
TGS: Computer Associates, 1964
CGS: Computer Associates, 1962
COGENT: J C Reynolds, 1962
TMG: J C McClure, 1963
YACC: S C Johnson, 1970
2. Early Translator Writing Systems
In 1960, several systems appeared aimed at easing the task of producing a compiler. They went under a variety of names, such as translator writing system, compiler-compiler and meta-compiler. As such a system is capable of generating a compiler, it may be possible for it to generate itself. Although not a requirement, it adds a certain elegance to the system and makes enhancement of the system relatively straightforward. A base system can be defined and this can be used to define a more powerful system and so on. Three of the earliest systems were:
Alick Glennie's Syntax Machine, 1960
Ned Irons PSYCO compiler, 1960
Brooker and Morris's Compiler-Compiler, 1960
The three activities were reasonably independent and differed in their approach.
The following functions are required explicitly or implicitly in most compilers:
Lexical Analysis: to recognise the basic symbols in the language
Syntax Analysis: a method for checking the correctness of a program in the language and providing a parse that illustrates the structure.
Semantic Analysis: production of a pseudo code that defines the meaning of the program. Optimisations appropriate to the target machine and code generation for that target machine may also be required.
2.1 Syntax Machine
The Syntax Machine defines a top-down parser for the language to be compiled having possible rules of the form:
<A> ::= <B> <C> <D> ... <X> <A> ::= <B> | <C> | <D> |... | <X> <A> ::= <B> <C> <D> ... {<X>}
The first parses an A as a B followed by a C ..etc. The second parses an A as a B or a C or a D ... The third parses an A as a B followed by a C .. by one or more X. Having this iteration, saves on recursion.
It is up to the person writing the translator for a specific language to ensure that no backup is required. However, the rules described above are converted into a pseudo flow chart that allows variations on this basic parsing. The Syntax Machine defines a machine-independent instruction set which defines the semantics of the program. It is possible to optimise the pseudo code before transforming it into the actual order code of a specific computer. The whole system was simulated on an IBM 650 and used to generate compiled code for both the IBM 704 and IBM 709. Many of the ideas that appear in the Syntax Machine were used in the S1, S2 and S3 Fortran compilers developed at AWRE Aldermaston for the IBM 7030 (Stretch) and later ported to the Ferranti Atlas II. However, by then, an operator precedence parser was used for syntax analysis.
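The three rule forms have a direct operational reading. The Python sketch below (the rule encoding and the grammar are invented for illustration, not Glennie's notation) interprets sequence, alternation, and the {X} iteration, with no backup once an alternative has started consuming input:

```python
# Each rule body is a list of alternatives; each alternative is a
# sequence of items.  An item is a terminal character, a rule name,
# or ("many", X) for the {X} iteration (one or more X, per the third
# rule form above).
RULES = {
    "digit": [[c] for c in "0123456789"],
    "number": [[("many", "digit")]],
}

def parse(rule, text, pos=0):
    """Try each alternative in turn; return the new position or None."""
    for alt in RULES[rule]:
        p = match_seq(alt, text, pos)
        if p is not None:
            return p
    return None

def match_seq(seq, text, pos):
    for item in seq:
        if isinstance(item, tuple):          # ("many", X): one or more X
            p = match_item(item[1], text, pos)
            if p is None:
                return None
            while p is not None:
                pos = p
                p = match_item(item[1], text, pos)
        else:
            pos = match_item(item, text, pos)
            if pos is None:
                return None
    return pos

def match_item(item, text, pos):
    if item in RULES:                        # a non-terminal
        return parse(item, text, pos)
    if pos < len(text) and text[pos] == item:  # a terminal character
        return pos + 1
    return None
```

As the text notes, the iteration saves on recursion: "number" needs no recursive rule for a run of digits.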
2.2 PSYCO
Irons started his work on a syntax-directed compiler in 1959 while at Hanover for the summer.
PSYCO uses a bounded-context bottom-up parser with backup. That is, at any stage in the recognition of a program, a finite set of previously recognised non-terminals and the next finite set of terminal symbols will always define the action that the parser needs to take. PSYCO was mainly aimed at writing translators for Algol-like languages. The system started with a symbol table containing basic information, and a parse of the Algol-like program collected declarative information about the variables used in the program. A second pass used this information to compile the executable part of the program.
Irons also recognised the need for handling errors in programs presented to the compiler.
2.3 Compiler-Compiler
The Brooker-Morris compiler-compiler (BMCC) has a different notation from either Glennie or Irons. The parser is a top-down recursive-descent parser with statements of the form:
FORMAT [SS] = [V] = [SAE]
It indicates that a source statement (SS) in the language consists of a variable (V) followed by '=' followed by a signed arithmetic expression (SAE).
Associated with each FORMAT statement is a ROUTINE that defines the semantics of the statement:
ROUTINE [SS] = [V] = [SAE] LET [SAE] = [±?][T][±T*?] Ac = [±?][T] ....
The appropriate ROUTINE statement is passed the parse tree of the statement recognised. It can inspect the parse tree and decide what code should be generated.
The Compiler-Compiler effectively adds the definition of the language being defined to that of the Compiler-Compiler itself. Before releasing the final compiler, it is usual to remove the Compiler-Compiler's own statements from that of the language being compiled.
The Compiler-Compiler was used to produce Fortran II, Fortran IV, Algol, Extended Mercury Autocode, Atlas Autocode, SOL, ACL, CPL, and Elliott Autocode Mk III compilers for the Ferranti Atlas computer.
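The FORMAT/ROUTINE pairing can be sketched in miniature. In this Python toy (an invented, much-simplified encoding: real BMCC handed the ROUTINE a full parse tree, not flat strings), a format with [V] and [SAE] holes is matched against a source statement and the associated routine receives the recognised pieces:

```python
import re

FORMATS = []  # registered (compiled pattern, routine) pairs

def fmt(pattern):
    """Register a FORMAT and its ROUTINE: [V] matches an identifier,
    [SAE] matches the rest of the statement."""
    regex = re.escape(pattern)
    regex = regex.replace(re.escape("[V]"), r"(?P<V>[A-Za-z]\w*)")
    regex = regex.replace(re.escape("[SAE]"), r"(?P<SAE>.+)")
    def register(routine):
        FORMATS.append((re.compile("^" + regex + "$"), routine))
        return routine
    return register

@fmt("[V]=[SAE]")
def assignment(V, SAE):
    # The "ROUTINE" decides what code to generate for the statement.
    return ["LOAD " + SAE, "STORE " + V]

def compile_stmt(stmt):
    stmt = stmt.replace(" ", "")
    for pattern, routine in FORMATS:
        m = pattern.match(stmt)
        if m:
            return routine(**m.groupdict())
    raise SyntaxError(stmt)
```

For example, compile_stmt("x = a+b") dispatches to the assignment routine with V bound to "x" and SAE bound to "a+b".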
3. Second Generation Translator Writing Systems
The early translator writing systems influenced a number of systems that followed soon after. We shall look at three of these and one other, TMG, that also comes later but appears to have not been influenced by the earlier systems.
3.1 COGENT
John Reynolds' COGENT system was developed at the Argonne National Laboratory around 1962. COGENT stands for COmpiler and GENeralised Translator. The objective was to unify the concept of syntax-directed compilation with the more general but primitive concept of recursive list-processing. It is similar to BMCC in that it has two main constructs, productions and generator definitions, which correspond somewhat to FORMAT and ROUTINE in BMCC.
An example of a set of productions for a simple language is:
(LETTER) = A,B,C,D,E.
(STRING) = (LETTER),(STRING)(LETTER).
(VARIABLE) = (STRING).
(FACTOR) = (VARIABLE).
(FACTOR) = (()(POLYNOMIAL)()).
(TERM) = (FACTOR).
(TERM) = (TERM)*(FACTOR).
(POLYNOMIAL) = (TERM),+(TERM),-(TERM).
(POLYNOMIAL) = (POLYNOMIAL)+(TERM).
(POLYNOMIAL) = (POLYNOMIAL)-(TERM).
As in BMCC, several productions can be written as a single production:
(TERM) = (FACTOR),(TERM)*(FACTOR)
COGENT uses the production statements to construct a parser, similar to the Irons parser, that takes the input and generates a parse tree. By giving each recognised production a unique internal number, the parse tree can be very compactly represented as a list structure. This list structure can be manipulated by the generator definitions and eventually output the compiled code. COGENT was implemented on the Control Data 3600. A novelty of the implementation was that backup spawned a set of processes that performed the various alternatives in parallel.
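The compact list representation can be sketched as follows. In this Python toy (the production numbering and the walker are invented for illustration), each recognised production is a small internal number, so a parse tree is just a nested list of numbers and leaf symbols that generator definitions can walk:

```python
# Invented internal numbering for two of the productions above.
PRODUCTIONS = {
    1: "(TERM) = (TERM)*(FACTOR)",
    2: "(FACTOR) = (VARIABLE)",
}

def leaves(tree):
    """Walk the list structure, yielding the leaf symbols in source order."""
    _tag, *children = tree   # first element is the production number
    for c in children:
        if isinstance(c, list):
            yield from leaves(c)
        else:
            yield c

# The source text A*B parsed as TERM -> TERM * FACTOR:
tree = [1, [2, "A"], "*", [2, "B"]]
```

A generator definition would dispatch on the production number at the head of each sublist; here the walker simply recovers the original token sequence.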
3.2 CGS
Computer Associates of Massachusetts was in the business of writing compilers. Compiler Generator System (CGS) was their first attempt at a general purpose translator writing system aimed at producing efficient object code with optimisation provided at several places in the system. The basic approach was a table-driven top down syntax analyser. It ran on an IBM 7090, a Burroughs D-825, and a CDC 1604.
The generated compiler had five phases:
Syntactic Analyzer: converts input string into a tree-representation of its syntax
Generator: transforms the tree into a sequence of macro-instructions
Optimizer: recognises and eliminates redundant computation (eg common subexpressions, invariant computation out of loops, etc)
Code Selector: assembles code fragments
Assembler: binds the code fragments in form required by the compiler environment.
The Syntactic Analyzer, Generator, and Code Selector are driven by tables defined for a specific language and machine code using a BNF-like definition of each. The Syntactic Analyzer is used to generate the relevant tables.
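The Optimizer phase's elimination of redundant computation can be illustrated with a toy common-subexpression pass over a flat list of macro-instructions (the three-address encoding is invented, and the sketch assumes each destination is assigned only once):

```python
def eliminate_cse(code):
    """Remove recomputation of an (op, arg1, arg2) already available.
    code is a list of (dest, op, arg1, arg2) macro-instructions."""
    seen = {}          # (op, arg1, arg2) -> dest already holding the value
    alias = {}         # eliminated dest -> surviving dest
    out = []
    for dest, op, a1, a2 in code:
        a1, a2 = alias.get(a1, a1), alias.get(a2, a2)
        key = (op, a1, a2)
        if key in seen:
            alias[dest] = seen[key]   # reuse the earlier result
        else:
            seen[key] = dest
            out.append((dest, op, a1, a2))
    return out
```

For instance, if a+b is computed twice, the second computation is dropped and later uses are rewritten to the first result.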
3.3 TMG
TMG (short for TransMoGrifier) was developed by R. M. McClure at Texas Instruments and appears not to have been influenced by the earlier systems. Versions ran on the IBM 7040, IBM 7090, CDC 1604, and PDP-7. It was used at Bell Labs and also by the Multics Project. At Bell Labs, Dennis Ritchie used TMG to produce the compiler for B that later became C. McIlroy and Morris used TMG to write the EPL compiler for Multics. As an aside, Stephen Johnson named his system YACC (Yet Another Compiler-Compiler) because they already had TMG at Bell Labs! TMG was used to implement a subset of PL/I by Cal Tech.
TMG was aimed at constructing simple one-pass translators for some specialized language with almost no optimisation of code generated. It was defined in a language TMGL with a compiler for TMGL on the target machine. Later, the system was defined in itself.
The system is somewhat similar to the Glennie Syntax Machine in that the syntax specification of a program consists of sets of statements with a label, a syntactic specification, and a semantic action that takes place if the syntactic entity is recognised. The syntactic part consists of a set of actions with possibly two exit labels which give where to go next in the case of the syntactic entity being recognised or not. If the syntactic part is recognised, the semantic part is executed.
In the first 18 months TMG was used to define two different FORTRAN compilers, a logic simulation system, several data format converters, a geometric description language translator, and the TMGL compiler itself.
The system is not really aimed at handling structured languages like Algol 60.
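The two-exit control structure described above can be sketched as a small machine. In this Python toy (an invented miniature, not TMG's actual notation), each labelled rule holds a recogniser, a semantic action, and a success and a failure exit:

```python
def run(rules, start, text):
    """Drive the rule machine: follow success/failure exits until
    ACCEPT or REJECT; on success the semantic action runs first."""
    label, pos, out = start, 0, []
    while label not in ("ACCEPT", "REJECT"):
        matcher, action, succ, fail = rules[label]
        new = matcher(text, pos)
        if new is None:
            label = fail
        else:
            out.append(action(text[pos:new]))
            pos, label = new, succ
    return out if label == "ACCEPT" and pos == len(text) else None

def lit(s):
    """Recogniser for a literal string."""
    return lambda t, p: p + len(s) if t.startswith(s, p) else None

RULES = {  # recognise "ab" or "b": 'a' is optional (its fail exit skips on)
    "A": (lit("a"), lambda m: "GOT-A", "B", "B"),
    "B": (lit("b"), lambda m: "GOT-B", "ACCEPT", "REJECT"),
}
```

Running the machine on "ab" performs both actions; on "b" the failure exit of rule A simply moves on to rule B.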
3.4 Meta II
The SIG/PLAN Working Group 1 on Syntax Directed Compilers met monthly in the conference room at the UCLA computing facility. John Backus attended one meeting to see how BNF was being used as a compiler writing language. Several separate attempts at defining a translator writing system came out of that working group and the ideas finally came together initially as a system called META but finally as META II.
META II was a simple translator writing system written by V Schorre et al in 1962.
In META II, the language for which a translator is required must be specified as a set of syntax rules. These very much resemble the BNF notation except that they contain code generation commands as part of the syntax definition. There is limited memory in META II so there is a need to output information soon after it has been recognised.
A restrictive top-down method of syntax analysis is used in the META II system. The restriction is that it does not allow back tracking or left recursion. The main goal is to match the complete program with the Syntax Rule whose name follows .META at the head of the program. The name of the Syntax Rule appears at the start of the rule followed by the = symbol. The right-hand side of this Syntax Rule indicates which entities have to be recognised to achieve this goal. The recogniser achieves its main goal by looking for these smaller entities, from left to right, as individual subgoals. These subgoals are themselves defined in terms of other entities, and so the process of recognising a complete program becomes a matter of looking for progressively smaller entities, right down to the level at which basic symbols such as characters and numbers are recognised.
A typical Meta II rule would be:
SET = 'C' $ ( 'D' / 'E' )
This recognises any string consisting of the letter C followed by any sequence involving the letters D and E.
Meta II rules can include .OUT statements that generate output code. Recognised basic symbols (.ID, .STRING, .NUMBER) are stored and can be output by .OUT(*). Also two unique labels can be output for each statement by writing *1 or *2.
Meta II is a very simple system that can be defined in itself. This is the complete definition:
.SYNTAX PROGRAM

OUT1 = '*1' .OUT('GN1') / '*2' .OUT('GN2') / '*' .OUT('CI') / .STRING .OUT('CL ' *) .,
OUTPUT = ('.OUT' '(' $OUT1 ')' / '.LABEL' .OUT('LB') OUT1) .OUT('OUT') .,
EX3 = .ID .OUT('CLL' *) / .STRING .OUT('TST' *) / '.ID' .OUT('ID') / '.NUMBER' .OUT('NUM') / '.STRING' .OUT('SR') / '(' EX1 ')' / '.EMPTY' .OUT('SET') / '$' .LABEL *1 EX3 .OUT('BT ' *1) .OUT('SET') .,
EX2 = (EX3 .OUT('BF ' *1) / OUTPUT) $(EX3 .OUT('BE') / OUTPUT) .LABEL *1 .,
EX1 = EX2 $('/' .OUT('BT ' *1) EX2) .LABEL *1 .,
ST = .ID .LABEL * '=' EX1 '.,' .OUT('R') .,
PROGRAM = '.SYNTAX' .ID .OUT('ADR' *) $ ST '.END' .OUT('END') .,

.END
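A rule like the earlier SET = 'C' $ ( 'D' / 'E' ) transcribes almost mechanically into a recursive-descent recogniser. The following Python hand translation is for illustration only; META II itself compiled such rules into its own orders (TST, BT, BF and so on):

```python
def set_rule(text):
    """Recognise the rule  SET = 'C' $ ( 'D' / 'E' )  against text."""
    pos = 0
    if not text.startswith("C", pos):      # 'C'
        return False
    pos += 1
    while True:                            # $ ( 'D' / 'E' )
        if text.startswith("D", pos) or text.startswith("E", pos):
            pos += 1
        else:
            break
    return pos == len(text)                # whole string consumed
```

Any string consisting of C followed by any sequence of D and E is accepted, as the text describes.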
4. Third Generation Translator Writing Systems
After the initial promise of early translator writing systems, there was considerable interest in producing better systems. The early systems had shown that the approach was viable. The understanding of the whole area of parsing and what was possible had improved. A theory of syntax analysis was evolving. The number of new languages arriving was still on the increase. Computer systems were evolving.
4.1 FSL
In the period 1963-67, Bob Floyd from Computer Associates had moved to Carnegie-Mellon University. Jay Earley, a doctoral student, was working on a parsing algorithm, the most efficient general context-free algorithm known. Alan Perlis et al were defining an enhanced version of Algol, Formula Algol, which would incorporate the manipulation of mathematical formulae within the language Algol. There was a need for a Formula Algol compiler. Jerome Feldman had completed his thesis, A Formal Semantics for Computer Oriented Languages, in 1964. This defined a compiler-compiler, FSL, that was capable of translating into machine language most of the existing programming languages. It was, therefore, used to produce the Formula Algol compiler for the Bendix G-21. FSL had two main subsections: a Syntax Loader that builds tables to control the recognition and parsing of programs in the required language, and a Semantic Loader that builds a table describing the semantics of the instructions in the required language. The Syntax Loader generated a Production Language (PL), based on the work of Bob Floyd, that scanned the current input stream against the PL rules and decided on the action to take. The Semantic Loader defined the meaning of each language statement in a machine-independent code.
FSL was a major input to the CABAL compiler-compiler system produced at Carnegie-Mellon University from 1967 onwards.
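The flavour of Floyd-style productions, as used in FSL's Production Language, can be sketched in miniature: each rule matches a pattern at the top of the parse stack and rewrites it, and scanning shifts symbols until some rule fires. The rule set below is invented for illustration (it reduces fully parenthesised additions) and is far simpler than real PL:

```python
RULES = [
    (["(", "E", "+", "E", ")"], "E"),   # ( E + E )  =>  E
]

def parse(tokens):
    """Shift tokens onto a stack, reducing whenever a rule's pattern
    appears on top; accept if a single E remains."""
    stack = []
    for tok in tokens:
        stack.append("E" if tok.isalnum() else tok)  # operands become E
        changed = True
        while changed:
            changed = False
            for pat, repl in RULES:
                if stack[-len(pat):] == pat:
                    del stack[-len(pat):]
                    stack.append(repl)
                    changed = True
    return stack == ["E"]
```

The stack-top pattern plays the role of the bounded context that decides the parser's action.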
4.2 TGS
TGS (sometimes called TRANGEN), the Translator Generator System, was produced at Computer Associates as a follow-on to CGS. It ran on the IBM 7094, CDC-1604, Univac M460 and GE-635 and was used to write translators for PL/I, ALGOL, and FORTRAN IV. Each compiler is driven by a set of tables, TRANTAB, described in a language called TRANDIR. Syntax analysis was done by a system based on Floyd Productions. The code generation system was also used in later versions of the META systems.
4.3 Tree-Meta
After a number of intermediate systems, Rulifson defined Tree-Meta as a major advance over the earlier META systems. The TREE-META program is in two parts. The first part is a definition of the syntax for the source language, L, for which a translator is required and this is largely inherited from Meta II. The major difference is that instead of outputting code, the output is a tree built up as the syntax proceeds. The second part defines a set of code generation rules that generate the object code to be produced from this tree. The tree itself is quite flexible in format and is defined by the TREE-META user as he defines the syntax analyser for the language. The code generator can take note of any peculiarities in the form of the tree, if that will lead to better code being produced.
An example of the syntax recogniser is:
.META EXPRESSION
EXPRESSION = TERM $ ( '+' TERM :ADD[2] / '-' TERM :SUB[2] ) ;
TERM = FACTOR $ ( '*' FACTOR :MULT[2] / '/' FACTOR :DIV[2] ) ;
FACTOR = '+' PRIMARY / '-' PRIMARY :MIN[1] / PRIMARY ;
PRIMARY = .ID / .NUM / '(' EXPRESSION ')' ;
.END
The construct ADD[2] generates an ADD tree node from the top two items on the stack. The code generator instruction would be of the form:
ADD[-,-] => *1 ' PLUS ' *2 ;
Separating code generation from the building of the parse tree means that quite efficient code can be generated.
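Tree-Meta's split can be sketched in two small pieces: an ADD[2]-style operation that pops stack items into a tree node, and a separate set of unparse rules that turn the finished tree into output text. The Python encoding below is invented for illustration; the PLUS output is modelled on the ADD[-,-] => *1 ' PLUS ' *2 rule above:

```python
def node(name, stack, n):
    """ADD[2]-style tree building: pop n items, push a (name, children) node."""
    children = stack[-n:]
    del stack[-n:]
    stack.append((name, children))

# Unparse rules, one per node type, kept separate from parsing.
UNPARSE = {
    "ADD": lambda a, b: gen(a) + " PLUS " + gen(b),
    "SUB": lambda a, b: gen(a) + " MINUS " + gen(b),
}

def gen(tree):
    if isinstance(tree, tuple):
        name, children = tree
        return UNPARSE[name](*children)
    return tree               # a leaf: an identifier or number

# Parsing a+b-c would leave this tree on the stack:
stack = ["a", "b"]
node("ADD", stack, 2)
stack.append("c")
node("SUB", stack, 2)
```

Because the unparse rules see whole nodes rather than a token stream, they can inspect the shape of the tree before choosing what to emit, which is exactly what lets Tree-Meta generate better code than output-as-you-parse systems.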
5. The Demise of Translator Writing Systems
Interest in Translator Writing Systems continued after the mid-1960s but the interest in producing a universal system capable of production-quality compilers decreased. This occurred for a number of reasons:
The number of new languages decreased and the requirements for new languages that did appear tended to be significantly different from the previous generation.
Phases of the compilation process were specified in independent modules like LEX and YACC so that a compiler writer could select a relevant set of modules to put together to create a specific compiler.
The speed of computers increased so that the need for the fastest compilation possible and the most efficient object code decreased.
6. References
Source: DBpedia, About: ALGOL (https://dbpedia.org/page/ALGOL)
ALGOL (/ˈælɡɒl, -ɡɔːl/; short for "Algorithmic Language") is a family of imperative computer programming languages originally developed in 1958. ALGOL heavily influenced many other languages and was the standard method for algorithm description used by the Association for Computing Machinery (ACM) in textbooks and academic sources for more than thirty years. There were three major specifications, named after the years they were first published. ALGOL 68 is substantially different from ALGOL 60 and was not well received, so in general "Algol" means ALGOL 60 and its dialects.
ALGOL is a family of imperative programming languages, developed in the mid-1950s, that influenced many other programming languages. This language was the standard method for describing algorithms in textbooks and academic sources for more than thirty years. Most modern programming languages have drawn on it, so it can be counted among the four most successful high-level languages alongside Fortran, Cobol, and Lisp. The language was designed to avoid some of the problems and errors present in Fortran. It was also the first language to implement nested function definitions with lexical scope. Moreover, it was the first programming language to give detailed attention to formal language definition and documentation. (ar)
ALGOL (short for ALGOrithmic Language) belongs to the family of imperative computer programming languages. Originally developed in the mid-1950s, it significantly influenced many others and was the standard method for describing algorithms, used by the ACM in textbooks and academic works for more than 30 years. Considering that most current languages are "Algol-like", Algol was probably the most successful of the four contemporary high-level programming languages, the others being Fortran, Lisp, and Cobol. It was designed to avoid some perceived problems in Fortran and eventually gave rise to many other programming languages, including B, Pascal, and C. ALGOL introduced blocks of code and the begin and end pairing to delimit them, and it was also the first language able to define nested functions with static scope. In addition, it was the first programming language to give attention to the formal definition of a language, introducing Backus-Naur form, a notation for describing the constructs of a language. Three major specifications: * ALGOL 58 - the originally proposed name was IAL (International Algorithmic Language). * ALGOL 60 - first implemented as X1 ALGOL 60 in mid-1960 - revised 1963. * ALGOL 68 - revised in 1973 - introduced new elements such as flexible arrays, slices, parallelism, operator identification, and various other extensions. Before Prof. Wirth developed Pascal, he based his ALGOL W (for the IBM S/360) on ALGOL 60. Algol-W was meant to be the next generation of ALGOL, but the ALGOL 68 committee decided instead on a design that was more complex and advanced rather than a clean and simplified ALGOL 60. The official names of the ALGOL versions are derived from the year in which they were first published. Algol 68 is substantially different from Algol 60 but was not well received, so in general "Algol" means Algol 60 and its derivatives. Parts of Algol-like syntax are sometimes still used as pseudocode. (cs)
ALGOL is the acronym for ALGOrithmic Language. This high-level imperative language was originally developed in the mid-1950s, with ALGOL 58 as its first official version. Its aim was to avoid some of the problems found in FORTRAN. The language's great novelty, implemented in most languages today, was the use of instruction blocks delimited by begin-end pairs. In addition, its syntax became the standard way of reporting algorithms for many years. Although it never became famous in commercial circles (outside the universities), it clearly influenced later, much more famous languages such as Pascal, C, and Ada. It also had revisions and successors like any other living language (ALGOL 60, ALGOL 68, and ALGOL-W). The "Hello world" program in ALGOL is the following:
BEGIN
FILE F(KIND=REMOTE);
EBCDIC ARRAY E[0:8];
REPLACE E BY "HOLA MON!";
WRITE(F, *, E);
END.
(ca)
ALGOL (an abbreviation of the English words ALGOrithmic Language) is a family of imperative programming languages developed in the mid-1950s that exerted significant influence on many other languages and became the de facto way algorithms were described in books and academic works for the following 30 years. It was designed to correct some of FORTRAN's problems and led to many languages, among them Pascal, Simula, and C. ALGOL introduced code blocks and the begin and end pairs for delimiting them, and it was the first language to implement nested function definitions with lexical scope. Fragments of Algol-like syntax are often used as pseudocode (descriptions of algorithms addressed to readers). There were three major definitions: * ALGOL 58 - the name IAL (International Algorithmic Language) was originally proposed. * ALGOL 60 - first implemented as X1 ALGOL 60 in the mid-1960s - revised in 1963. * ALGOL 68 - revised in 1973 - introduced new elements such as flexible arrays, slices, parallelism, operator identification, and various other extensions. Niklaus Wirth based his own ALGOL W on ALGOL 60 before developing Pascal. Algol-W was to have been the next generation of ALGOL, but the ALGOL 68 committee decided on a design that was more complex and advanced rather than a cleaner and simplified ALGOL 60. The official versions of ALGOL are named after the year in which they were first published. Algol 68 was quite different from Algol 60 but was not well received, with the result that "Algol" generally means Algol 60 and its dialects. (el)
ALGOL (/ˈælɡɒl, -ɡɔːl/; short for "Algorithmic Language") is a family of imperative computer programming languages originally developed in 1958. ALGOL heavily influenced many other languages and was the standard method for algorithm description used by the Association for Computing Machinery (ACM) in textbooks and academic sources for more than thirty years. In the sense that the syntax of most modern languages is "Algol-like", it was arguably more influential than three other high-level programming languages among which it was roughly contemporary: FORTRAN, Lisp, and COBOL. It was designed to avoid some of the perceived problems with FORTRAN and eventually gave rise to many other programming languages, including PL/I, Simula, BCPL, B, Pascal, and C. ALGOL introduced code blocks and the begin...end pairs for delimiting them. It was also the first language implementing nested function definitions with lexical scope. Moreover, it was the first programming language which gave detailed attention to formal language definition and through the Algol 60 Report introduced Backus–Naur form, a principal formal grammar notation for language design. There were three major specifications, named after the years they were first published: * ALGOL 58 – originally proposed to be called IAL, for International Algebraic Language. * ALGOL 60 – first implemented as X1 ALGOL 60 in 1961. Revised 1963. * ALGOL 68 – introduced new elements including flexible arrays, slices, parallelism, operator identification. Revised 1973. ALGOL 68 is substantially different from ALGOL 60 and was not well received, so in general "Algol" means ALGOL 60 and its dialects. (en)
ALGOL (an abbreviation of "Algorithmic Language"), usually written Algol, is the name of a family of programming languages used from the late 1950s into the 1980s. Despite many similarities in structure, syntax, and semantics, these languages are independent and have different histories. All Algol variants had considerable influence on the further development of programming languages; C and Pascal, for example, are descendants of Algol 60. The following versions of Algol existed: * Algol 58, a well-known offshoot being Jovial * Algol 60 * Algol 68 * Algol W (de)
Algol is the name (a portmanteau of ALGOrithmic Language) of several programming languages for "scientific computing" developed by an international group of computer scientists within IFIP. A language with this name first appeared in 1958 with three goals: 1. * to be as close as possible to traditional mathematical notation, 2. * to be easily readable, so as to become a medium for publishing algorithms, and 3. * to be translatable into machine languages. Of the successive projects, the languages Algol 60, ALGOL W, and Algol 68 gained considerable adoption. (eo)
ALGOL (or Algol) is a programming language; the word is an acronym of the English words Algorithmic Language. It was very popular at universities during the 1960s but never caught on as a language for commercial use. Nevertheless, Algol deeply influenced several later languages that did achieve wide adoption, such as Pascal, C, and Ada. Around 1965 two currents diverged on the question of a successor to Algol. As a result two languages were defined: Algol W, a minimalist language that was quickly implemented and distributed, and Algol 68, which for its time sat on the border between a language to program in and a language to do research on. (es)
ALGOL (or Algol) is a programming language; the word is an acronym of the English words Algorithmic Language. It was very popular at universities but never took hold as a language for commercial use. Even so, Algol deeply influenced several languages that later spread widely, such as Pascal, C, and Ada. In 1965 two currents diverged in the evolution of Algol, and two new languages were defined as a result: Algol W, a minimalist language quickly implemented and distributed, and Algol 68, which at the time sat on the border between languages to program in and languages to do research on. (eu)
Algol is a programming language created at the end of the 1950s. Derived from a UNESCO project first called IAL (International Algebraic Language), its name is the acronym of algorithmic language (with a nod to the star β Persei). Its goal was to describe programming problems algorithmically. The main design differences from Fortran were the use of blocks marked by BEGIN END, allowing local variables and dynamic arrays, and above all recursion, concepts that would be widely taken up by its successors. The language exists in several versions: Algol 58, Algol 60, Algol W. As for Algol 68, although it bears a similar name and was developed by an IFIP group, it is regarded not as a version of Algol but as a new language on very different foundations. (fr)
A high-level programming language developed in Europe in the 1950s for mathematical and scientific work, at the same time that Fortran was being developed in the United States for the same purpose. (ga)
ALGOL is a family of imperative programming languages. The name "ALGOL" derives from the English "algorithmic language". Developed in the mid-1950s, it influenced many languages and served for more than 30 years as the de facto standard for describing algorithms in ACM publications, textbooks, and academic papers. In the sense that many modern languages are described as "ALGOL-family" or "ALGOL-like", it can be called the most successful of its roughly contemporary high-level languages FORTRAN, LISP, and COBOL. It was designed to avoid problems that had become apparent in FORTRAN and influenced a variety of programming languages, including BCPL, B, Pascal, Simula, and C. ALGOL introduced block structure with the "enclose in begin and end" syntax and became the first widespread language in which control structures could be nested freely. It was also the first programming language to seriously consider the formal definition of syntax: the Backus–Naur form introduced in the "Algol 60 Report" became the standard notation for formally defining the syntax of later computer languages (not limited to programming languages). (ja)
Algol is a highly influential series of programming languages designed between 1958 and 1973. The name is an abbreviation of Algorithmic Language. Algol arose from the desire to create a universally usable, machine-independent programming language that would be easy for people to understand and therefore followed customary mathematical notation and the modes of expression of natural languages such as English or Dutch. That ambition failed: the universal programming language never arrived. But most programming languages developed since, including those still being developed today, use ideas introduced in one of the versions of Algol. (nl)
ALGOL (short for Algorithmic Language) is a programming language developed mainly by European researchers as a counterpart to FORTRAN, which was created in the United States in the 1950s. It was created for the research and development of algorithms and for use in numerical computation and logical operations. ALGOL is a collective name for several ALGOL-family languages; no programming language named simply ALGOL exists. (ko)
ALGOL (from English: ALGOrithmic Language) is a programming language that played an important role in the history of computing. It significantly shaped other languages, including Pascal. From its creation and for about 20 years, ALGOL (or its dialects) was the de facto standard for describing algorithms in scientific publications and textbooks. Work on Algol began in the second half of the 1950s. A formal specification of the language was published in the ALGOL 58 report, and the language was then developed in the ALGOL 60 and ALGOL 68 reports; it was these versions that gained recognition among computer scientists. While ALGOL 58 was based on FORTRAN and lacked many constructs essential to programmers, ALGOL 60 was a breakthrough in the field of programming languages. ALGOL 60 was the first language designed through international collaboration; among the participants were Peter Naur and John Backus. The language was described using BNF notation, created by Backus while designing Fortran and modified by Naur. The report on Algol 60 was published in May 1960 and revised in 1962. Based on ALGOL 60, Niklaus Wirth created Algol W, which he then used to define Pascal. Algol 60 introduced: * block statements, allowing instructions to be grouped into blocks * passing parameters to procedures by value and by name * recursion - procedures calling themselves * dynamic arrays - their size could be set while the program ran * keywords - symbols used as keywords could not be used as identifiers in a program * user-defined data types - the user could define abstract data types best suited to a particular problem. ALGOL 60 also had flaws that kept it from industrial success. It specified no input/output statements, leaving them to particular implementations, which were in turn complicated by its flexibility and hardware independence. It also lacked support from computer companies (mainly from IBM, dominant in the 1960s market, which was interested in developing Cobol and PL/I). ALGOL 68 developed the ideas of Algol 60 and finally introduced input/output. Its strict definition, however, was pushed to the limit: the language report published in January 1969 was so formal that many computer scientists considered it completely unreadable. Even so, ALGOL 68 was the first of the few languages formally defined before implementation that achieved some industrial success. (pl)
ALGOL (an abbreviation of ALGOrithmic Language) is a programming language developed in 1958. In a sense it was a reaction to Fortran; it introduced fundamental concepts such as the stack of activation records, and in some respects it can be considered the progenitor of a large family of highly successful languages, from Pascal to C. For about thirty years it was the de facto standard for representing algorithms. ALGOL uses blocks of instructions delimited by begin/end pairs (like Pascal); it was the first language to use this syntax, which is also widely used in teaching: fragments of ALGOL-like syntax are still sometimes used as a notation for algorithms called pseudocode. There are three official versions of ALGOL: ALGOL 58, ALGOL 60, and ALGOL 68. Of these, only ALGOL 60 spread widely (Niklaus Wirth based his own Algol W on ALGOL 60 before starting to develop Pascal). The official names of the ALGOL versions refer to their year of publication; ALGOL 58 was originally known as IAL (the acronym of International Algorithmic Language). (it)
ALGOL is a family of high-level programming languages aimed mainly at scientific applications. Its name comes from the words "ALGOrithmic Language". The name is written ALGOL or Algol, depending on the source. Several reasonably different languages received the name ALGOL or Algol, with letters or numbers appended, because they descend directly from the original ALGOL. (pt)
ALGOL is a programming language originally developed in 1958. The name is an abbreviation of Algorithmic Language. Algol was designed by a committee of European and American researchers, partly as a reaction to the slightly earlier Fortran. It contained a number of then-revolutionary ideas and strongly influenced most subsequent programming languages. It was the first language described by a formal syntax and introduced many of the now-standard control structures such as if-then-else and while-do. Algol nonetheless never spread as widely as the contemporary Fortran and COBOL, partly because the first two Algol versions contained no standardized definition of how input and output of data should take place; that was delegated to concrete implementations. Prominent members of the committee were the American John Backus and the Dane Peter Naur, who also gave their names to Backus–Naur form, the formalism that describes the syntax of Algol and of many later languages. The first official version of the language was Algol 58, later followed by Algol 60 and Algol 68. Niklaus Wirth also developed a version of his own that was the forerunner of Pascal. The official versions are named after the year they were published. Algol 68 differs radically in many respects from the other variants of Algol. It was described by a complex two-level grammar that, among other things, made it possible to describe much of the language's semantics in addition to its syntax. Algol 68 also contained new data types such as references and composite structures comparable to COBOL's. (sv)
ALGOL, whose name comes from the abbreviation of ALGOrithmic Language, is a family of imperative programming languages developed in the mid-1950s that had a major influence on many other programming languages. The Association for Computing Machinery used it as the standard notation for describing algorithms in textbooks and academic articles for more than thirty years. Given that most modern programming languages use ALGOL-like syntax, ALGOL ranks with its rough contemporaries FORTRAN, LISP, and COBOL among the four most influential high-level languages. ALGOL was designed to avoid some known problems in FORTRAN and ultimately led to the rise of many other programming languages, including PL/I, Simula, BCPL, B, Pascal, and C. ALGOL introduced code blocks, delimited by begin...end. It was the first language to implement nested functions using lexical scope, the first to pay attention to formal language definition, and, through the ALGOL 60 report, it introduced Backus–Naur form as a principle for designing a language's formal grammar. ALGOL has three major specifications, named after the year of first publication: * ALGOL 58 - originally proposed under the name IAL, for International Algebraic Language. * ALGOL 60 - first implemented as X1 ALGOL 60 in the mid-1960s, revised in 1963. * ALGOL 68 - introduced many new elements, such as flexible arrays, slices, parallelism, and operator identification; revised in 1973. Before developing Pascal, Niklaus Wirth built ALGOL W on the basis of ALGOL 60. ALGOL W was proposed as the next generation of ALGOL, but the ALGOL committee decided on a more advanced and complex design rather than a streamlined ALGOL 60. (zh)
Algol (from English: algorithmic language) is the name of a series of programming languages used for writing programs to solve scientific and technical problems on computers. It was developed by the IFIP high-level language committee in 1958-1960 (Algol 58, Algol 60) and radically reworked in 1964-1968 (Algol 68). It was one of the first high-level languages. It was popular in Europe, including the USSR, both as a practical programming language and as an academic language (the language for publishing algorithms in scientific works), but in the United States and Canada it could not surpass the Fortran that dominated there. It had a noticeable influence on all imperative programming languages developed later, in particular on Pascal. The name Algol (without specifying a version) usually refers to Algol 60, while Algol 68 is regarded as an independent language. (ru)
ALGOL (short for English ALGOrithmic Language) is a family of imperative programming languages whose first version was developed in the mid-1950s; it became the de facto standard language for publishing algorithms for the next 30 years. ALGOL was developed to overcome some problems inherent in Fortran and gave impetus to the development of many other languages (among them Pascal). ALGOL used statement brackets and was the first language to use the begin and end pair to delimit them. Fragments of ALGOL-like syntax are still used in some algorithm notations. The official versions of ALGOL are named after the years they were first published. There are three major official branches of the ALGOL family: 1. * ALGOL 58 - originally known as IAL (English: International Algorithmic Language); 2. * ALGOL 60 - 1963 version; 3. * ALGOL 68 - 1973 version. Niklaus Wirth took ALGOL 60 as the basis for his ALGOL W before moving on to develop Pascal. ALGOL W was meant to become the next generation of ALGOL, but the ALGOL 68 committee preferred a more complex and advanced design over a simplified and more transparent ALGOL 60. (uk)
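The abstracts above credit ALGOL 60 with being the first language to implement nested function definitions with lexical scope. A minimal sketch of that rule, in Python (which inherits it), shows what "lexical" means: an inner function resolves names in the block where it is written, not in its caller.

```python
# Sketch of lexical scoping, the rule ALGOL 60 introduced for nested
# procedures: inner() sees outer()'s x because inner is *written*
# inside outer's block, regardless of where inner is called from.
def outer():
    x = 10  # local to outer's block

    def inner():
        return x + 1  # x is resolved lexically, in outer's scope

    return inner()

print(outer())  # prints 11
```

Under dynamic scoping, by contrast, `x` would be looked up in the calling environment at run time; lexical scoping fixes the binding at the point of definition.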
rdfs:comment
ALGOL is a family of imperative programming languages developed in the mid-1950s that influenced many other programming languages. It was the standard method of algorithm description used by the ACM in textbooks and academic sources for more than thirty years. Most modern programming languages draw on it, so it can be counted among the four most successful high-level languages, alongside Fortran, COBOL, and Lisp. It was designed to avoid some of the problems and errors found in Fortran. It was also the first language to implement nested function definitions with lexical scope, and the first programming language to give detailed attention to formal language definition and documentation. (ar)
ALGOL (short for ALGOrithmic Language) belongs to the family of imperative programming languages. Originally developed in the mid-1950s, it markedly influenced many others and was the standard method of describing algorithms used by the ACM in textbooks and academic works for more than 30 years. Considering that most contemporary languages are "algol-like", Algol was probably the most successful of the four contemporaneous high-level programming languages: Fortran, Lisp, and Cobol. It was designed to avoid some perceived problems in Fortran and ultimately gave rise to many other programming languages, including B, Pascal, and C. (cs)
4768 | dbpedia | 3 | 33
url: https://www.britannica.com/technology/computer/IBM-develops-FORTRAN
language: en
title: Computer - Programming, FORTRAN, IBM
authors: William Morton Pottenger, David Hemmendinger
publish_date: 2001-05-11
site: Encyclopedia Britannica
In the early 1950s John Backus convinced his managers at IBM to let him put together a team to design a language and write a compiler for it. He had a machine in mind: the IBM 704, which had built-in floating-point math operations. That the 704 used floating-point representation made it especially useful for scientific work, and Backus believed that a scientifically oriented programming language would make the machine even more attractive. Still, he understood the resistance to anything that slowed a machine down, and he set out to produce a language and a compiler that would produce code that ran virtually as fast as hand-coded machine language—and at the same time made the program-writing process a lot easier.
By 1954 Backus and a team of programmers had designed the language, which they called FORTRAN (Formula Translation). Programs written in FORTRAN looked a lot more like mathematics than machine instructions:
DO 10 J = 1,11
I = 11 - J
Y = F(A(I + 1))
IF (400 - Y) 4,8,8
4 PRINT 5, I
5 FORMAT (I10, 10H TOO LARGE)
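For readers unfamiliar with FORTRAN's arithmetic IF (which branches to one of three statement labels depending on whether its expression is negative, zero, or positive), here is a rough structured reading of the fragment above. This is a sketch, not the original program: `f` and `a` stand in for the excerpt's undefined F and A, and label 8 is taken to mean "continue with the loop"; instead of printing, the flagged indices are collected.

```python
# Structured reading of the FORTRAN fragment.  a and f are stand-ins
# for the excerpt's undefined array A and function F.
def scan(a, f):
    flagged = []
    for j in range(1, 12):       # DO 10 J = 1,11
        i = 11 - j               # I = 11 - J
        y = f(a[i + 1])          # Y = F(A(I + 1))
        if 400 - y < 0:          # arithmetic IF: negative branch -> statement 4
            flagged.append(i)    # statement 4: print I with "TOO LARGE"
        # zero and positive branches both go to statement 8 (continue)
    return flagged
```

The loop counts J up while I counts down, a common idiom in early FORTRAN, whose DO loops always counted upward.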
The compiler was written, and the language was released with a professional-looking typeset manual (a first for programming languages) in 1957.
FORTRAN took another step toward making programming more accessible, allowing comments in the programs. The ability to insert annotations, marked to be ignored by the translator program but readable by a human, meant that a well-annotated program could be read in a certain sense by people with no programming knowledge at all. For the first time a nonprogrammer could get an idea what a program did—or at least what it was intended to do—by reading (part of) the code. It was an obvious but powerful step in opening up computers to a wider audience.
FORTRAN has continued to evolve, and it retains a large user base in academia and among scientists.
COBOL
About the time that Backus and his team invented FORTRAN, Hopper’s group at UNIVAC released Math-matic, a FORTRAN-like language for UNIVAC computers. It was slower than FORTRAN and not particularly successful. Another language developed at Hopper’s laboratory at the same time had more influence. Flow-matic used a more English-like syntax and vocabulary:
1 COMPARE PART-NUMBER (A) TO PART-NUMBER (B);
IF GREATER GO TO OPERATION 13;
IF EQUAL GO TO OPERATION 4;
OTHERWISE GO TO OPERATION 2.
Flow-matic led to the development by Hopper’s group of COBOL (Common Business-Oriented Language) in 1959. COBOL was explicitly a business programming language with a very verbose English-like style. It became central to the wide acceptance of computers by business after 1959.
ALGOL
Although both FORTRAN and COBOL were universal languages (meaning that they could, in principle, be used to solve any problem that a computer could unravel), FORTRAN was better suited for mathematicians and engineers, whereas COBOL was explicitly a business programming language.
During the late 1950s a multitude of programming languages appeared. This proliferation of incompatible specialized languages spurred an interest in the United States and Europe to create a single “second-generation” language. A transatlantic committee soon formed to determine specifications for ALGOL (Algorithmic Language), as the new language would be called. Backus, on the American side, and Heinz Rutishauser, on the European side, were among the most influential committee members.
Although ALGOL introduced some important language ideas, it was not a commercial success. Customers preferred a known specialized language, such as FORTRAN or COBOL, to an unknown general-programming language. Only Pascal, a scientific programming-language offshoot of ALGOL, survives.
Control programs
In order to make the early computers truly useful and efficient, two major innovations in software were needed. One was high-level programming languages (as described in the preceding section, FORTRAN, COBOL, and ALGOL). The other was control. Today the systemwide control functions of a computer are generally subsumed under the term operating system, or OS. An OS handles the behind-the-scenes activities of a computer, such as orchestrating the transitions from one program to another and managing access to disk storage and peripheral devices.
The need for some kind of supervisor program was quickly recognized, but the design requirements for such a program were daunting. The supervisor program would have to run in parallel with an application program somehow, monitor its actions in some way, and seize control when necessary. Moreover, the essential—and difficult—feature of even a rudimentary supervisor program was the interrupt facility. It had to be able to stop a running program when necessary but save the state of the program and all registers so that after the interruption was over the program could be restarted from where it left off.
The first computer with such a true interrupt system was the UNIVAC 1103A, which had a single interrupt triggered by one fixed condition. In 1959 the Lincoln Labs TX2 generalized the interrupt capability, making it possible to set various interrupt conditions under software control. However, it would be one company, IBM, that would create, and dominate, a market for business computers. IBM established its primacy primarily through one invention: the IBM 360 operating system.
The IBM 360
IBM had been selling business machines since early in the century and had built Howard Aiken’s computer to his architectural specifications. But the company had been slow to implement the stored-program digital computer architecture of the early 1950s. It did develop the IBM 650, a (like UNIVAC) decimal implementation of the IAS plan—and the first computer to sell more than 1,000 units.
The invention of the transistor in 1947 led IBM to reengineer its early machines from electromechanical or vacuum tube to transistor technology in the late 1950s (although the UNIVAC Model 80, delivered in 1958, was the first transistor computer). These transistorized machines are commonly referred to as second-generation computers.
Two IBM inventions, the magnetic disk and the high-speed chain printer, led to an expansion of the market and to the unprecedented sale of 12,000 computers of one model: the IBM 1401. The chain printer required a lot of magnetic core memory, and IBM engineers packaged the printer support, core memory, and disk support into the 1401, one of the first computers to use this solid-state technology.
IBM had several lines of computers developed by independent groups of engineers within the company: a scientific-technical line, a commercial data-processing line, an accounting line, a decimal machine line, and a line of supercomputers. Each line had a distinct hardware-dependent operating system, and each required separate development and maintenance of its associated application software. In the early 1960s IBM began designing a machine that would take the best of all these disparate lines, add some new technology and new ideas, and replace all the company’s computers with one single line, the 360. At an estimated development cost of $5 billion, IBM literally bet the company’s future on this new, untested architecture.
The 360 was in fact an architecture, not a single machine. Designers G.M. Amdahl, F.P. Brooks, and G.A. Blaauw explicitly separated the 360 architecture from its implementation details. The 360 architecture was intended to span a wide range of machine implementations and multiple generations of machines. The first 360 models were hybrid transistor–integrated circuit machines. Integrated circuit computers are commonly referred to as third-generation computers.
Key to the architecture was the operating system. OS/360 ran on all machines built to the 360 architecture—initially six machines spanning a wide range of performance characteristics and later many more machines. It had a shielded supervisory system (unlike the 1401, which could be interfered with by application programs), and it reserved certain operations as privileged in that they could be performed only by the supervisor program.
The first IBM 360 computers were delivered in 1965. The 360 architecture represented a continental divide in the relative importance of hardware and software. After the 360, computers were defined by their operating systems.
The market, on the other hand, was defined by IBM. In the late 1950s and into the 1960s, it was common to refer to the computer industry as “IBM and the Seven Dwarfs,” a reference to the relatively diminutive market share of its nearest rivals—Sperry Rand (UNIVAC), Control Data Corporation (CDC), Honeywell, Burroughs, General Electric (GE), RCA, and National Cash Register Co. During this time IBM had some 60–70 percent of all computer sales. The 360 did nothing to lessen the giant’s dominance. When the market did open up somewhat, it was not due to the efforts of, nor was it in favor of, the dwarfs. Yet, while “IBM and the Seven Dwarfs” (soon reduced to “IBM and the BUNCH of Five,” BUNCH being an acronym for Burroughs, UNIVAC, NCR, CDC, and Honeywell) continued to build Big Iron, a fundamental change was taking place in how computers were accessed.
The Burroughs 205 and 220 Blog: Algol, by Paul Kimpel
https://datatron.blogspot.com/2018/07/algol-58-international-algebraic.html
By the early 1950s, the number and variety of electronic computers were increasing, the scope of their application to science, business, a...
EBS Integrator
https://ebs-integrator.com/en/blog/programming-timeline-part2
Let’s take a deep dive in time and look at how computer science & programming languages began. Follow us as we go through the ages and look at the timeline, major personalities, and events that through their ingenuity paved the road to our current Technological and Digital Era.
The history of programming
Want to see the father of computers and the first-ever programmer as they struggled in the 1800s, or check out how a 17th-century philosopher helped us develop the binary code that helps YOU read these words on your screen? Well, then you absolutely should check out our earlier entry in the series.
And if you’ve just finished that one, let's do a recap of where we left off.
Fortran code is said to be 20 times shorter than its analogue in handwritten assembly code.
The community was doubtful of it at the time due to performance concerns, but the fact that programmers could write code much faster made it an easy choice from an economic standpoint.
This is how code written in Fortran looked back then:
It wasn't like entering everything into a computer and having it appear on a screen; there were no screens... or even a keyboard.
Computer Program from 1954
Like Mr. Jacquard more than a hundred years ago, they had to encode it on punched cards.
The only difference is that we're talking about an IBM mainframe, and you couldn't touch the IBM mainframe. You had people who did that.
You would write your program on that paper with a pen:
An improvement on the Fortran language
Since we're talking about Mr. Backus, let's go back to 1958, when Algol was created by a committee including European and American computer scientists.
Algol (also known as Algorithmic Language) is a high-level imperative computer programming language that introduced a wealth of features that would later influence and be used in the creation of Pascal, C#, C++ and Java, among others.
In fact, most languages nowadays have syntaxes inspired by Algol, and it is considered among the most influential programming languages ever. Algol was an attempt to improve on some intrinsic problems found in Fortran, and it's widely considered an "evolution" of sorts. While we're on the subject, John Backus, alongside Peter Naur, developed the Backus-Naur Form for describing programming languages, based on ALGOL 58-60. So you could consider Backus one of the original fathers of many languages we use today. Here's how "Hello, World!" looks in Algol-60:
BEGIN
DISPLAY ("Hello, world!")
END
This is starting to look quite reasonable.
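To make Backus-Naur Form concrete: BNF describes a language's syntax as a set of production rules, and a parser can follow those rules directly. As a hedged illustration (this tiny grammar is invented for the example, not taken from the ALGOL report), here is a minimal Python recursive-descent recognizer for it:

```python
# A tiny made-up grammar in BNF:
#   <expr>  ::= <digit> | <digit> "+" <expr>
#   <digit> ::= "0" | "1" | ... | "9"
# The recognizer below mirrors the productions one-to-one.

def matches_expr(s: str) -> bool:
    def expr(i: int) -> int:
        # Returns the index just past a parsed <expr>, or -1 on failure.
        if i >= len(s) or not s[i].isdigit():
            return -1                   # <expr> must start with a <digit>
        if i + 1 < len(s) and s[i + 1] == "+":
            return expr(i + 2)          # production: <digit> "+" <expr>
        return i + 1                    # production: just <digit>
    return expr(0) == len(s)

print(matches_expr("1+2+3"))  # True
print(matches_expr("1+"))     # False
```

Each grammar rule becomes one branch of the function, which is exactly why BNF proved such a practical notation for language designers and compiler writers alike.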
Let’s get some clojure for computer sciences
We're still in 1958, and we're about to witness the birth of another giant of "programming languages", a father-language to many revolutionary ideas of this time.
LISP (aka LISt Processor) – a list-based, high-level programming language (3GL) developed by John McCarthy at the Massachusetts Institute of Technology. The language has a long history of diverging dialects (like Racket, Clojure, Common Lisp and Scheme).
Originally intended as a formal notation for computations, influenced by lambda calculus and artificial intelligence, it surprised everyone when, after being implemented on the IBM 704, it turned out to be a fully functioning, practical tool for solving problems.
Lisp is notable for pioneering many ideas in computer science, such as tree data structures, dynamic typing, conditionals, automatic storage management, recursion, higher-order functions and the self-hosting compiler.
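Two of those ideas, recursion and higher-order functions, are so thoroughly absorbed into modern languages that we barely notice them. A small sketch in present-day Python:

```python
def compose(f, g):
    # A higher-order function: it takes two functions and returns a new one.
    return lambda x: f(g(x))

def factorial(n):
    # Recursion: the function is defined in terms of itself.
    return 1 if n <= 1 else n * factorial(n - 1)

double_then_factorial = compose(factorial, lambda x: 2 * x)
print(factorial(5))              # 120
print(double_then_factorial(3))  # factorial(6) = 720
```

Both techniques were everyday idioms in Lisp decades before they became mainstream elsewhere.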
It is also notable for giving students a migraine due to its heavy reliance on parentheses, so much so that they gave it nicknames like "Lost In Stupid Parentheses" or "Lots of Irritating Superfluous Parentheses". And here's how "Hello, World!" would look in Lisp:
(PRINT "Hello, World!")
Though in 1958 Lisp is still in its infancy and doesn't see much broad use, which brings us to…
An old acquaintance
At this point, most if not all computers are programmed either in low-level languages like machine code and assembly, or in the only high-level programming language available: Fortran. And if for some reason you want to upgrade your computer, you'll have to rewrite all that code from scratch.
So, let's make a small leap to the end of 1959. We're going to meet someone we've already discussed.
And that someone is Grace Hopper: a rear admiral of the US Navy and a computer scientist, she has already made her mark on the history of programming by creating the A-0 compiler system.
Now, however, she's attending CODASYL, the "Conference on Data Systems Languages", as a technical aide and main expert.
The committee was formed in an attempt to guide the standardization of programming languages, or to create a single language that could be used across different machines regardless of hardware.
The result was the "Common Business-Oriented Language", or COBOL for short: an "English-like" programming language heavily influenced by FLOW-MATIC, created by Ms. Hopper.
The primary goal of COBOL was to lower the barrier to entry into programming; before the creation of high-level languages, programming was a realm exclusive to computer scientists and experts.
Now, however, enthusiasts from other professions, like businesspeople, doctors, engineers and teachers, could incorporate computation into their work. To deal with the underlying hardware, each computing machine had to have its own COBOL compiler; critically, though, these compilers could all accept the same COBOL source code. The notion of "write once, apply everywhere" was born, for at this time most computers used machine code or assembly, which is very CPU-specific. Let's see how "Hello, World!" looks in COBOL:
IDENTIFICATION DIVISION.
PROGRAM-ID. hello-world.
PROCEDURE DIVISION.
DISPLAY "Hello, world!"
That, folks, is what we call a "language designed by committee".
The arrival of the minicomputer
Hey mama welcome to the 60’s!! Wo-oh-oh-oh… Oh sorry, old habits.
But as you might have guessed, the 1960s have rolled around, and someone had a bright idea:
How about we stop writing code in pencil and spending weeks encoding it into punch cards, and instead use the conveniently invented teletype machines, which news agencies and governments use to send messages to each other, to "talk" with the computer?
With that sorted, humanity paved the way to more general-purpose, high-level programming languages designed to be used by everyone, everywhere.
And with the emergence of minicomputers on the market, there was a sharp need for a programming language and compiler compact enough to meet the demand.
Thus, BASIC was born – "Beginners' All-purpose Symbolic Instruction Code".
Originally designed by John G. Kemeny and Thomas E. Kurtz in 1964, BASIC's core principles and objectives were "ease of use" and "compactness".
"Minicomputer" – PDP-11
So compact, in fact, that it is generally considered the first commercial "interpreter", able to make use of all of the 4 kB of memory available on those "mini" computers. BASIC was amazing for teaching programming principles; it was easy to understand and follow, especially for people who didn't have a strong grasp of mathematics or computers in general. Nowadays we know BASIC under the many dialects that appeared after its inception in 1964: Microsoft BASIC, Visual BASIC, Atari BASIC, Commodore BASIC, Applesoft BASIC and a dozen other variations. However, they all descend from "Dartmouth BASIC". Aside from being the originator, Dartmouth BASIC came with an interactive programming environment available to all students and teachers called the "Dartmouth Time-Sharing System" (DTSS), one of the first time-sharing operating systems!
10 PRINT "Hello, World!"
20 GO TO 10
120 END
Family tree of programming languages. Most of the following have their roots deep down in the languages we've mentioned earlier. Through the 1970s and into the 1980s we got: Pascal (1970); Smalltalk (1972); C (1972); SQL (1972); Scheme (1975); C++ (1980); Ada (1983); Objective-C (1983); Perl (1987). Then the 90s rolled around and we got our favourites: Haskell and Python (1990); Visual Basic (1991); Java, PHP and JavaScript in 1995 – and so many others.
History of programming: End of today's session
We believe we went too fast back there and overloaded the capacitors; we seem to be going back to 2021!
Well, I suppose that's it for now; something went horribly wrong and I'm afraid we're stuck in our own time again. Perhaps later we'll come back to the 70s and explore more of how our programming world evolved through the ages.
This was our take on the history of programming from the 1950s to 1960s.
If you think it's a great idea, maybe leave a like and a comment saying so? What do you think is the most fascinating part of programming history? While you ponder that, I'll go back to researching the incredible ways you can improve your business and tech company with digital transformation.
Stay classy, tech and business nerds!
The Evolution of Programming Languages
GeeksforGeeks – https://www.geeksforgeeks.org/the-evolution-of-programming-languages/ (last updated 11 Jul 2022)
Programming languages are indeed the fundamental unit of today's tech world. A programming language is a set of commands and instructions that we give to a machine to perform a particular task. For example, if you give a machine a set of instructions to add two numbers, it will do it for you and tell you the correct answer. But did you know that programming languages have a long and rich history of evolution? With that in mind, in this article we'll take a look at how programming languages have evolved over time.
In the computer world we have about 500+ programming languages, each with its own syntax and features. If you search for the father of the computer, the search engine will show you Charles Babbage – but the father of the computer didn't write the first code. It was Ada Lovelace who wrote the first-ever computer program, and the year was 1843.
1843: The Journey starts from here…!!
In the early days, Charles Babbage had designed the machine, but he was unsure how to give it instructions; it was Ada Lovelace who wrote the instructions for the Analytical Engine.
The device was designed by Charles Babbage, and the program was written by Ada Lovelace for computing Bernoulli numbers.
It was the first time in history that the capability of computing devices was judged.
1949: Assembly Language
It is a type of low-level language.
It mainly consists of symbolic instructions that map almost one-to-one onto the machine code the hardware executes.
Even today, assembly language is used in real-time systems such as flight navigation simulators and medical equipment – e.g. fly-by-wire (FBW) systems.
It is also used to create computer viruses.
1952: Autocode
Developed by Alick Glennie.
The first compiled computer programming language.
"Autocode" later became a generic term for a family of early simplified coding systems; higher-level languages such as COBOL and FORTRAN followed in its wake.
1957: FORTRAN
Developers are John Backus and IBM.
It was designed for numeric computation and scientific computing.
Software for NASA's Voyager 1 and Voyager 2 space probes was originally written in FORTRAN 5.
1958: ALGOL
ALGOL stands for ALGOrithmic Language.
It was the starting point for some of today's most popular programming languages: C, C++, and Java.
It was also the first language to implement nested functions, and it has a simpler syntax than FORTRAN.
It was the first programming language to have code blocks: "begin" indicates that a block has started and "end" that it has ended.
1959: COBOL
It stands for COmmon Business-Oriented Language.
In 1997, 80% of the world’s business ran on Cobol.
The US Internal Revenue Service scrambled to patch its COBOL-based IMF (Individual Master File) in order to issue the tens of millions of payments mandated by the Coronavirus Aid, Relief, and Economic Security (CARES) Act.
1964: BASIC
It stands for Beginner's All-purpose Symbolic Instruction Code.
In 1991 Microsoft released Visual Basic, an updated version of BASIC.
The first microcomputer version of Basic was co-written by Bill Gates, Paul Allen, and Monte Davidoff for their newly-formed company, Microsoft.
1972: C
It is a general-purpose, procedural programming language and one of the most popular programming languages to this day.
Much of the code previously written in assembly language – operating systems, kernels, and many other applications – was replaced by C.
It can be used to implement operating systems and embedded systems, and even on websites via the Common Gateway Interface (CGI).
C is the mother of almost all higher-level programming languages like C#, D, Go, Java, JavaScript, Limbo, LPC, Perl, PHP, Python, and Unix’s C shell.
Some other programming languages that are popular among programmers are listed below.
Algol – The Craft of Coding
https://craftofcoding.wordpress.com/tag/algol/
The Sieve of Eratosthenes likely first appeared in computing in 1961 as Algorithm 35 in Communications of the ACM (p. 151). It, like many other algorithms of the time, was shown in Algol-60.
To try and implement it as close as possible to the original syntax, given that there are few if any Algol-60 compilers out there, I chose to reproduce the algorithm in its entirety in Algol-68 using the Algol-68 Genie (a68g) specification. It wasn’t a lot of fun.
PROC sieve = (INT nmax, REF INT j) []INT: (
    [nmax]INT primes;
    INT i;
    primes[1] := 1;
    primes[2] := 2;
    primes[3] := 3;
    j := 3;
    FOR n FROM 3 BY 2 TO nmax DO
        i := 3;
        l1: IF primes[i] <= sqrt(n) THEN GOTO a1 ELSE GOTO a2 FI;
        a1: IF n/primes[i] = n%primes[i] THEN GOTO b1 ELSE GOTO b2 FI;
        b2: i := i + 1;
            GOTO l1;
        a2: primes[j] := n;
            j := j + 1;
        b1: print("")
    OD;
    j := j - 1;
    primes
);

main: (
    INT n, numpr;
    print("Upper limit to calculate sieve? ");
    read(n);
    [n]INT primes;
    primes := sieve(n, numpr);
    FOR i FROM 1 TO numpr DO
        print(whole(primes[i], 5));
        IF i MOD 10 = 0 THEN print(new line) FI
    OD
)
It was an interesting experience, partly because the documentation for a68g is okay but filled with the same obtuse stuff as the original reports for Algol-68. There is a lot to this language, and trying to find simple things, like how to pass back a variable from a subprogram, is tiresome. There are things I inherently like, such as the ability to specify the size of the array to be created within the function (PROC seems like a procedure, but behaves more like a function). And I like that assignment is := and equality is =. It's also nice to use a language where it is easy to return an array from a function. Here is the program executed with the value of n set to 1000.
Upper limit to calculate sieve? 1000
   +1   +2   +3   +5   +7  +11  +13  +17  +19  +23
  +29  +31  +37  +41  +43  +47  +53  +59  +61  +67
  +71  +73  +79  +83  +89  +97 +101 +103 +107 +109
 +113 +127 +131 +137 +139 +149 +151 +157 +163 +167
 +173 +179 +181 +191 +193 +197 +199 +211 +223 +227
 +229 +233 +239 +241 +251 +257 +263 +269 +271 +277
 +281 +283 +293 +307 +311 +313 +317 +331 +337 +347
 +349 +353 +359 +367 +373 +379 +383 +389 +397 +401
 +409 +419 +421 +431 +433 +439 +443 +449 +457 +461
 +463 +467 +479 +487 +491 +499 +503 +509 +521 +523
 +541 +547 +557 +563 +569 +571 +577 +587 +593 +599
 +601 +607 +613 +617 +619 +631 +641 +643 +647 +653
 +659 +661 +673 +677 +683 +691 +701 +709 +719 +727
 +733 +739 +743 +751 +757 +761 +769 +773 +787 +797
 +809 +811 +821 +823 +827 +829 +839 +853 +857 +859
 +863 +877 +881 +883 +887 +907 +911 +919 +929 +937
 +941 +947 +953 +967 +971 +977 +983 +991 +997
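For comparison, here is a rough modern translation of the same trial-division scheme in Python. The function name and structure are mine, not from the 1961 listing; like the Algol version, it seeds the table with 1 as entry number one.

```python
def sieve(nmax):
    """Trial division in the spirit of CACM Algorithm 35: test each odd n
    against the odd primes found so far, up to sqrt(n)."""
    primes = [1, 2, 3]  # the original listing seeds the table with 1, 2 and 3
    for n in range(5, nmax + 1, 2):
        i = 2  # index of the first odd prime, 3
        while primes[i] * primes[i] <= n:
            if n % primes[i] == 0:
                break  # a prime divides n evenly, so n is composite
            i += 1
        else:
            primes.append(n)  # no divisor found up to sqrt(n)
    return primes
```

Running `sieve(1000)` reproduces the table above: 1, the prime 2, and the 167 odd primes below 1000.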
Language design in the 1960’s was dominated by attempts to improve upon Algol60. Hoare noted in 1973 [1] that ALGOL-60 was “… a language so far ahead of its time, that it was not only an improvement on its predecessors, but also on nearly all its successors”. Niklaus Wirth was a graduate student at University of California, Berkeley when he started playing around with languages. He joined a research group which was engaged with the implementation and improvement of a dialect of ALGOL-58, NELIAC. He described the compiler as “an intricate mess“, and the process one of “1% science, and 99% sorcery” [1].
The first language Wirth designed leading to his dissertation was Euler, as Wirth himself put it “a trip with the bush knife through the jungle of language features and facilities” [1]. Euler had academic elegance, but no real practical value, however it did catch the attention of the IFIP Working Group, engaged in designing the successor to ALGOL-60. There seemed to be two camps here, one which wanted to push the boundaries of language design and another which wanted to extend ALGOL-60. Wirth belonged to the latter group. In 1965, three reports by Wirth, Seegmüller, and Wijngaarden described three different quasi-complete languages. The complexity of the design process, involving far too many people, eventually led Wirth to become disheartened with the design process, and he went off to develop his own version of ALGOL-60s successor. Designed with contributions from Tony Hoare, this language would become ALGOL-W.
Wirth’s first significant language was PL360, a byproduct of the ALGOL-W effort. The IBM 360 upon which ALGOL-W was implemented offered the choice of assembly language or Fortran compilers, neither of which was very attractive. PL360 [2] was a tool with which to implement ALGOL-W. ALGOL-W had a number of applications, but was deficient as a systems programming language. PL360 was to become more successful than ALGOL-W, largely because ALGOL-W was a complex language, and the target computers inadequate. In the fall of 1967, Wirth returned to Switzerland to begin work on the language that would become most closely associated with his language design efforts – Pascal. Wirth would go on to create a trinity of languages, neither of which were that closely related, but all that had one thing in common, they were “ALGOL-like” languages – Pascal, Modula-2 and Oberon.
Pascal – Based largely on Algol W, including the use of its while and case statements, and record structures. There were syntactic differences; however, Algol 60 was almost a subset of Pascal.
Modula-2 – Wirth noted that Modula-2 “includes all aspects of Pascal, and extends them with the module concept”.
Oberon – Evolved from Modula-2 by very few additions, and several subtractions.
Note that "ALGOL-like" really implied – a procedural paradigm, a rigorously defined syntax, traditional mathematical notation (without the nonsense of symbols like ++), block structure providing identifier scope, the availability of recursion, and a strict, static data typing system. [3]
[1] Wirth, N., “From programming language design to computer construction”, CACM, 28(2), pp. 160-164 (1985)
[2] Wirth, N., “PL360, a programming language for the 360 computers”, Journal of the ACM, 15(1), pp.34-74 (1968)
[3] Wirth, N., “Modula-2 and Oberon”, in ACM Conf. on History of Programming Languages, pp.3-1-3-10 (2007)
The cool thing about some languages is that you don’t have to be a guru in order to interpret them.
Take Algol 60 for example. Algol wasn't the best language in the world, but it had some nice features. For example, you could use an if expression within an assignment to compute a value. Here's an example:
postage := if weight ≤ 20 then 40
           else if weight ≤ 125 then 80
           else if weight ≤ 500 then 120
           else 200;
Okay, some people will say “But C can do that too!” Yes it can, but certainly not as elegantly. Here is the C expression, using a ternary operator.
postage = (weight <= 20)  ? 40
        : (weight <= 125) ? 80
        : (weight <= 500) ? 120
        : 200;
It basically embeds a ternary conditional each time the “else if” is invoked. It just isn’t very elegant to look at, let alone write, or understand.
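The same idea reads more naturally in languages that kept conditional expressions readable. As a sketch, here is the postage example in Python, whose chained `x if cond else y` form is a fairly direct descendant of Algol 60's conditional expression (the thresholds are the ones from the example above):

```python
def postage(weight):
    # Algol 60's "if ... then ... else" expression maps onto Python's
    # "x if cond else y" form, chained just like the original
    return (40 if weight <= 20
            else 80 if weight <= 125
            else 120 if weight <= 500
            else 200)
```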
After the freewheeling fifties, programming languages entered a more bohemian time, with many languages appearing, and those 1950s languages evolving, as new programming ideologies evolved. First out of the gate was ALGOL 60 which evolved from the fledgling Algol 58. It was common practice to “re-design” a language quite considerably, rather than make tweaks to an existing language. As such languages such as Algol 58 often fell out of favour as people adopted newer versions of a language. Algol was to spawn what would eventually become quite an issue in the computing community: augmentations, extensions, and derivations, leading to numerous dialects of a language. ALGOL 68 appeared just before its contemporaries C and Pascal, however due to inherent complexities, never really hit it off. (ALGOL 68 was used by European defense agencies, however the US decided to hedge its bets with Ada.) There was some controversy over the design of a successor to Algol 60, with Wirth and Hoare going on to develop their own successor, Algol-W. Algol 68 was a major revision to Algol 60, whereas Algol-W included more subtle changes. The 1960’s also saw a number of updates to both Fortran and Cobol.
Languages had also begun to diversify into differing realms. In 1964, Kenneth E. Iverson introduced APL, (A Programming Language) a concise symbol-based language adept at dealing with arrays. Early versions of the language contained no control structures, and whilst Cobol may have strayed too far into “English-language” syntax, APL may have gone too far with its mathematical syntax. APL was popular with those doing computer graphics, but its use declined in the 1980s, due in part to the advent of languages such as MATLAB and GNU Octave. The 1960s also saw the second generation of programming languages evolve, those influenced by the likes of Fortran or Algol. In 1964 PL/I (Programming Language One) made its appearance, supposedly for use in data processing, numerical computation, scientific computing, and systems programming. However it was a more complex language than either Fortran or Cobol, from which it had evolved, and was not extremely successful. Algol also influenced the design of Simula, the first OO language which appeared in 1967. In an educational context, Logo appeared in 1967, famous for its use of “turtle graphics”, influenced by Lisp.
P.S. If you want to have some fun with a Logo interpreter, check this out.
ALGOL 60 at 60: The greatest computer language you've never used and grandaddy of the programming family tree
The Register Forums – https://forums.theregister.com/forum/all/2020/05/15/algol_60_at_60/
"https://forums.theregister.com/design_picker/215565f179ecd9dd919cfd297e581ca79d1eed05/graphics/icons/silver_badge.png",
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png",
"https://forums.theregister.com/design_picker/215565f179ecd9dd919cfd297e581ca79d1eed05/graphics/icons/silver_badge.png",
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/comment/thumb_up_48.png",
"https://forums.theregister.com/design_picker/215565f179ecd9dd919cfd297e581ca79d1eed05/graphics/icons/silver_badge.png",
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png",
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png",
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/comment/anonymous_48.png",
"https://forums.theregister.com/design_picker/215565f179ecd9dd919cfd297e581ca79d1eed05/graphics/icons/bronze_badge.png",
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png",
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png",
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png",
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png",
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png",
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/comment/anonymous_48.png",
"https://forums.theregister.com/design_picker/215565f179ecd9dd919cfd297e581ca79d1eed05/graphics/icons/silver_badge.png",
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png",
"https://forums.theregister.com/design_picker/215565f179ecd9dd919cfd297e581ca79d1eed05/graphics/icons/silver_badge.png",
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png",
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png",
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png",
"https://forums.theregister.com/design_picker/215565f179ecd9dd919cfd297e581ca79d1eed05/graphics/icons/silver_badge.png",
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png",
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png",
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png",
"https://forums.theregister.com/design_picker/215565f179ecd9dd919cfd297e581ca79d1eed05/graphics/icons/silver_badge.png",
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png",
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png",
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png",
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png",
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png",
"https://forums.theregister.com/design_picker/215565f179ecd9dd919cfd297e581ca79d1eed05/graphics/icons/silver_badge.png",
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png",
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png",
"https://forums.theregister.com/design_picker/215565f179ecd9dd919cfd297e581ca79d1eed05/graphics/icons/silver_badge.png",
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png",
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/comment/anonymous_48.png",
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png",
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/comment/unhappy_48.png",
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/comment/happy_48.png",
"https://forums.theregister.com/design_picker/215565f179ecd9dd919cfd297e581ca79d1eed05/graphics/icons/silver_badge.png",
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png",
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/comment/windows_48.png",
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png",
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png",
"https://forums.theregister.com/design_picker/215565f179ecd9dd919cfd297e581ca79d1eed05/graphics/icons/silver_badge.png",
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png",
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png",
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/comment/anonymous_48.png",
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png",
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png",
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/comment/angel_48.png",
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png",
"https://forums.theregister.com/design_picker/215565f179ecd9dd919cfd297e581ca79d1eed05/graphics/icons/silver_badge.png",
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png",
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png",
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/comment/anonymous_48.png",
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png",
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/comment/go_48.png",
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png",
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png",
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/comment/thumb_up_48.png",
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png",
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/comment/anonymous_48.png",
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png",
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png",
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png",
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png",
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png",
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png",
"https://forums.theregister.com/design_picker/215565f179ecd9dd919cfd297e581ca79d1eed05/graphics/icons/bronze_badge.png",
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png",
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png",
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png",
"https://forums.theregister.com/design_picker/215565f179ecd9dd919cfd297e581ca79d1eed05/graphics/icons/silver_badge.png",
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png",
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png",
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png",
"https://forums.theregister.com/design_picker/215565f179ecd9dd919cfd297e581ca79d1eed05/graphics/icons/bronze_badge.png",
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png",
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/comment/anonymous_48.png",
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/comment/thumb_up_48.png",
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/comment/pint_48.png",
"https://forums.theregister.com/design_picker/215565f179ecd9dd919cfd297e581ca79d1eed05/graphics/icons/silver_badge.png",
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png",
"https://forums.theregister.com/design_picker/215565f179ecd9dd919cfd297e581ca79d1eed05/graphics/icons/bronze_badge.png",
"https://forums.theregister.com/design_picker/215565f179ecd9dd919cfd297e581ca79d1eed05/graphics/icons/bronze_badge.png",
"https://forums.theregister.com/design_picker/215565f179ecd9dd919cfd297e581ca79d1eed05/graphics/icons/bronze_badge.png",
"https://forums.theregister.com/design_picker/215565f179ecd9dd919cfd297e581ca79d1eed05/graphics/icons/bronze_badge.png",
"https://forums.theregister.com/design_picker/215565f179ecd9dd919cfd297e581ca79d1eed05/graphics/icons/silver_badge.png",
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/comment/anonymous_32.png",
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/comment/thumb_up_32.png",
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/comment/thumb_down_32.png",
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/comment/stop_32.png",
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/comment/go_32.png",
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/comment/wtf_32.png",
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/comment/fail_32.png",
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/comment/happy_32.png",
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/comment/meh_32.png",
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/comment/unhappy_32.png",
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/comment/angel_32.png",
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/comment/devil_32.png",
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/comment/flame_32.png",
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/comment/mushroom_32.png",
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/comment/megaphone_32.png",
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/comment/trollface_32.png",
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/comment/facepalm_32.png",
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/comment/holmes_32.png",
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/comment/headmaster_32.png",
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/comment/coat_32.png",
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/comment/joke_32.png",
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/comment/pint_32.png",
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/comment/coffee_keyboard_32.png",
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/comment/alert_32.png",
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/comment/black_helicopters_32.png",
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/comment/boffin_32.png",
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/comment/pirate_32.png",
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/comment/alien_32.png",
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/comment/terminator_32.png",
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/comment/big_brother_32.png",
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/comment/childcatcher_32.png",
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/comment/linux_32.png",
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/comment/gimp_32.png",
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/comment/windows_32.png",
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/comment/it_angle_32.png",
"https://forums.theregister.com/design_picker/d518b499f8a6e2c65d4d8c49aca8299d54b03012/graphics/icon/vulture_white.png",
"https://forums.theregister.com/design_picker/d2e337b97204af4aa34dda04c4e5d56d954b216f/graphics/icon/footer_mob_nav_arrow_black.svg",
"https://forums.theregister.com/design_picker/d2e337b97204af4aa34dda04c4e5d56d954b216f/graphics/icon/footer_mob_nav_arrow_black.svg",
"https://forums.theregister.com/design_picker/d2e337b97204af4aa34dda04c4e5d56d954b216f/graphics/icon/footer_mob_nav_arrow_black.svg",
"https://forums.theregister.com/design_picker/d2e337b97204af4aa34dda04c4e5d56d954b216f/graphics/std/sitpublogo_2022.png",
"https://forums.theregister.com/Design/graphics/std/transparent_pixel.png"
Reader comments on "Algol 60 at 60", The Register, 2020-05-15
Source: https://forums.theregister.com/forum/all/2020/05/15/algol_60_at_60/
Commenters include: Anonymous Coward, Jason Bloomberg, Ron Martin, Ken Moorhouse, disgruntled yank, Primus Secundus Tertius, Doctor Syntax, shawn.grinter, Graham Cobb, John Sturdy
No love for CORAL 66?
Back in the day, in order to get a military application running, I had to take the assembler output of a Coral 66 program and basically rewrite it completely in proper assembler, because the compiler wasn't merely not optimised, it seemed positively designed to waste as many CPU cycles as possible. It used the 9989 microprocessor, and completely ignored the 9989's register windowing system, instead creating boilerplate functions every single time.
Kids today... I was the one feeding the punched tape repeatedly through the reader to get the floating-point libraries to work. Arguments over bracket syntax, tabs versus spaces, and whether to use semicolons are really a sign that nowadays there is very little to complain about.
It started my career move...
We had an Elliott 803 at Rugby College (later absorbed into Lanchester Poly). Although I was studying Applied Physics, the computer part of the course fired me up and I went into computing from that point on. I remember the console speaker that, although abused to play music, did give you a sense of what your program was doing. Later on, in my 6800/8008/8080 hobby days, I used a transistor radio for the same thing.
There were two no-no's: The first was to remember how little memory you had to play with so large arrays were not possible and, if attempted, resulted in a subscript overflow message (iirc - SUBSCROFLO). The second was to make sure your plotter programmes completed. The plotter involved an extra paper tape load for the operator. If your plot failed then the whole machine had to be restarted. The plotting code was probably an early form of overlay.
Before we had magnetic tape installed the paper tape reader was something to behold. The output from the reader had to be caught in a basket as the speed was so high. The computer operator was a very smart young lady too.... Happy days. ;-)
I've used Algol-60 at school
I did my AO-Level Computer Studies in 1982 or 1983 (can't remember!) and I did my project in Algol-60. Because that's what we ran on our Research Machines 380Zs. It was a program to solve the N-Queens problem, by generating the different permutations of the numbers 1-N, and I did it using recursion (because I could). Little did I realise at the time that it was really out of date, and there's no reason we couldn't have done it in Pascal or C. But that's all we had.
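That permutation approach still holds up. A compressed sketch in modern Python, using itertools rather than hand-rolled recursion, so this is a paraphrase of the idea and not the original Algol-60 program:

```python
from itertools import permutations

def n_queens_count(n):
    """Count N-Queens solutions by filtering permutations.

    A permutation of 0..n-1 already guarantees one queen per row and
    per column; only the two diagonal directions remain to be checked.
    """
    count = 0
    for perm in permutations(range(n)):
        # perm[i] + i is constant along one diagonal direction and
        # perm[i] - i along the other; all n values must be distinct
        # for a conflict-free placement.
        if len({perm[i] + i for i in range(n)}) == n and \
           len({perm[i] - i for i in range(n)}) == n:
            count += 1
    return count
```

For the classic 8x8 board this finds the well-known 92 solutions.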
Ugh! PASCAL
We used it in school (UMASS/Amherst) as a teaching language. Too "wordy" for me, especially the IO statements. But it did get the job done.
When I started working, it was at Data General, who had an Algol-like language called DG/L, which I absolutely loved. It was my language of choice for little utility programs on our AOS and AOS/VS systems.
Then, of course, came the Sun workstations, with UNIX and C.
Game over.
Bonus for the CALCOMP drum plotters. I worked part time at the computer center in school; we had one (3 or 4 feet wide). The number of aborted plots due to pen failure was astounding, as was the pen budget!
Algol 68 is not ALGOL 60
Algol 68 is basically a different language from Algol 60, and the Algol 60 designers, most notably Dijkstra, were less than impressed by it; along with others on the committee they issued a "minority report" disowning Algol 68. Syntax aside, and leaving out the object-oriented stuff, Algol 68 actually bears a distinct resemblance to C++. Algol 60, on the other hand, begat Pascal. So really they are different languages sharing part of a name.
I never got to use Algol 68 because it was difficult to write a compiler for and nowhere I worked had one. Burroughs Algol (a variant of ALGOL 60), on the other hand, was available. Time pressure, and perhaps a desire not to be seen to know too much about the "old iron", meant I never did write any Burroughs Algol. That was perhaps my loss. I don't think I missed anything by not writing Algol 68.
BTW Burroughs Algol was implemented in hardware. It was a stack machine with display registers and hardware support for resizing arrays.
ALGOL 60 was the first language I learned, at school in the Computer Club. Turnaround was a bit longer than ten minutes. We used to write programs on coding sheets, which were then taken to the nearby university and typed up by data-prep ladies, run by the operators on the ICL 1909, and the coding sheets, pack of cards and printout were returned to us a week later. That did make you concentrate fairly hard on program correctness; my first program calculated primes and I've still got the output somewhere. After a while a friend and I learned that we could get off the bus on the way home, walk up to the university and punch our own cards on IBM 029 punches, and then watch while the operators ran the program. That made things a lot faster and meant we could write bigger programs - my favourite was a linear regression program for the results of our physics experiments. It made them look much more 'official' :)
And then we discovered the university had a free-access PDP-8, so we learned BASIC on DECtape and the joys of typing into an ASR-33 and Friden Flexowriters. And then ...
... the university got a copy of the POP-2 compiler. Still my favourite language ever! Programs to synthesise English using Chomsky's grammars etc etc. Machine Intelligence 1, 2, 3 ...
My first computer language...
... was Algol W, at university in the mid-70s. Sequence, selection and iteration - what more do you need? :) It set the way for how I think about programming, and I found it difficult to think in other ways. For example, we later covered the functional programming language SASL (a precursor of Miranda, which in turn preceded Haskell), and I found it almost impossible to think that way, until one day the penny simply dropped, and I had no problems after that.
For Raymond, up above, I still have my copy of the MoD "Blue Book" standard for CORAL 66. Used that at university on a CTL mini, and in professional life on Ferranti Argus and CTL/ITL minis.
My real love was Pascal, and I remember how chuffed I was the day I discovered that although I had been using it for many months, I hadn't realised that our implementation of it did not include GOTO statements! I still have a free implementation of it on my home PC - very useful for occasional puzzle solving.
Others I used included Fortran IV and BCPL at university, and ASM-86, PL/M-86, and PL/I in professional life.
But then I became a consultant :)
Early Uni Computing
Back in the early 70's Bangor University's Computing Department kept what had been the previous workhorse, an 803, in a back room after migrating to an ICL 4130. The computing department manager would oversee a rigorous test on the 803 operating procedures before allowing any member of the great unwashed student body unsupervised access to said machine. I've still got my copy of the Algol A104 compiler tape in a drawer somewhere. As an introduction to the wide world of programming this was an ideal education in small-memory management and code optimisation, useful on my next processor, an Intel 4040. How did the compile procedure go? Power supply to 'on', tape in reader, press the control button to 'read', hit the 'operate' bar, press the control button back to 'normal', hit the 'operate' bar... And listen to the soothing chatter of the loudspeaker connected to the overflow bit... Kids these days, they don't believe you!
ALGOL 60 was the first language I learned, in 1973. At Warwick Uni, they reckoned that virtually any science undergrad would need computer skills, so this was a compulsory course.
They ran an ICL 4130 and a 4120 at the time - designed by Elliott, so presumably somebody at Elliott was an ALGOL fan.
Never used it in any commercial setting, but it laid the basis for the structure used in any number of languages since. When writing COBOL, I often sketched things out using ALGOL to check things made sense before getting the coding sheets dirty.
Who remembers Backus-Naur ?
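For those who need the reminder: Backus-Naur Form arrived with the ALGOL 60 report itself, and every grammar written since owes it a debt. A couple of productions in the style of the report:

```
<digit> ::= 0 | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9
<unsigned integer> ::= <digit> | <unsigned integer> <digit>
```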
Not quite how I remember it
"you basically march up to the machine, the machine's got the ALGOL system loaded, you run your programme, it produces gibberish"
When I was in the sixth form, one of my maths teachers had a contact at the local Poly. We were low priority, so we got to leave our precious punched tapes with the operator, who would run them for us in time for us to pick up the gibberish the next day.
Needless to say I thought computers were complete bollocks.
A couple of years later when I was at University, and they had these lineprinter style terminals, I started to think wow, maybe there's something in this; soon after I got access to a green screen terminal.
The rest is history.
Fall 1965: Gunn High School in Palo Alto, CA... the school district's first 'computer' class. (Perhaps one of the first HS classes anywhere?)
They only had a keypunch machine at the school... and at the first class admitted they were still looking for a computer to use, as the District had quashed plans to allow their own in-house computer to be used by [gasp] students!
Next class, they announced they had FOUND a computer very nearby:
The Burroughs B5500 mainframe at SLAC!!! (Stanford Linear Accelerator Center).
No problem for students there!!! We ran our class exercises in B5500 ALGOL.
Early Algol68 Compilers
In the early 1980's UCLA had an active group of graduate students working on a compiler for Algol 68 for the IBM 360/91 under Robert Uzgalis. The book "Informal Introduction to ALGOL 68" was the Bible for the language semantics. The book created its own set of specialized words, like propositity and lapsity, to describe the language semantics in precise terms. The underlying memory model was the stack retention model, with a variant called the cactus stack for its threaded nature. The language semantics, referred to in previous posts as showing European sensibilities, were quite well done. The report lacked a machine description in favor of very precise descriptions of the language semantics; very worthwhile reading for those who like reading compiler semantics by jumping into the way-back machine. As Joe Bob Briggs used to say, "check it out".
Algol 68 on an ICL 4100
When at Warwick University in 1974 I was using Algol 68 for my computing project. My maze solver did eventually work, though as mentioned the bug cycle was more like one day: you had to post your punch-card stack into the pigeon hole of the computer suite and wait for them to process it. I found by trial and error that the reason my maze solver never ran correctly on Friday afternoons was that they used to take half of the magnetic memory offline to clean it or something. The largest maze I could create on a Friday was 3 x 3!
Algol 60 on the HP 2100A
Well, I've used it: it was the best language to use on the HP 2100A, because you needed only one paper tape, since Algol 60 was designed to work with a single-pass compiler. Apart from it being a better language anyway.
FORTRAN IV, as it was then, needed two paper tapes. So, even for a little FORTRAN program, you had to load the first-pass compiler tape (then wind it up on its spool), then put your program tape through and get the intermediate tape. You'd then load the second compiler paper tape, and feed it the intermediate tape. It would then produce the binary. If anything went wrong, you had to start again.
Even then proper design made a huge difference. Declaring your variables first, and declaring functions before you use them, makes good sense anyway, and has the excellent side effect of permitting one-pass compilation.
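That side effect is worth spelling out: if every identifier is declared before its first use, the translator never has to back-patch, so a single forward sweep suffices. A toy illustration of the idea (a hypothetical mini-language with decl/use statements, nothing to do with the actual HP compiler):

```python
def one_pass_check(lines):
    """Single forward pass over a toy 'decl x' / 'use x' language.

    Because declarations must precede uses, one symbol-table sweep is
    enough to validate the program - the essence of one-pass compilation.
    """
    declared = set()
    for lineno, line in enumerate(lines, start=1):
        words = line.split()
        if words[0] == "decl":
            declared.add(words[1])
        elif words[0] == "use" and words[1] not in declared:
            # A one-pass translator must reject this immediately;
            # it has no later pass in which to resolve the name.
            return "line %d: '%s' used before declaration" % (lineno, words[1])
    return "ok"
```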
The influence of ALGOL 60
You can see the influence of ALGOL 60 on modern languages most clearly if you compare programs written in 1960 versions of ALGOL, FORTRAN, COBOL, and LISP (which were the most widespread languages at the time). The ALGOL 60 program will (for the most part) be readily readable by someone who has learned C, Java, or C# and nothing else. Understanding the FORTRAN, COBOL, or (in particular) LISP programs would require a good deal of explanation, but understanding the ALGOL 60 program would mainly be realising that begin, end, and := correspond to curly braces and = in C, Java, and C#. Look, for example, at the Absmax procedure at https://en.wikipedia.org/wiki/ALGOL_60#Examples_and_portability_issues
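To see the point concretely, here is the job that Absmax procedure does, written in modern Python. This is a sketch only: ALGOL 60's call-by-name output parameters are replaced here by a returned tuple.

```python
def absmax(a):
    """Largest absolute value in matrix a, plus the indices where it occurs.

    Structurally this is the classic ALGOL 60 Absmax example: two nested
    loops, a comparison, and an update - readable by anyone who knows
    C, Java, or C#.
    """
    y, yi, yk = 0, 0, 0
    for i, row in enumerate(a):
        for k, v in enumerate(row):
            if abs(v) > y:
                y, yi, yk = abs(v), i, k
    return y, yi, yk
```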
FORTRAN and COBOL continued to evolve into something completely unlike their 1960 forms while still retaining their names, even up to today. ALGOL mainly evolved into languages with different names, such as Pascal, C, CPL, Simula and many others. So ALGOL is not really any more of a dead language than FORTRAN II and COBOL 60. A computer scientist in the late 60s was asked "What will programming languages look like in 2000?". He answered "I don't know, but I'm pretty sure one of them will be called FORTRAN". This was a pretty good prediction, as Fortran (the only name change being the dropping of the all-caps) still exists, but looks nothing like FORTRAN 66, which was the dominant version at the time. You can argue that the modern versions of FORTRAN and COBOL owe more to ALGOL 60 than they do to the 1960 versions of themselves.
My first programming language was Burroughs Extended Algol 60, on a B5000 in 1965. Burroughs always extended the base language. The beauty of the Burroughs extensions is that their ALGOL 60 was wedded to the hardware design, and their compilers were all written in the language they compiled. At that time it was ALGOL, COBOL and ESPOL, a superset of ALGOL used to program the MCP. I did in fact program the MCP. In ESPOL one had wide-open access to the hardware itself.
The B5000 hardware had an interesting limitation: code was generated into program segment strings which were block-limited to 1024 words. A word was 48 bits. Why? The program segment string S register was 10 bits. If, say, a BEGIN code END; block exceeded 1024 words, the compiler yielded a "program segment string exceeds 1024" error. This of course nailed me in my first program. Easy enough to repair.
The B5000/B6000 series also had virtual memory. Program segment strings need not be memory resident, and the 1024-word segments could be viewed as VM. Each program segment string terminated in a descriptor that referenced the Program Reference Table (PRT). This referenced location could be a disk address used to access the next segment. Etc.
The B6000 series (I worked on the B6500) generalized the PRT into what was called the cactus stack. This enabled the extended version of ALGOL 60 to have multi-dimensional arrays of tasks. Super feature. An array element had a parent node that could link to family member tasks. The first time I used the feature I had a "death in the family" error. This meant that a member of the family hierarchy had crashed.
I can go on and on about this amazing hardware/ALGOL fusion. I have to say it's unforgettable since I'm soon to be 80 years old. I've written software in some 20 or so languages during my career as a computer scientist, and Burroughs extended ALGOL 60 is definitely the mid-wife of almost all of them.
Bill
Algol was OK, but all wasn't plain sailing.
I programmed a lot in Algol 60 between 1967 and 1971 at the University of Salford on the English Electric KDF9, doing research into computer-aided circuit design using Y matrices. When I started, paper tape was in use, created and edited on Friden Flexowriters. My program was fairly large for the time: a relatively large roll of tape. There were two ways of changing the program. You could put the paper tape in the Flexowriter and set it to copy up to the line that you wanted to change, type in the changes, re-sync the rest of the tape and let it copy the remainder; or get out the little tape-splicing machine and hack out a section, gluing your amendment in. Compiling and running happened effectively twice per day, since you had to write out a job slip and put your tape in a box for the computer operators to load, compile and execute. Then - REVOLUTION - the KDF9 was front-ended with a PDP-8 running a system called COTAN (I faintly recall, from more than half a century ago). This had a disc where you could keep your program and edit it using KSR33 teleprinters (if you were patient), then submit your job electronically, no less.
Unfortunately the COTAN system had a sort of anti-Algol design: it was based on 80-column card images, and students were allocated a certain (insufficient) maximum number of cards. Algol 60 was not Fortran, so my beautiful indented layout of "begin"s and "end"s on the paper tape, which used quite a lot of blank characters, didn't fit in the allocated card space. The answer was to eliminate most of the comments and the blanks. This resulted in a more or less "rectangular" program of solid text 80 characters wide - not that the Algol compilers cared, but the very devil to read, edit, and debug.
I still have some printouts (somewhere) of the resulting mess together with some of the little boxes originally used to submit the paper tape jobs.
I don't hunger after the "old days" of computing.
A lot of people program in ALGOL 60
There is a lot of programming done in ALGOL 60 on Burroughs machines (now Unisys MCP machines). Burroughs extended the language with IO, some an extension of FORTRAN-style IO, but most programmers did their own direct formatting, since the FORTRAN IO was interpreted and slow.
I believe Burroughs ALGOL was based on Elliot ALGOL. I later worked with another Burroughs guy who developed a language based on Elliot ALGOL for the Apple II. This was at a company called Netcomm in Australia.
Actually Don Knuth wrote one of the first Burroughs ALGOL compilers on a summer break as a student. It was on the B200 (from memory), but it predated the B5000 ALGOL compiler.
Burroughs ALGOL is a really heavy-duty systems language that makes C look like the toy that it is. It is a shame that C has effectively killed language development. If ALGOL or its next generation (ALGOL-68, ALGOL-W, Pascal, CPL, etc) had continued, we’d probably have pretty solid languages by now, rather than the rather flimsy C.
1965
Having done my degree at Woolwich Polytechnic (Tommy Flowers was there long before me!) from 1965 onwards while working as a rocket scientist at the MoD, I was one of the early programmers who never actually went into the computer room where the program was run. I simply submitted my Algol programs on coding sheets, which would be run and sent back. I did occasionally see the computer through a window!
I then taught Algol at Braintree college on an Elliott 803.
Nothing compares with MIRFAC, though, probably the most sophisticated language ever, running on the MoD's COSMOS computer. Simon Lavington reviews this machine.
Re: Facebook simply would not exist today if not for Bletchley Park
Apart from Colossus, Bletchley Park had nothing to do with electronic digital computers. All data analysis was done on Hollerith-type card machines.
Actually Britain was far ahead of the US in developing commercial computers. Just look at the history of LEO. Britain also pioneered practical computers such as EDSAC, the forerunner of all today's computers, which can be seen at The National Museum of Computing, located on the Bletchley Park estate. Spend 3 hours there and see the Harwell Dekatron - the world's oldest original working computer.
While there see a working Enigma/Bombe and the Lorenz/Colossus.
My advice is to see TNMoC first, then pop down to Bletchley Park for a brief tour to see the Mansion and learn other interesting facts about its role in the Second World War.
Source: The Register, "ALGOL 60 at 60: The greatest computer language you've never used and grandaddy of the programming family tree", by Richard Speed, 15 May 2020 ("Back to the time when tape was king").
https://www.theregister.com/2020/05/15/algol_60_at_60/
2020 marks 60 years since ALGOL 60 laid the groundwork for a multitude of computer languages.
The Register spoke to The National Museum of Computing's Peter Onion and Andrew Herbert to learn a bit more about the good old days of punch tapes.
ALGOL 60 was the successor to ALGOL 58, which debuted in 1958. ALGOL 58 had introduced the concept of code blocks (replete with begin and end delimiting pairs), but ALGOL 60 took these starting points of structured programming and ran with them, giving rise to familiar faces such as Pascal and C, as well as the likes of B and Simula.
"In the 1950s most code was originally written in machine code or assembly code," said Herbert, former director of Microsoft Research in Cambridge, with every computer having its own particular twist on things. A first generation of languages, called "Autocode", existed for coding problems like equations which could then be translated into machine code, but lacked the bells and whistles of today. Worse, some had features that others lacked, making hopping between systems tricky.
"There was an Autocode for the [Elliott] 803," said Onion, "but it only supported expressions like A + B = C, so if you've got a complex equation, you have to break it down into individual single binary operations. So there was still a lot of hard work to be done by the programmer."
"Fortran," said Herbert, "emerged as the first real programming language for scientific and numeric work. That convinced people that having higher-level languages (as they called them then – they were pretty primitive by modern standards) made programmers more productive."
The overhead of compiling, and inefficiencies in the compilers themselves, meant that machine code remained king of the performance hill, but for those doing science work, the ability to churn out some code to solve a problem and then simply move on to the next was appealing.
"Fortran," Herbert continued, "was more like an autocode," before laughing, "It still is in some ways!"
"And a bunch of people thought you could do better."
Enter the International Federation for Information Processing (IFIP), which Herbert recalled "had a whole bunch of committees who looked at standards and problems in computing".
One group started on the design of what was then called an "Algorithmic Language": a language for writing algorithms. The output, in 1958, described the language "ALGOL 58". However, as engineers began to create compilers for the new system, they found "all kinds of things hadn't really been thought about or worked through properly," recalled Herbert.
And so there were revisions and changes. A periodical called "The ALGOL Bulletin" detailed the travails of those involved as the problems and the weaknesses in the language were dealt with (or at least attempted).
The process was not unlike an open-source mailing list today, but in paper form.
Eventually, Herbert told us, "they published the ALGOL 60 report, which is the baseline that everyone then worked to."
The committees were under pressure and also suffered a little from differing international approaches. The American side had a lot of experience in Fortran and were seeking something that could quickly be made to work on their computers, while the Europeans were a little more cerebral and had, Herbert laughed, "terrible notions like beauty and elegance in mind for the language".
"People were sorting out some of the things that we now take for granted like ideas in structured programming, data structures, data types," he added.
Seeking solutions to the problem of portability of programmers between systems and code between hardware generations as well as avoiding the pain of having to rewrite programs every time a new iteration of computer arrived, vendors embraced the language with variants cropping up over many manufacturers.
Alas, those seeking a handy-dandy "HELLO WORLD" example will be disappointed. The Achilles' heel of the language that would go on to inspire so many others was that it lacked standard input/output capabilities.
"The defining committee couldn't agree on how to do input/output," said Herbert. "They decided that would be left to a library, and that library would be user dependent."
"In this case," added Onion, "the user being the compiler writer."
Oh dear. The omission pretty much did for vendor independence as manufacturers naturally went their own way, leaving large chunks of code incompatible between systems. There were also elements of ALGOL 60 that were open to interpretation, leaving it a little compromised from the start.
While ALGOL ploughed its furrow, Fortran continued to be developed in parallel. "People in the Fortran world," explained Herbert, "saw ideas in ALGOL they quite liked and brought them across." As the decades passed, Fortran remained the centre of gravity for scientific computing while ALGOL became more of an academic language, used for teaching computer science ideas.
"It was quite heavily used in the scientific community," Herbert said. "Most mainframe manufacturers supported it."
Some of the team behind ALGOL 60 stayed with the project and went on to come up with ALGOL 68, which, as far as Herbert is concerned, "nailed all the things that ALGOL 60 had left a bit vague".
Indeed, it was hard to avoid in the 1970s for those taking computer science courses. This hack has fond memories of the successor language, while the grandfather of Reg sub-editor Richard Currie had a hand in the development of ALGOL 68-R and RS.
"It had the world's most exotic input output system," Herbert laughed.
It was also, sadly for its enthusiasts, a bit of a dead end. Despite ALGOL 68-R becoming widely used in (particularly British) military applications for a time, it would take until the 1970s for a full implementation of ALGOL 68 to become available.
The last edition of The ALGOL Bulletin was published in 1988, with its editor noting: "ALGOL 68 as a language is very stable. It is used and loved by those who understand its benefits, and ignored (or misquoted) by the rest."
The story of ALGOL 60 is not so much of the language's eventual fate, but also of those that it inspired. ALGOL W, based on a proposal for ALGOL X, by Niklaus Wirth and QuickSort creator Tony Hoare would go on to inspire Wirth's Pascal and Modula-2. Pascal's influence continues to be felt today.
ALGOL 60 also heavily influenced the Combined Programming Language (CPL), developed in the 1960s but not implemented until the following decade. CPL in turn led to Basic CPL (BCPL), from which B descended. The B language was further developed to become C.
Tony Hoare was responsible for the implementation of ALGOL 60 on the Elliott 803 computer, an example of which remains operational at The National Museum of Computing, although compiling and running a program on that hardware is a little different to the development environments to which coders are now accustomed.
First, the compiler must be loaded from paper tape. The ALGOL program itself is then fed into the tape reader and "it sort of chunters away," remarked Onion, "for anything between 30 seconds to perhaps 15 or 20 minutes during the compilation."
Behold the chuntering
Once compiled, a program would be free to use the space originally occupied by the compiler. Doing so would, however, not win the programmer any popularity awards since the next user would have to reload the compiler again. Leaving it in memory meant that multiple programs could be run.
"That made it very popular for teaching," said Herbert, "because you can have a line of students, each with their paper tape with their programme in their hand and you basically march up to the machine, the machine's got the ALGOL system loaded, you run your programme, it produces gibberish, you go away and think about it and the next student runs their programme."
With paper tape being king, Onion observed that the experience of programming taught a bit of focus: "When your edit, compile, edit, compile cycle starts to get above about 10 minutes, you start to pay an awful lot of attention to your source code…"
The National Museum of Computing has two Elliott machines in its collection, a 1962 803B (which was donated after spending 15 years lurking in a barn following its decommissioning) and a 903. Both are fully operational and can be seen running once the museum is able to open its doors once again.
The 803B, which is maintained by Onion, also features a Calcomp drum plotter as well as some additional input/output features.
As for taking the ALGOL 60 itself out for a spin today, there are a few options for those not fortunate enough to have an Elliott 803 or 903 to hand. MARST will translate ALGOL 60 to C or one can get a feel for the whole 803 experience via a simulator.
Although as ALGOL 60 turns 60, you could just fire up a modern programming language. Lurking within will likely be the ideas of ALGOL's designers. ®
|
Source: programminglanguages.info, "Fortran Programming Language Information & Resources", by Ramiro Gómez: "Fortran is a third generation, compiled, imperative programming language that is especially suited to numeric computation and scientific computing."
https://programminglanguages.info/language/fortran/
Source: GeeksforGeeks, "The Evolution of Programming Languages", 26 January 2021.
https://www.geeksforgeeks.org/the-evolution-of-programming-languages/
Last Updated : 11 Jul, 2022
Programming languages are the fundamental units of today's tech world: sets of commands and instructions that we give to machines to perform particular tasks. For example, if you give a machine instructions to add two numbers, it will carry them out and return the correct answer. But did you know that programming languages have a long and rich history of evolution? In this article, we'll take a look at how programming languages have evolved over time.
In the computer world we have about 500+ programming languages, each with its own syntax and features. Search for the father of the computer and the result will be Charles Babbage, but the father of the computer didn't write the first code. It was Ada Lovelace who wrote the first-ever computer program, and the year was 1883.
1883: The Journey starts from here…!!
In the early days, Charles Babbage had built the device but was unsure how to give instructions to the machine; Ada Lovelace then wrote the instructions for the Analytical Engine.
The device was designed by Charles Babbage, and the code to compute Bernoulli numbers was written by Ada Lovelace.
It was the first time in history that the capability of a computing device was put to the test.
1949: Assembly Language
It is a type of low-level language.
It consists of mnemonic instructions (symbols) that translate directly into the machine code the processor executes.
Even today, assembly language is used in real-time programs such as flight simulation and navigation systems and in medical equipment, e.g. fly-by-wire (FBW) systems.
It is also used to create computer viruses.
1952: Autocode
Developed by Alick Glennie.
The first compiled computer programming language.
Early compiled languages such as COBOL and FORTRAN grew out of the autocode tradition.
1957: FORTRAN
Developers are John Backus and IBM.
It was designed for numeric computation and scientific computing.
Software for NASA probes voyager-1 (space probe) and voyager-2 (space probe) was originally written in FORTRAN 5.
1958: ALGOL
ALGOL stands for ALGOrithmic Language.
The starting point for popular programming languages such as C, C++, and Java.
It was also the first language to implement nested functions, and it has a simpler syntax than FORTRAN.
The first programming language to have code blocks, with "begin" indicating that a block of your program has started and "end" marking where it finishes.
1959: COBOL
It stands for COmmon Business-Oriented Language.
In 1997, 80% of the world’s business ran on Cobol.
The US Internal Revenue Service scrambled to patch its COBOL-based IMF (Individual Master File) in order to pay out the tens of millions of payments mandated by the Coronavirus Aid, Relief, and Economic Security (CARES) Act.
1964: BASIC
It stands for Beginner's All-purpose Symbolic Instruction Code.
In 1991 Microsoft released Visual Basic, an updated version of Basic
The first microcomputer version of Basic was co-written by Bill Gates, Paul Allen, and Monte Davidoff for their newly-formed company, Microsoft.
1972: C
It is a general-purpose, procedural programming language and remains one of the most popular programming languages to this day.
Much of the code previously written in assembly language was replaced by C, including operating systems, kernels, and many other applications.
It can be used to implement operating systems and embedded systems, and even on websites via the Common Gateway Interface (CGI).
C is the mother of almost all higher-level programming languages like C#, D, Go, Java, JavaScript, Limbo, LPC, Perl, PHP, Python, and Unix’s C shell.
Some other programming languages that are popular among programmers are listed below.
Source: The History of Computing podcast, "ALGOL" (transcript).
https://thehistoryofcomputing.net/website/algol
Today we’re going to cover a computer programming language many might not have heard of, ALGOL.

ALGOL was written in 1958. It wasn’t like many of the other languages in that it was built by committee. The Association for Computing Machinery and the German Society of Applied Mathematics and Mechanics were floating around ideas for a universal computer programming language.

Members from the ACM were a who’s who of people influential in the transition from custom computers the size of small homes to mainframes. John Backus of IBM had written a programming language called Speedcoding and then Fortran. Joseph Wegstein had been involved in the development of COBOL. Alan Perlis had been involved in Whirlwind and was with the Carnegie Institute of Technology. Charles Katz had worked with Grace Hopper on UNIVAC and FLOW-MATIC.

The Germans were equally influential. Friedrich Bauer had brought us the stack method while at the Technical University of Munich. Hermann Bottenbruch from the Institute for Applied Mathematics had written a paper on constructing languages. Klaus Samelson had worked on a computer called PERM that was similar to the MIT Whirlwind project; he’d come into computing while studying eigenvalues. Heinz Rutishauser had written a number of papers on programming techniques and had co-developed the language Superplan while at the Swiss Federal Institute of Technology, where the meeting would be hosted.

They met from May 27th to June 2nd in 1958 and initially called the language they would develop IAL, the International Algebraic Language, but would expand the name to ALGOL, short for Algorithmic Language. They brought us code blocks: the concept that a pair of words or symbols, like begin and end, opens and closes a stanza of code. They introduced nested scoped functions. They wrote the whole language right there. You would declare a variable by simply saying integer, or set one with a := 1. You would set up a for and define the steps to perform until a condition was met - the root of what we would now call a for loop. You could read a variable in from a punch card. It had built-in SIN and COS. It was line based and fairly simple procedural programming by today’s standards. They defined how to handle special characters, and built in boolean operators, floating point notation, even portable types. And by the end they had a compiler that would run on the Z22 computer from Konrad Zuse.

Though some of Backus’ best work, it effectively competed with FORTRAN and never really gained traction at IBM. But it influenced almost everything that happened afterwards. Languages were popping up all over the place, and in order to bring in more programmers, people wanted a formalized way to let languages flourish, with a standardized notation system so algorithms could be published and shared and developers could follow the logic. One outcome of the ALGOL project was the Backus–Naur form, the first such standardization. That would be expanded by the Danish computer scientist Peter Naur for ALGOL 60, thus the name. For ALGOL 60 they met in Paris, adding John McCarthy, Julien Green, Bernard Vauquois, Adriaan van Wijngaarden, and Michael Woodger. The language got refined, yet a bit more complicated.

FORTRAN and COBOL use continued to rage on, but academics loved ALGOL. The original implementation, now referred to as the ZMMD implementation, gave way to X1 ALGOL, Case ALGOL, ZAM in Poland, GOGOL, VALGOL, RegneCentralen ALGOL, Whetstone ALGOL for physics, Chinese ALGOL, ALGAMS, NU ALGOL out of Norway, ALGEK out of Russia, Dartmouth ALGOL, DG/L, USS 90 Algol, Elliott ALGOL, the ALGOL Translator, Kidsgrove Algol, JOVIAL, Burroughs ALGOL, and Niklaus Wirth’s ALGOL W, which led to Pascal, then MALGOL, and the last would be S-algol in 1979.

But it got overly complicated and overly formal. Individual developers wanted more flexibility here and there. Some wanted simpler languages; some needed more complicated ones. ALGOL didn’t disappear so much as evolve into other languages. Those were coming out fast, and with a committee needing to approve changes to ALGOL, it was much slower to iterate.

You see, ALGOL profoundly shaped how we think of programming languages. That formalization was critical to paving the way for generations of developers who brought us future languages. ALGOL would end up being the parent of CPL, and through CPL, BCPL, C, C++, and through that Objective-C. From ALGOL also sprang Simula, and through Simula, Smalltalk. And Pascal, and from there Modula and Delphi. It was only used for a few years, but it spawned so much of what developers use to build software today. In fact, other languages evolved as anti-ALGOL derivatives, looking at how ALGOL did something and deciding to do it totally differently.

And so we owe this crew our thanks. They helped to legitimize a new doctrine, a new career: computer programmer. They inspired. They coded. And in so doing, they helped bring us into the world of structured programming and set foundations that allowed the next generation of great thinkers to go even further, directly influencing people like Adele Goldberg and Alan Kay. And it’s okay that the name of this massive contribution is mostly lost to the annals of history. Because ultimately, the impact is not.

So think about this - what can we do to help shape the world we live in? Whether it be through raw creation, iteration, standardization, or formalization - we all have a role to play in this world. I look forward to hearing more about yours as it evolves!
|
en
|
https://assets.libsyn.com/secure/content/47114459?height=16&width=16
|
https://thehistoryofcomputing.net/website/algol
|
Today we’re going to cover a computer programming language many might not have heard of, ALGOL.
ALGOL was written in 1958. It wasn’t like many of the other languages in that it was built by committee. The Association for Computing Machinery and the German Society of Applied Mathematics and Mechanics were floating around ideas for a universal computer programming language.
Members from the ACM were a who’s who of people influential in the transition from custom computers that were the size of small homes to mainframes. John Backus of IBM had written a programming language called Speedcoding and then Fortran. Joseph Wegstein had been involved in the development of COBOL. Alan Perlis had been involved in Whirlwind and was with the Carnegie Institute of Technology. Charles Katz had worked with Grace Hopper on UNIVAC and FLOW-MATIC.
The Germans were equally as influential. Friedrich Bauer had brought us the stack method while at the Technical University of Munich. Hermann Bottenbruch from The Institute for Applied Mathematics had written a paper on constructing languages. Klaus Samelson had worked on a computer called PERM that was similar to the MIT Whirlwind project. He’d come into computing while studying eigenvalues.
Heinz Rutishauser had written a number of papers on programming techniques and had codeveloped the language Superplan while at the Swiss Federal Institute of Technology. This is where the meeting would be hosted.
They went from May 27th to June 2nd in 1958 and initially called the language they would develop IAL, or the International Algebraic Language, but would expand the name to ALGOL, short for Algorithmic Language. They brought us code blocks, the concept that you have a pair of words or symbols that would begin and end a stanza of code, like begin and end. They introduced nested scoped functions. They wrote the whole language right there. You would name a variable by simply saying integer or setting the variable as a := 1. You would instantiate a for and define the steps to perform until - the root of what we would now call a for loop. You could read a variable in from a punch card. It had built-in SIN and COSIN. It was line based and fairly simple functional programming by today’s standards. They defined how to handle special characters, built boolean operators, floating point notation. It even had portable types.
And by the end they had a compiler that would run on the Z22 computer from Konrad Zuse. While it was some of Backus’ best work, it effectively competed with FORTRAN and never really gained traction at IBM. But it influenced almost everything that happened afterwards.
Languages were popping up all over the place, and in order to bring in more programmers, they wanted a formalized way to allow languages to flourish, but with a standardized notation system so algorithms could be published and shared and developers could follow along with the logic. One outcome of the ALGOL project was the Backus–Naur form, which was the first such standardization. That would be expanded by the Danish computer scientist Peter Naur for ALGOL 60, thus the name.
In ALGOL 60 they would meet in Paris, also adding Father John McCarthy, Julien Green, Bernard Vauquois, Adriaan van Wijngaarden, and Michael Woodger. It got refined, yet a bit more complicated. FORTRAN and COBOL use continued to rage on, but academics loved ALGOL.
And the original implementation, now referred to as the ZMMD implementation, gave way to X1 ALGOL, Case ALGOL, ZAM in Poland, GOGOL, VALGOL, RegneCentralen ALGOL, Whetstone ALGOL for physics, Chinese ALGOL, ALGAMS, NU ALGOL out of Norway, ALGEK out of Russia, Dartmouth ALGOL, DG/L, USS 90 Algol, Elliot ALGOL, the ALGOL Translator, Kidsgrove Algol, JOVIAL, Burroughs ALGOL, Niklaus Wirth’s ALGOL W, which led to Pascal, MALGOL, and the last would be S-algol in 1979.
But it got overly complicated and overly formal. Individual developers wanted more flexibility here and there. Some wanted simpler languages. Some needed more complicated languages. ALGOL didn’t disappear as much as it evolved into other languages. Those were coming out fast and with a committee to approve changes to ALGOL, they were much slower to iterate.
You see, ALGOL profoundly shaped how we think of programming languages. That formalization was critical to paving the way for generations of developers who brought us future languages. ALGOL would end up being the parent of CPL and through CPL, BCPL, C, C++, and through that Objective-C. From ALGOL also sprang Simula and through Simula, Smalltalk. And Pascal and from there, Modula and Delphi. It was only used for a few years but it spawned so much of what developers use to build software today.
In fact, other languages evolved as anti-ALGOL derivatives, looking at how you did something and deciding to do it totally differently.
And so we owe this crew our thanks. They helped to legitimize a new discipline, a new career: computer programmer. They inspired. They coded. And in so doing, they helped bring us into the world of functional programming and set structures that allowed the next generation of great thinkers to go even further, directly influencing people like Adele Goldberg and Alan Kay.
And it’s okay that the name of this massive contribution is mostly lost to the annals of history. Because ultimately, the impact is not. So think about this - what can we do to help shape the world we live in? Whether it be through raw creation, iteration, standardization, or formalization - we all have a role to play in this world. I look forward to hearing more about yours as it evolves!
|
|||
4768
|
dbpedia
|
0
| 16
|
http://groups.umd.umich.edu/cis/course.des/cis400/algol/algol.html
|
en
|
The ALGOL Programming Language
|
[
"http://groups.umd.umich.edu/cis/course.des/cis400/algol/algol.jpg",
"http://groups.umd.umich.edu/cis/course.des/cis400/prev_btn.gif",
"http://groups.umd.umich.edu/cis/course.des/cis400/home_btn.gif",
"http://groups.umd.umich.edu/cis/course.des/cis400/next_btn.gif"
] |
[] |
[] |
[
""
] | null |
[] | null | null |
Significant Language Features
ALGOL was the first second-generation programming language, and its characteristics are typical of the entire generation. First consider the data structures, which are very close to first-generation structures. ALGOL 60 introduced the block structure: the ability to create blocks of statements for the scope of variables and the extent of influence of control statements. Along with that came two different means of passing parameters to subprograms: call by value and call by name. Structured control statements (if-then-else and the use of a general condition for iteration control) were also features, as was the concept of recursion: the ability of a procedure to call itself.
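The call-by-name mechanism described above can be imitated with explicit thunks, in the spirit of the classic Jensen's device. A minimal Python sketch (the function and variable names are illustrative, not from any ALGOL source):

```python
def sum_by_name(i, lo, hi, expr):
    # Simulates ALGOL 60 call by name: 'expr' is a thunk that is
    # re-evaluated on every iteration with the current value of i.
    total = 0
    for v in range(lo, hi + 1):
        i[0] = v          # update the "by-name" variable
        total += expr()   # re-evaluate the expression each time
    return total

i = [0]  # a mutable cell standing in for a variable passed by name
squares = sum_by_name(i, 1, 5, lambda: i[0] * i[0])
print(squares)  # 1 + 4 + 9 + 16 + 25 = 55
```

Because the expression is re-evaluated each time rather than copied once, the same call site can compute a sum of squares, cubes, or any expression over the loop variable, which is exactly what made call by name powerful (and expensive) in ALGOL 60.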
One of the greatest impacts ALGOL 60 had was a result of its description as found in Naur (1963). A major contribution of this report was the introduction of BNF notation for defining the syntax of the language. Overall, ALGOL is considered to be perhaps the most orthogonal programming language, meaning it has a relatively small number of basic constructs and a set of rules for combining those constructs. Every construct has a type associated with it and there are no restrictions on those types. In addition, most constructs produce values. Several of ALGOL's other characteristics are listed below:
|
||||||||
4768
|
dbpedia
|
2
| 5
|
http://groups.umd.umich.edu/cis/course.des/cis400/maxim/lectures/chp2.htm
|
en
|
Chapter 2
|
[] |
[] |
[] |
[
""
] | null |
[] | null | null |
Evolution of the Major Programming Languages
Zuse's Plankalkül
• Plankalkül is the name of a programming language written by a German scientist by the name of Konrad Zuse, between 1936 and 1945
• This language was never implemented
• Its name means program calculus
MINIMAL HARDWARE PROGRAMMING: PSEUDOCODE
• Deficiencies in machine code and assembly language code led to the development of somewhat higher-level languages
• Short Code – developed by John Mauchly in 1949
- Implemented with a pure interpreter
- Simplified the programming process; however, it was approximately 50 times slower than machine code
• Speedcoding – developed by John Backus in 1954
- Extended machine languages to include floating-point operations
- Backus claimed that problems that could take two weeks to program in machine code could be programmed in a few hours using Speedcoding
• The UNIVAC "compiling system" – developed by a team led by Grace Hopper between 1951 and 1953
- It made machine code source programs much shorter; however it was still quite primitive
THE IBM 704 AND FORTRAN
• Historical Background
- The first widely accepted high-level language was FORTRAN
- FORTRAN was developed in large measure for the capabilities of the IBM 704 system
- The IBM 704 system was first introduced in 1954
• Design Process
- Plans for FORTRAN were actually begun before the announcement of the IBM 704 system
- The report titled "The IBM Mathematical FORmula TRANslating System" described the first version of FORTRAN
- It stated that FORTRAN could provide the efficiency of hand-coded programs and the ease of programming the interpretive pseudocode systems
- It also stated that it would eliminate coding errors and the debugging errors
• FORTRAN I Overview
- All of FORTRAN I's control statements were based on 704 instructions
- There are no data-typing statements; instead variables whose names began with I, J, K, L, M, and N are implicitly integer type, and all others were implicitly floating-point type
- FORTRAN's claim that the machine code produced by the compiler was as efficient as what could be produced by hand was nearly achieved
• FORTRAN II Overview
- Introduced in the spring of 1958
- Fixes a lot of the bugs in FORTRAN I and adds some significant features, such as, independent compilation of subroutines
• FORTRAN IV, FORTRAN 77, and FORTRAN 90
- FORTRAN III, although developed, it was never a big success
- FORTRAN IV became one the most widely distributed programming languages until 1978
- FORTRAN IV is a big improvement over FORTRAN II because of additions to it, such as, explicit type declaration for variables, a logical IF construct, and the capability of passing subprograms
- FORTRAN 77 is introduced in 1977, it retains most of the FORTRAN IV features and adds to it, character string handling, logical loop control statements, and an IF with an optional ELSE clause
- FORTRAN 90 is the latest version of FORTRAN is dramatically different than FORTRAN 77
- It supplies a collection of built in functions for array operations
- Arrays can be dynamically allocated and deallocated on command
- New control statements, such as, CASE, EXIT, and CYCLE are added
- Recursion is now possible
FUNCTIONAL PROGRAMMING: LISP
• The first functional programming language was invented to provide language features for list processing, which grew out of the applications in artificial intelligence
• The Beginnings of Artificial Intelligence and List Processing
- Interest in AI appears in the mid-1950s
- This interest comes from the study of linguistics, psychology, and mathematics
- Linguistics because of the interest in the processing of a natural language
- Psychology because of the interest in modeling human information storage and retrieval, along with other brain processes
- Mathematics because of the interest in mechanizing certain intelligent processes
- The concept of list processing was introduced in 1956 by Allen Newell, J.C. Shaw, and Herbert Simon
- A few languages are developed and implemented for list processing but none become really popular until the development of LISP
• LISP Design Process
- Developed by John McCarthy and Marvin Minsky at MIT, because of their dislike for the other languages developed for AI
• Language Overview
- There are only two data structures, atoms and lists
- Atoms are either symbols or numeric literals
- Lists are specified by delimiting their elements with parentheses and may be simple or nested lists
- A list is stored internally as a single-linked list structure
• Processes in Functional Programming
- All computation in a functional program is accomplished by applying functions to arguments
- Iterative processes can be specified with recursive function calls, which means no loops
• The Syntax of LISP
- Program code and data have exactly the same form
• Evaluation
- LISP is still the most widely used language for AI
- In the 1970s and 1980s many different dialects of LISP were developed, which led to the problem of portability
- In order to solve the problem of LISP's portability, COMMON LISP was developed as a standard version
• Two Descendants of LISP
- These two dialects are Scheme and COMMON LISP
- Scheme was developed in 1975
- It is relatively small in size and uses static scoping exclusively
- Scheme functions can be values of expressions and elements of lists; they can be assigned to variables, passed as parameters, and returned as values. These features were not available in the early versions of LISP
- COMMON LISP was developed in 1984
- It takes the features of a few dialects and Scheme and combines them into one language; while Scheme is a relatively small language, COMMON LISP is a relatively large language and uses dynamic as well as static scoping
• Related Languages
- Three related functional programming languages are ML, Miranda, and Haskell
- MetaLanguage, or ML, was designed in the 1980s by Robin Milner
- ML is a functional language, which supports imperative programming
- Miranda was designed by David Turner in the 1980s as well; it is based on ML, SASL, and KRC
- Haskell is based on Miranda and was designed by Hudak and Fasel in 1992
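The list model described in these notes (atoms, singly linked cons cells, and iteration expressed as recursion) can be illustrated in a few lines of Python; the helper names here are hypothetical, not LISP primitives:

```python
def cons(head, tail):
    # A LISP-style cons cell: a pair linking an element to the rest of a list.
    return (head, tail)

NIL = None  # the empty list

def length(lst):
    # Iteration via recursion, in the functional style LISP popularized:
    # the length of a list is 1 plus the length of its tail.
    return 0 if lst is NIL else 1 + length(lst[1])

lst = cons(1, cons(2, cons(3, NIL)))  # the list (1 2 3)
print(length(lst))  # 3
```

Each cell holds one atom and a link to the next cell, which is why LISP lists are stored internally as singly linked structures and why recursive functions map onto them so naturally.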
THE FIRST STEP TOWARD SOPHISTICATION: ALGOL 60
• Historical Background
- Efforts to design a universal language inspired the development of the programming language ALGOL 60
- In the late 1950s, all of the languages that had been developed thus far were for single architectures only
- Two user groups SHARE and USE showed an interest in forming a committee through the ACM organization, in order to study and recommend the design and implementation of a universal programming language
- Another group in Europe (GAMM) also was interested and in 1958 the ACM committee along with GAMM officially agreed to a joint language design project
- FORTRAN was not considered because of its sole ownership by IBM.
• Early Design Process
- The syntax of the language should be close to standard mathematical notation
- It should be possible to use the language for the description of computing processes in publications
- And programs in the new language must be mechanically translatable into machine language
• ALGOL 58 Overview
- The first language designed in this project was ALGOL 58
- The name ALGOL is short for ALGOrithmic Language; the language was originally called IAL, the International Algebraic Language
- Many of the features in ALGOL 58 were derived from FORTRAN
- It generalized FORTRAN's features and added several new constructs and concepts
• ALGOL 60 Design Process
- ALGOL 58 was furiously debated, resulting in modifications and additions to the language and the development of ALGOL 60
• ALGOL 60 Overview
- ALGOL 60 is result of the modifications made to ALGOL 58
- Introduces the concept of block structure, which allows parts of programs to localized
- Passing parameters by means of pass by value and pass by name are introduced
- Procedures are allowed to be recursive
- Stack-dynamic arrays are now allowed
• ALGOL 60 Evaluation
- ALGOL 60 succeeded in becoming the only acceptable formal means of communicating algorithms
- All imperative languages formed after 1960 are either direct or indirect descendants to ALGOL 60
- It was the first language whose syntax was formally described
- Unfortunately, ALGOL 60 never achieved widespread use, because it was too flexible and it lacked input and output statements
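The formal syntax description mentioned above was given in BNF. As a toy illustration, here is a small invented BNF fragment (not taken from the ALGOL 60 report) together with a recursive-descent recognizer for it:

```python
# Toy grammar, written in BNF notation:
#   <expr> ::= <term> | <term> "+" <expr>
#   <term> ::= "a" | "(" <expr> ")"

def parse_expr(s, i=0):
    # Each nonterminal becomes one function; returns the index after the match.
    i = parse_term(s, i)
    if i < len(s) and s[i] == "+":
        i = parse_expr(s, i + 1)
    return i

def parse_term(s, i):
    if i < len(s) and s[i] == "a":
        return i + 1
    if i < len(s) and s[i] == "(":
        i = parse_expr(s, i + 1)
        if i < len(s) and s[i] == ")":
            return i + 1
    raise SyntaxError("bad term at position %d" % i)

def recognizes(s):
    # A string belongs to the language iff parsing consumes it entirely.
    try:
        return parse_expr(s) == len(s)
    except SyntaxError:
        return False

print(recognizes("a+(a+a)"))  # True
print(recognizes("a+"))       # False
```

The one-function-per-nonterminal structure mirrors the grammar directly, which is why a formal BNF description makes a language mechanically translatable, one of the stated ALGOL design goals.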
COMPUTERIZING BUSINESS RECORDS: COBOL
• Although COBOL has been used more than any other programming language, it has had little effect on the design of any other languages, with the exception of PL/I.
• Historical Background
- Similar to ALGOL, COBOL was designed by a committee
- Three other languages for business applications that existed before COBOL were FLOW-MATIC, AIMACO, and COMTRAN
• COBOL Design Process
- The biggest concern regarding this new application language was that it be easy to use, even at the expense of being less powerful
- The language specifications for COBOL were published in 1960
• Evaluation
- COBOL originated a number of concepts, such as, constructs for macros, implementation of hierarchical data structures, allowed connotative names
- It was the first language whose use was mandated by the Department of Defense
- The poor performance of the early compilers made COBOL expensive to use, fortunately with the advent of better compiler designs and the mandate of the Defense Department COBOL became very popular
THE BEGINNINGS OF TIME SHARING: BASIC
• Similar to COBOL, BASIC has had widespread use; however, it has gotten little respect
• Design Process
- BASIC was designed at Dartmouth College in the 1960s by John Kemeny and Thomas Kurtz
- The goals of the BASIC system were to make it easy for non-science students to learn, be pleasant and friendly, provide fast turnaround for homework, allow free and private access, and consider user time more important than computer time
- The combination of the second, third, and fourth goals mentioned above led to the timeshared aspect of BASIC
• Language Overview
- The original version was very small having only fourteen different statements, and was not interactive, for there was no way to get user input from the terminal
- Overall, it was very limited making it very easy to learn
• Evaluation
- BASIC was the first widely used method of remote access to a computer
- Most of its design came from FORTRAN with some minor influence by ALGOL 60
- Reasons for BASIC�s success are probably it�s ease for which it can be learned and implemented
- Two contemporary versions of BASIC are QuickBasic and Visual BASIC
EVERYTHING FOR EVERYBODY: PL/I
• PL/I represents the first large-scale attempt to design a language that could be used for a broad spectrum of application areas
• Historical Background
- Developed as an IBM product
- Designed to replace FORTRAN, COBOL, LISP and the systems applications of assembly language
• Design Process
- The first version of PL/I was originally named FORTRAN VI because the initial design was just an extension of FORTRAN IV
- The design along with the name was quickly dropped and the new design was called NPL for new programming language
- The name change to PL/I resulted in the avoidance of confusing the name NPL with the National Physical Laboratory in England, since this was where PL/I was developed
• Language Overview
- PL/I included the best of ALGOL 60 (recursion and block structure), FORTRAN IV (separate compilation with communication through global data), and COBOL (data structures, input/output, and report generating facilities), along with a few new constructs
- PL/I was the first language to have programs allowed to create concurrently executing tasks, the possibility to detect and handle 23 different types of exceptions, procedures allowed to be use recursively, pointers included as a data type, and reference to the cross sections of arrays
• Evaluation
- The problem with PL/I is its complexity due to its large size
- It also suffered from what are now considered to be poorly designed constructs
- However, it did get significant use in both business and scientific applications
TWO EARLY DYNAMIC LANGUAGES: APL AND SNOBOL
• Neither of these languages is based on any previous language and neither has had much effect on any later languages
• The only similarity between APL and SNOBOL is their dynamic typing and dynamic storage allocation
• Origins and Characteristics of APL
- Kenneth E. Iverson designed APL in 1960
- It was intended to be a vehicle for describing computer architecture not as a programming language
- It has many powerful operators in order to allow arrays to be manipulated as if they were scalar variables; however, this creates a problem for implementers of the language
- Although it is not widely used, it is still in use today.
• Origins and Characteristics of SNOBOL
- Designed by D.J. Farber, R.E. Griswold, and F.P. Polensky in the early 1960�s
- It was designed specifically for text processing.
- However, it is rarely used nowadays because it is slower than some other languages
THE BEGINNINGS OF DATA ABSTRACTION: SIMULA 67
• Design Process
- Kristen Nygaard and Ole-Johan Dahl developed Simula I between 1962 and 1964
- It was designed exclusively for system simulation
• Language Overview
- SIMULA 67 is an extension of ALGOL 60
- Allowed for support of coroutines using the class construct, which is important because the idea of data abstraction starts here
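Simula's coroutines can be loosely imitated with Python generators, which likewise suspend and resume control cooperatively; this is an analogy sketch with illustrative names, not Simula semantics:

```python
def producer():
    # A coroutine-like unit: suspends after yielding each item,
    # resuming from the same point the next time it is asked.
    for item in ["a", "b", "c"]:
        yield item

def consumer(source):
    # Resumes the producer each time it needs another value.
    return [item.upper() for item in source]

print(consumer(producer()))  # ['A', 'B', 'C']
```

Control passes back and forth between the two routines instead of one calling the other to completion, which is the essence of the coroutine idea Simula introduced via its class construct.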
ORTHOGONAL DESIGN: ALGOL 68
• Design Process
- Introduces the design criteria of orthogonality, which results in features, such as, the inclusion of user-defined data types and implicit heap-dynamic arrays
• Evaluation
- Introduces a significant number of features not previously used
- Although it may be contrasted with PL/I, PL/I gained a far greater acceptance than ALGOL 68
SOME IMPORTANT DESCENDANTS OF THE ALGOLS
• All imperative languages, including the object-oriented languages, designed since 1960 owe some of their design to ALGOL 60
• Pascal
- Designed by Niklaus Wirth
- Highly used for teaching programming; because of this it lacks some features, such as, separate compilation and passing an array of variable length
• C
- Originally designed for systems programming by Dennis Ritchie in 1972
- Because of its adequate control structures and data structuring facilities C is adaptive to many application areas
- A reason for both like and dislike of C is its lack of complete type checking
- C tends to be a very flexible language
• Modula-2
- Designed by Niklaus Wirth in 1976 (also designed Pascal)
- Distinguishing features include modules, which provide support for abstract data types, procedures as types, and low-level facilities for systems programming and coroutines
• Modula-3
- Based on Modula-2 and Modula-2+
- To Modula-2 it adds classes and objects for support of object-oriented programming, exception handling, garbage collection, and support for concurrency
• Oberon
- Loosely based on Modula-2 was written by Niklaus Wirth
- Although many features were added to Modula-2 to come up with Oberon many were subtracted also.
- The main feature added is type extension, which supports object oriented programming
- Among those removed are variant records, opaque types, enumeration types, sub-range types, the CARDINAL type, non-integer array indexes, the with statement, and the for statement
• Delphi
- A hybrid language, similar to C++ in that it was created by adding object-oriented support, along with other things, to an existing imperative language
- It is derived from Pascal, therefore making it much more elegant and safe than C++
PROGRAMMING BASED ON LOGIC: PROLOG
• The use of a formal logic notation to communicate computational processes to a computer
• Design Process
- Alain Colmerauer, Phillip Roussel, and Robert Kowalski developed the design of Prolog
- Prolog uses a method for specifying predicate calculus propositions and a restricted form of resolution
• Language Overview
- Prolog's programs consist of a collection of statements or facts
- The most common use of Prolog is as a kind of database
• Evaluation
- There are two major reasons why logic programming has not become widespread
- The first reason is that it has been proven to be highly inefficient
- The second reason is that it has been shown to be effective for only a few areas of application, such as certain kinds of database management and some areas of AI
HISTORY�S LARGEST DESIGN EFFORT: ADA
• Ada was originally developed for the Department of Defense
• Historical Background
- The Army, Navy, and Air Force all proposed the development of a high-level language for embedded systems in 1974 as an attempt to standardize their own embedded systems
• Design Process
- The committee assigned to this task was responsible for identifying requirements for a new Department of Defense high-level language, evaluating the existing languages to determine whether there was a viable candidate, and recommend adoption or implementation of a minimal set of programming languages
- The resulting language went through multiple phases and was named Ada
• Language Overview
- There are four major features of the Ada language
- Packages provide the means for encapsulation of data objects
- It includes extensive facilities for exception handling
- Allows program units to be generic
- It provides for concurrent execution of special program units and named tasks
• Evaluation
- The Ada language contains most of the concepts of software engineering and language design
- The development of a compiler was a difficult task and almost four years after the design of the Ada language was completed, did the Ada compiler appear
OBJECT-ORIENTED PROGRAMMING: SMALLTALK
• Design Process
- The concepts that led to the development of Smalltalk came out of the Ph.D. dissertation work of Alan Kay
• Language Overview
- The program units of Smalltalk are objects
- All computing in Smalltalk is done in the same fashion, passing messages to objects in order to invoke one of the methods associated with that object
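The message-passing model can be made explicit with a tiny dispatcher; a minimal Python sketch (the Counter class and send helper are invented for illustration, not Smalltalk API):

```python
class Counter:
    # An object holding state; all work happens via messages sent to it.
    def __init__(self):
        self.value = 0

    def increment(self):
        self.value += 1
        return self

def send(obj, message, *args):
    # A small dispatcher making the "message send" explicit: look up the
    # method named by the message on the receiver, then invoke it.
    return getattr(obj, message)(*args)

c = Counter()
send(c, "increment")
send(c, "increment")
print(c.value)  # 2
```

The point is uniformity: there is one computational mechanism (send a message, the receiver decides how to respond), which is what the notes mean by all Smalltalk computing being done "in the same fashion".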
COMBINING IMPERATIVE AND OBJECT-ORIENTED FEATURES: C++
• C++ uses C as a base on which to support many of the concepts introduced by Smalltalk
• Design Process
- Modifications to C included the addition of function parameter type checking and conversion, and classes
- Later the additions included inline functions, default parameters, and overloading of assignment operators
- With the above additions to C came the name C with Classes
- Not until the inclusion of virtual functions, was the name changed to C++
• Language Overview
- C++ provides a collection of predefined classes, along with the possibility of user-defined classes
- Operators in C++ may be overloaded, meaning that the user may redefine existing operators for new types
- Dynamic Binding is provided by virtual class functions
- Both functions and classes can be templated, which means that they can be parameterized
- It also includes exception handling
• Evaluation
- C++ has become extremely popular
- On the downside, C++ is extremely large and suffers some drawbacks similar to PL/I
• A Related Language: Eiffel
- Like C++ Eiffel combines both imperative and object-oriented features
PROGRAMMING THE WORLD WIDE WEB: JAVA
• Based on C++; however, numerous constructs have been removed, some changed, and a few others added
• The resulting language provides the power and flexibility of C++; however, it is smaller, simpler, and safer
• Design Process
- Java was designed for an application for which there appeared to be no other satisfactory language
- This started out as a programming language for embedded consumer electronics
- Later it proved to be a useful tool in programming for the web
• Language Overview
- Java employs both types and classes
- It does not have pointers, although its reference types provide some of the capabilities of pointers
- Uses a primitive Boolean type mainly for the control expressions of its control statements
- There are no record, union, or enumeration types
- Supports only single inheritance
- Provides garbage collection
• Evaluation
- Designers of Java have eliminated many of the features that are considered unsafe in C++
- Java is still considered a complex language and its lack of multiple inheritance is said to lead to some peculiar designs
- The use of Java has increased much faster than the use of any other language
|
||||||||
4768
|
dbpedia
|
3
| 63
|
https://unacademy.com/content/bank-exam/study-material/computer-knowledge/algol/
|
en
|
Notes on ALGOL
|
[
"https://unacademy.com/content/wp-content/uploads/sites/2/2021/06/mobile-logo.svg",
"https://unacademy.com/content/wp-content/uploads/sites/2/2021/06/logo-1.svg",
"https://static.uacdn.net/wp-content/uploads/sites/2/2023/03/22165846/avatar.svg",
"https://static.uacdn.net/wp-content/uploads/sites/2/2023/03/22165845/user.svg",
"https://static.uacdn.net/wp-content/uploads/sites/2/2023/03/22165843/settings.svg",
"https://static.uacdn.net/wp-content/uploads/sites/2/2023/03/22165841/holidays.svg",
"https://static.uacdn.net/wp-content/uploads/sites/2/2023/03/22165839/logout.svg",
"https://static.uacdn.net/wp-content/uploads/sites/2/2023/03/22165837/u_facebook-f.svg",
"https://static.uacdn.net/wp-content/uploads/sites/2/2023/03/22165835/u_youtube.svg",
"https://static.uacdn.net/wp-content/uploads/sites/2/2023/03/22165834/u_twitter.svg",
"https://static.uacdn.net/wp-content/uploads/sites/2/2023/03/22165832/u_instagram.svg",
"https://static.uacdn.net/wp-content/uploads/sites/2/2023/03/22165830/u_linkedin.svg",
"https://unacademy.com/content/wp-content/uploads/sites/2/2023/01/exam-updates.svg",
"https://unacademy.com/content/wp-content/uploads/sites/2/2023/01/study-material.svg",
"https://unacademy.com/content/wp-content/uploads/sites/2/2023/01/tests-and-practices.svg",
"https://unacademy.com/content/wp-content/uploads/sites/2/2023/01/rankers-guide.svg",
"https://unacademy.com/content/wp-content/uploads/sites/2/2021/08/faq.svg",
"https://static.uacdn.net/production/_next/static/images/un_white_logo.svg?q=75&w=256 1x, https://static.uacdn.net/production/_next/static/images/un_white_logo.svg?q=75&w=384 2x",
"https://static.uacdn.net/production/_next/static/images/footer/learner.svg?q=75&w=48",
"https://static.uacdn.net/production/_next/static/images/footer/educator.svg?q=75&w=48",
"https://static.uacdn.net/production/_next/static/images/footer/parent.svg?q=75&w=48",
"https://unacademy.com/content/wp-content/uploads/sites/2/2021/06/mobile-logo.svg"
] |
[] |
[] |
[
""
] | null |
[] |
2022-05-19T13:19:40+00:00
|
Complete summary of ALGOL. Features and facts of ALGOL. Brief info on ALGOL 58, ALGOL 60 and ALGOL 68; their implementations.
|
en
|
Unacademy
|
https://unacademy.com/content/bank-exam/study-material/computer-knowledge/algol/
|
ALGOL is a computer programming language that was created in 1958–60, for publishing algorithms and performing calculations, by an international committee of the Association for Computing Machinery (ACM) chaired by Alan J. Perlis of Carnegie Mellon University. ALGOL, like LISP, included recursive subprograms: procedures that could be used to solve a problem by reducing it to a smaller problem of the same type. ALGOL stands for algorithmic language and was previously known as IAL. It is a family of portable programming languages for scientific computations that influenced many other languages. For nearly 30 years, the ACM used ALGOL as the standard way of presenting algorithms in textbooks and academic work.
Features of ALGOL:
BNF Notation – Backus-Naur Form is a formal means for describing the syntax of a computer language. Think of it as short-hand for computer languages.
Statement Blocks – This is a syntactic means for grouping a series of programming statements together. Think of this like putting a group of related items in a box, and treating the box as a unit.
Parameter Passing – Specifically, there are two methods for passing information to and from a subprogram, pass by value, and pass by reference. Think of these as a way to protect information entering a subprogram from modification, one sends a copy, the other sends the actual data.
Dynamic Arrays – These are a list of values where the size of the list can be determined at run-time. Think of these as flexible arrays, arrays that adjust to the need.
Reserved Words – These are predefined words that have special meaning to a computer language. Think of these like your name. It represents you and the things you can do.
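The BNF idea listed above can be made concrete with a toy example. The grammar and recognizer below are illustrative inventions, not taken from the ALGOL report; they show how a BNF production maps directly onto a recursive procedure:

```python
# A toy grammar in BNF (invented for illustration, not from the ALGOL report):
#   <number> ::= <digit> | <number> <digit>
#   <digit>  ::= "0" | "1" | ... | "9"

def is_digit(s):
    # matches the <digit> production: a single character 0-9
    return len(s) == 1 and s in "0123456789"

def is_number(s):
    # first alternative: <number> ::= <digit>
    if is_digit(s):
        return True
    # second alternative: <number> ::= <number> <digit>
    return len(s) > 1 and is_number(s[:-1]) and is_digit(s[-1])
```

Each alternative in the production becomes one branch of the function, which is essentially how recursive-descent parsers are still written today.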
Major ALGOL specifications:
ALGOL 58
ALGOL 58, formerly known as IAL, is a member of the ALGOL programming language family. An early compromise design, it was quickly superseded by ALGOL 60.
The core concept of the compound statement was introduced in ALGOL 58, but it was limited to control flow only, and it was not bound to identifier scope like Algol 60’s blocks were.
According to Perlis, the language was first offered as IAL (International Algebraic Language), but this was rejected as a “‘unspeakable’ and pretentious acronym.”
IBM attempted to implement ALGOL 58, but development was soon abandoned owing to competition with FORTRAN.
BALGOL, a version for the Burroughs 220, evolved along similar lines, but preserved much of ALGOL 58’s original personality.
The major contribution of ALGOL 58 was to later languages; it served as the foundation for JOVIAL, MAD, NELIAC, and ALGO.
ALGOL 58’s influence on ALGOL 60:
Both IAL and ALGOL 60 support arrays with arbitrary lower and higher subscript boundaries, as well as integer expressions to create subscript bounds.
Procedure declarations and their related identifier scopes can be nested in both IAL and ALGOL 60.
IAL allows numeric statement labels, which ALGOL 60 retained.
A switch designator exists in both IAL and ALGOL 60, although it is unrelated to the switch statement in C and other languages.
The IAL if-statement has neither a then-clause nor an else-clause; instead, it guards the succeeding statement. IAL also provides an if either-statement that cleanly allows testing several conditions.
Both were superseded by ALGOL 60's if-then construct, which introduced the "dangling-else" ambiguity.
ALGOL 60
ALGOL 60 (short for Algorithmic Language 1960) is a computer programming language in the ALGOL family. It followed ALGOL 58, which had introduced code blocks and the begin and end pairs for delimiting them, a significant step in the development of structured programming. ALGOL 60 was the first language to provide nested function declarations with lexical scope. Many later programming languages descend from it, including CPL, Simula, BCPL, B, Pascal, and C. ALGOL 60 was implemented on almost all computers of its time.
For parameter passing, ALGOL 60 provides two evaluation strategies: call-by-value and call-by-name. The procedure declaration specified which was to be used for each formal parameter: the keyword value was supplied for call-by-value and omitted for call-by-name. Call-by-name differs from call-by-reference in that the argument expression is re-evaluated at every use of the parameter.
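Call-by-name can be approximated in modern languages by passing zero-argument functions ("thunks") instead of values. The sketch below is a hypothetical Python rendering of Jensen's device, the classic call-by-name idiom; all names (sum_by_name, env, term) are invented for illustration:

```python
# Simulating ALGOL 60 call-by-name with thunks: each use of a
# parameter re-evaluates the caller's argument expression.
def sum_by_name(i_get, i_set, lo, hi, term):
    total = 0.0
    i_set(lo)
    while i_get() <= hi:
        total += term()          # argument expression re-evaluated here
        i_set(i_get() + 1)
    return total

env = {"i": 0}                   # stand-in for the caller's variable i
# Sums i*i for i = 1..3, i.e. 1 + 4 + 9 = 14
result = sum_by_name(lambda: env["i"],
                     lambda v: env.update(i=v),
                     1, 3,
                     lambda: env["i"] * env["i"])
```

Because term() re-reads env["i"] on every iteration, the expression i*i is re-evaluated each time through the loop, which is exactly the behaviour call-by-value would not give.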
ALGOL 68 implementations:
In July 1970 the Royal Radar Establishment in the United Kingdom presented ALGOL 68-R, the first implementation of the standard, based on the late-1968 draft Report. It implemented, however, only a subset of the language.
Running on ICL computers, this version nevertheless became a widely used language, particularly in military code in the United Kingdom.
CDC Netherlands released the first comprehensive implementation of the language in 1974 for the Control Data mainframe series.
There are three open-source Algol 68 implementations:
a68g (Algol 68 Genie), written by Marcel van der Veer, licensed under the GPLv3
algol68toc, an open-source port of ALGOL 68RS.
Jose E. Marchesi wrote an experimental Algol68 frontend for GCC.
Conclusion:
ALGOL was the first programming language to provide nested function declarations with lexical scope. It was also the first language to receive close attention to formal definition: through the ALGOL 60 Report it introduced Backus–Naur Form, a principal formal grammar notation for language design.
|
|||||
4768
|
dbpedia
|
3
| 0
|
https://en.wikipedia.org/wiki/ALGOL_58
|
en
|
Wikipedia
|
https://en.wikipedia.org/static/favicon/wikipedia.ico
|
https://en.wikipedia.org/static/favicon/wikipedia.ico
|
[
"https://en.wikipedia.org/static/images/icons/wikipedia.png",
"https://en.wikipedia.org/static/images/mobile/copyright/wikipedia-wordmark-en.svg",
"https://en.wikipedia.org/static/images/mobile/copyright/wikipedia-tagline-en.svg",
"https://upload.wikimedia.org/wikipedia/en/thumb/9/96/Symbol_category_class.svg/16px-Symbol_category_class.svg.png",
"https://upload.wikimedia.org/wikipedia/en/thumb/9/96/Symbol_category_class.svg/16px-Symbol_category_class.svg.png",
"https://login.wikimedia.org/wiki/Special:CentralAutoLogin/start?type=1x1",
"https://en.wikipedia.org/static/images/footer/wikimedia-button.svg",
"https://en.wikipedia.org/static/images/footer/poweredby_mediawiki.svg"
] |
[] |
[] |
[
""
] | null |
[
"Contributors to Wikimedia projects"
] |
2004-08-30T19:27:07+00:00
|
en
|
/static/apple-touch/wikipedia.png
|
https://en.wikipedia.org/wiki/ALGOL_58
|
ALGOL 58
Paradigm: procedural, imperative, structured
Family: ALGOL
Designed by: Friedrich L. Bauer, Hermann Bottenbruch, Heinz Rutishauser, Klaus Samelson, John Backus, Charles Katz, Alan Perlis, Joseph Henry Wegstein
First appeared: 1958
Typing discipline: static, strong
Scope: lexical
Influenced by: FORTRAN, IT, Plankalkül,[1] Superplan, Sequentielle Formelübersetzung
Influenced: most subsequent imperative languages (Algol-like)
ALGOL 58, originally named IAL, is one of the family of ALGOL computer programming languages. It was an early compromise design soon superseded by ALGOL 60. According to John Backus:
The Zurich ACM-GAMM Conference had two principal motives in proposing the IAL: (a) To provide a means of communicating numerical methods and other procedures between people, and (b) To provide a means of realizing a stated process on a variety of machines...[2]
ALGOL 58 introduced the fundamental notion of the compound statement, but it was restricted to control flow only, and it was not tied to identifier scope in the way that Algol 60's blocks were.
Name
Bauer attributes the name to Hermann Bottenbruch, who coined the term algorithmic language (algorithmische Sprache) in 1957, "at least in Germany".[3][4]
History
There were proposals for a universal language by the Association for Computing Machinery (ACM) and also by the German Gesellschaft für Angewandte Mathematik und Mechanik ("Society of Applied Mathematics and Mechanics") (GAMM). It was decided to organize a joint meeting to combine them. The meeting took place from May 27 to June 2, 1958, at ETH Zurich and was attended by the following people:
Friedrich L. Bauer, Hermann Bottenbruch, Heinz Rutishauser, and Klaus Samelson (from the GAMM)
John Backus, Charles Katz, Alan Perlis, and Joseph Henry Wegstein (from the ACM).
The language was originally proposed to be called IAL (International Algebraic Language) but according to Perlis,[5] this was rejected as an "'unspeakable' and pompous acronym". ALGOL was suggested instead, though not officially adopted until a year later. The publication following the meeting still used the name IAL.[6]
By the end of 1958 the ZMMD-group had built a working ALGOL 58 compiler for the Z22 computer. ZMMD was an abbreviation for Zürich (where Rutishauser worked), München (workplace of Bauer and Samelson), Mainz (location of the Z22 computer), Darmstadt (workplace of Bottenbruch).
ALGOL 58 saw some implementation effort at IBM, but the effort was in competition with FORTRAN, and soon abandoned. It was also implemented at Dartmouth College on an LGP-30, but that implementation soon evolved into ALGOL 60. An implementation for the Burroughs 220 called BALGOL evolved along its own lines as well, but retained much of ALGOL 58's original character.[7]
ALGOL 58's primary contribution was to later languages; it was used as a basis for JOVIAL,[4] MAD, NELIAC[4] and ALGO. It was also used during 1959 to publish algorithms in CACM, beginning a trend of using ALGOL notation in publication that continued for many years.
Time line of implementations of ALGOL 58 variants
Name | Year | Author | State | Description | Target CPU
ZMMD-implementation | 1958 | Friedrich L. Bauer, Heinz Rutishauser, Klaus Samelson, Hermann Bottenbruch | Germany | | Z22
NELIAC | 1958 | Naval Electronics Laboratory | USA | | AN/USQ-17
JOVIAL | 1960 | Jules Schwartz | USA | Was the DOD HOL prior to Ada (programming language) | Various (see article)
BALGOL | 1960 | Joel Merner et al. | USA | | Burroughs Corporation B220
MAD | 1960 | University of Michigan | USA | | IBM 7090/7094 mainframe, then mid-1960s ported to Univac 1108
Dartmouth ALGOL 30 | 1962 | Thomas Eugene Kurtz et al. | USA | Evolved into ALGOL 60 | LGP-30
SUBALGOL | 1962 | Bob Braden, Lawrence M. Breed and Roger Moore, Stanford University | USA | BALGOL extension | IBM 7090
ALGO | ~ | Bendix Corporation | USA | | Bendix G-15
ALGOL 58's influence on ALGOL 60
IAL introduced the three-level concept of reference, publication and hardware language, and the concept of "word delimiters" having a separate representation from freely chosen identifiers (hence, no reserved words). ALGOL 60 kept this three-level concept.[8]
The distinction between assignment (:= representing a left-facing arrow) and the equality relation = was introduced in IAL and kept in ALGOL 60.
Both IAL and ALGOL 60 allow arrays with arbitrary lower and upper subscript bounds, and allow subscript bounds to be defined by integer expressions.
Both IAL and ALGOL 60 allow nesting of procedure declarations and the corresponding identifier scopes.
The IAL report described parameter substitution in much the same terms as the ALGOL 60 report, leaving open the possibility of call by name. It is unclear if this was realized at the time.
IAL allows numeric statement labels, which ALGOL 60 kept.
The possibility of including non-ALGOL code within a program was already hinted at, in the context of parameters to procedures.
Both IAL and ALGOL 60 have a switch designator, unrelated, however, to the switch statement in C and other languages.
In-line functions of the form f(x) := x / 2; were proposed in IAL but dropped in ALGOL 60.
IAL procedure declarations provide separate declaration lists for input and output parameters, a procedure can return multiple values; this mechanism was replaced in ALGOL 60 with the value declaration.
Variable declarations in IAL can be placed anywhere in the program and not necessarily at the beginning of a procedure. In contrast, the declarations within an ALGOL 60 block should occur before all execution statements.
The for-statement has the form for i:=base(increment)limit, directly resembling the loop of Rutishauser's programming language Superplan, replacing = with := and replacing its German keyword Für with the direct English translation for; ALGOL 60 replaced the parentheses with the word delimiters step and until, so that the previous statement would instead be i:=base step increment until limit.
The IAL if-statement does not have a then-clause or else-clause; it rather guards the succeeding statement. IAL provides an if either-statement that cleanly allows testing of multiple conditions. Both were replaced by ALGOL's if-then construct, with the introduction of the "dangling-else" ambiguity.
IAL provides macro-substitution with the do-statement; this was dropped in ALGOL 60.
IAL allows one or more array subscripts to be omitted when passing arrays to procedures, and to provide any or all arguments to a procedure passed to another procedure.
IAL's infix boolean operators are all of the same precedence level. Exponents are indicated with paired up and down arrows, which removed any confusion about the correct interpretation of nested exponents; ALGOL 60 replaced the paired arrows with a single up-arrow whose function is equivalent to FORTRAN's **.
The IAL report does not explicitly specify which standard functions were to be provided, making a vague reference to the "standard functions of analysis." The ALGOL 60 report has a more explicit list of standard functions.
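The step/until loop described in the list above can be sketched in Python. The termination test follows the usual reading of the ALGOL 60 report, continuing while (i - limit) × sign(increment) ≤ 0 so that one form handles both ascending and descending steps; the function name is invented for illustration:

```python
# A hedged Python rendering of the ALGOL 60 loop
#   for i := base step increment until limit do <statement>
def algol_for(base, increment, limit):
    sign = 1 if increment > 0 else -1   # direction of travel
    i = base
    while (i - limit) * sign <= 0:      # ALGOL 60 step-until test
        yield i
        i += increment
```

For example, list(algol_for(1, 2, 7)) yields 1, 3, 5, 7, and a negative increment counts downward, mirroring the report's semantics for both directions.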
References
|
||||
4768
|
dbpedia
|
0
| 20
|
https://historyofalgol.wordpress.com/
|
en
|
The History of ALGOL
|
[
"https://s2.wp.com/i/logo/wpcom-gray-white.png",
"https://pixel.wp.com/b.gif?v=noscript"
] |
[] |
[] |
[
""
] | null |
[] | null |
When was the language developed (originally, and any major revisions)? To begin with, ALGOL was created in 1959 by a joint meeting in Zűrich,Switzerland by GAMM (Society of Applied Mathematics and Mechanics [German group]) and ACM (Association of Computing Machinery). The conference lasted 6 days and ended with the creation of ALGOL 58. Originally ALGOL…
|
en
|
https://s1.wp.com/i/favicon.ico
|
The History of ALGOL
|
https://historyofalgol.wordpress.com/
|
When was the language developed (originally, and any major revisions)?
To begin with, ALGOL was created in 1958 at a joint meeting in Zürich, Switzerland, of GAMM (the German Society of Applied Mathematics and Mechanics) and the ACM (Association for Computing Machinery). The conference lasted six days and ended with the creation of ALGOL 58. ALGOL 58 was originally meant to be a universal algorithmic programming language, but disputes over implementation and other areas led to another conference in 1960, this time in Paris, France, which resulted in ALGOL 60. Since disputes about the language persisted, another branch of ALGOL was created: ALGOL 68. But owing to the difficulty of understanding its two-level grammar (hyper rules and meta rules, e.g. 'ALPHA:: beta.'), which was meant to be used like templates enabling many different kinds of production, it never caught on.
Original: ALGOL 58
Revisions ALGOL 60, and 68
What individuals or organizations were responsible for its development?
Organizations Responsible:
GAMM (Society of Applied Mathematics and Mechanics [German group])(ALGOL 58)
ACM (Association of Computing Machinery)(ALGOL 58)
Notable Figure:
John W. Backus, the leader of the group that developed the language FORTRAN (specifically involved in ALGOL 60)
Which languages did it influence and/or was influenced by?
The influence of ALGOL is far-reaching: recursive procedures, for example, are still used by Java and C, which according to www.tiobe.com have been the top two programming languages of the last 10 years. ALGOL can also be noted as a precursor to the Pascal language. As far as influences go, ALGOL was originally developed as a reaction to the FORTRAN language, which is why FORTRAN had the greatest impact on ALGOL's development.
Influence: Java, C, PASCAL
Influenced By: FORTRAN
For what paradigm or domain was it developed and/or used?
To put it bluntly and straightforwardly, ALGOL was used for mathematical programming; hence the name ALGOL (ALGOrithmic Language).
ALGOL 60 was an implementation language, one of the few around at the time of its creation. Compilers were then a common source of trouble, so being an implementation language made ALGOL 60 a stronger choice in the computing community (Wegner, 12).
Where was it used? Great question, thank you for asking! ALGOL started to generate some American excitement around SHARE and IBM, but never became the accepted language.
According to History of Programming Languages, Edited by Richard L. Wexelblat,
“Perhaps it is just as well that IBM and its users accepted ALGOL 58 as a model for language development but did not settle upon it as a computing standard . . . too much would have to have been fixed too soon.”
What are its most important features?
Some of the most notable new abstractions in ALGOL 60 were the block statement, parameter passing to procedures (Daintith, para 2) and nested if-then-else statements (Wegner, 7). These abstractions have remained in use ever since. Unlike FORTRAN at the time, ALGOL 60 could handle dynamic arrays and recursive procedures (Wegner, 7), both introduced for the first time.
Important Features: The introduction of nested if-then-else, recursive procedures, parameter passing to procedures and block statements
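Two of the firsts listed above, recursive procedures and dynamic arrays, can be sketched in a few lines. The sketches below are illustrative Python, not ALGOL, and the names are invented:

```python
# Recursive procedure: ALGOL 60 let a procedure call itself.
def factorial(n):
    return 1 if n <= 1 else n * factorial(n - 1)

# Dynamic array: bounds may be computed at run time, in the spirit
# of ALGOL 60's  array a[lo:hi]  declaration.
def make_array(lo, hi):
    return [0] * (hi - lo + 1)
```

FORTRAN of the era required array sizes to be fixed at compile time and had no recursion, which is why these two features stood out.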
Is it still in use today?
ALGOL can still be written and compiled today (on most computers) using a program called MARST, which translates ALGOL 60 into C code that can then be compiled and executed. ALGOL is not in common working use; in fact, many consider the language obsolete, so it goes unused except by those with an interest in it.
Answer: No, ALGOL is no longer in use
To what can you attribute its success or failure?
|
||||
4768
|
dbpedia
|
0
| 98
|
https://www.encyclopedia.com/science-and-technology/computers-and-electrical-engineering/computers-and-computing/algol
|
en
|
Encyclopedia.com
|
[
"https://www.encyclopedia.com/themes/custom/trustme/images/header-logo.jpg"
] |
[] |
[] |
[
"Get information",
"facts",
"and pictures",
"about Algol",
"at Encyclopedia.com",
"Make",
"research",
"projects",
"and school reports",
"about Algol",
"easy",
"with credible",
"articles",
"from our FREE",
"online encyclopedia and dictionary"
] | null |
[] | null |
*Algol Acronym for algorithmic language.* The generic name for a family of high-level languages of great significance in the development of computing.
|
en
|
/sites/default/files/favicon.ico
|
https://www.encyclopedia.com/science-and-technology/computers-and-electrical-engineering/computers-and-computing/algol
|
Algol
Algol Acronym for algorithmic language. The generic name for a family of high-level languages of great significance in the development of computing. In 1958 the Association for Computing Machinery (ACM) in the US and the Gesellschaft für Angewandte Mathematik und Mechanik (GAMM) in Europe set up a joint committee to define an international algorithmic language (IAL). The language that was designed became known as Algol, and was later called Algol 58 to distinguish it from later versions. Algol 58 was not intended to be a viable language, and in 1960 an augmented committee was convened to devise the second iteration, which was published as the language Algol 60. See also JOVIAL.
Algol 60 was much more popular in Europe than in the US, probably due to the dominance of IBM and Fortran in the North American market. It introduced many new concepts, notably block structure (see block-structured languages), nested scopes, modes of parameter passing to procedures, and the definition of the language introduced the now classic BNF notation for describing syntax. The influence of Algol 60 can be seen in all succeeding languages, and it stands as a milestone in the development of programming languages.
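The block structure and nested scopes mentioned above can be illustrated in Python, whose nested functions follow the same lexical-scoping discipline that ALGOL 60 pioneered; the example is a minimal sketch with invented names:

```python
# Lexical (static) scoping with nested procedures: a name is
# resolved in the textually enclosing block, not via the call chain.
def outer():
    x = 10              # declared in the enclosing block
    def inner():
        return x + 1    # x found lexically in outer's scope
    return inner()
```

Here outer() returns 11 because inner sees the x of its enclosing block, the rule ALGOL 60 established and almost every later language adopted.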
In the years following the publication of the Algol 60 Report, a working group of the International Federation for Information Processing was set up to consider the definition of a successor to Algol 60. There were many dissensions within the group, and eventually a minority report was issued proposing the language Algol 68. The first implementation of Algol 68, named ALGOL 68R, was produced at the Royal Signals and Radar Establishment in the UK. ALGOL 68R demonstrated that Algol 68 was a viable language (not at the time a self-evident proposition).
Although Algol 68 introduced many novel concepts of great theoretical interest and significance, its practical application was almost nil. One of the most notable features of Algol 68 is its formal specification using a two-level grammar. Although a very precise definition, it is very difficult to understand, and this difficulty partly accounts for the low acceptance of the language. One of the most significant effects of the split in the Algol 68 working group is that it led indirectly to the development of Pascal.
JOHN DAINTITH "Algol." A Dictionary of Computing. Encyclopedia.com.
Algol
Algol (or ALGOL) (ˈælgɒl) Computing algorithmic language
|
|||||
4768
|
dbpedia
|
2
| 42
|
https://forums.theregister.com/forum/all/2020/05/15/algol_60_at_60/
|
en
|
ALGOL 60 at 60: The greatest computer language you've never used and grandaddy of the programming family tree • The Register Forums
|
https://forums.theregister.com/design_picker/13249a2e80709c7ff2e57dd3d49801cd534f2094/graphics/favicons/favicon.ico
|
https://forums.theregister.com/design_picker/13249a2e80709c7ff2e57dd3d49801cd534f2094/graphics/favicons/favicon.ico
|
[
"https://forums.theregister.com/design_picker/ae01b183a707a7db8cd5f2c947715ed56d335138/graphics/std/user_icon_white_extents_16x16.png",
"https://forums.theregister.com/design_picker/ae01b183a707a7db8cd5f2c947715ed56d335138/graphics/std/user_icon_white_filled_extents_16x16.png",
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/std/reg_logo_no_strapline.png",
"https://forums.theregister.com/design_picker/ae01b183a707a7db8cd5f2c947715ed56d335138/graphics/std/magnifying_glass_white_extents_16x16.png",
"https://forums.theregister.com/design_picker/ae01b183a707a7db8cd5f2c947715ed56d335138/graphics/icon/burger_menu_white_16x16.png",
"https://forums.theregister.com/design_picker/ae01b183a707a7db8cd5f2c947715ed56d335138/graphics/icon/burger_menu_white_close_16x16.png",
"https://pubads.g.doubleclick.net/gampad/ad?co=1&iu=/6978/reg_software/devops&sz=300x50%7C300x100%7C300x250%7C300x251%7C300x252%7C300x600%7C300x601&tile=2&c=2Zr0U8iU6MzFOUgoNLKRYUAAAAMU&t=ct%3Dns%26unitnum%3D2%26raptor%3Dcondor%26pos%3Dtop%26test%3D0",
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/back_to_page.png",
"https://forums.theregister.com/design_picker/c00f80f04b0eaf0123d821f6c9488fc1cb55fd0a/graphics/icons/unstar.png",
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/feeds.png",
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/comment/anonymous_48.png",
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png",
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png",
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png",
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/comment/thumb_up_48.png",
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png",
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/comment/anonymous_48.png",
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png",
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/comment/coat_48.png",
"https://forums.theregister.com/design_picker/215565f179ecd9dd919cfd297e581ca79d1eed05/graphics/icons/silver_badge.png",
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png",
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png",
"https://forums.theregister.com/design_picker/215565f179ecd9dd919cfd297e581ca79d1eed05/graphics/icons/silver_badge.png",
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png",
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/comment/facepalm_48.png",
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png",
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/comment/thumb_up_48.png",
"https://pubads.g.doubleclick.net/gampad/ad?co=1&iu=/6978/reg_software/devops&sz=300x50%7C300x100%7C300x250%7C300x251%7C300x252%7C300x600%7C300x601&tile=3&c=3Zr0U8iU6MzFOUgoNLKRYUAAAAMU&t=ct%3Dns%26unitnum%3D426raptor%3Dfalcon%26pos%3Dtop%26test%3D0",
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png",
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png",
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png",
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png",
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png",
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/comment/anonymous_48.png",
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png",
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png",
"https://forums.theregister.com/design_picker/215565f179ecd9dd919cfd297e581ca79d1eed05/graphics/icons/silver_badge.png",
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png",
"https://forums.theregister.com/design_picker/215565f179ecd9dd919cfd297e581ca79d1eed05/graphics/icons/silver_badge.png",
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png",
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png",
"https://pubads.g.doubleclick.net/gampad/ad?co=1&iu=/6978/reg_software/devops&sz=300x50%7C300x100%7C300x250%7C300x251%7C300x252%7C300x600%7C300x601&tile=4&c=4Zr0U8iU6MzFOUgoNLKRYUAAAAMU&t=ct%3Dns%26unitnum%3D426raptor%3Deagle%26pos%3Dmid%26test%3D0",
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png",
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png",
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/comment/childcatcher_48.png",
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png",
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png",
"https://forums.theregister.com/design_picker/215565f179ecd9dd919cfd297e581ca79d1eed05/graphics/icons/silver_badge.png",
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png",
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png",
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/comment/anonymous_48.png",
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png",
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/comment/boffin_48.png",
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png",
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png",
"https://forums.theregister.com/design_picker/215565f179ecd9dd919cfd297e581ca79d1eed05/graphics/icons/silver_badge.png",
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png",
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/reply_icon.png",
"https://forums.theregister.com/design_picker/fa16d26efb42e6ba1052f1d387470f643c5aa18d/graphics/icons/comment/pint_48.png",
] |
[
"Anonymous Coward",
"Jason Bloomberg",
"Ron Martin",
"Ken Moorhouse",
"disgruntled yank",
"Primus Secundus Tertius",
"Doctor Syntax",
"shawn.grinter",
"Graham Cobb",
"John Sturdy"
] |
2020-05-15T00:00:00
|
en
|
https://forums.theregister.com/forum/all/2020/05/15/algol_60_at_60/
|
No love for CORAL 66?
Back in the day, in order to get a military application running, I had to take the assembler output of a Coral 66 program and basically rewrite it completely in proper assembler, because the compiler wasn't merely not optimised, it seemed positively designed to waste as many CPU cycles as possible. It used the 9989 microprocessor, and completely ignored the 9989's register windowing system, instead creating boilerplate functions every single time.
Kids today...I'm the one feeding the punched tape repeatedly through the reader to get the floating point libraries to work. Arguments over bracket syntax, tabs versus space and to semicolon or not are really a sign that nowadays there is very little to complain about.
It started my career move...
We had an Elliott 803 at Rugby College (later absorbed into Lanchester Poly). Although I was studying Applied Physics, the computer part of the course fired me up and I went into computing from that point on. I remember the console speaker that, although abused to play music, did give you a sense of what your program was doing. Later on, in my 6800/8008/8080 hobby days, I used a transistor radio for the same thing.
There were two no-no's: The first was to remember how little memory you had to play with so large arrays were not possible and, if attempted, resulted in a subscript overflow message (iirc - SUBSCROFLO). The second was to make sure your plotter programmes completed. The plotter involved an extra paper tape load for the operator. If your plot failed then the whole machine had to be restarted. The plotting code was probably an early form of overlay.
Before we had magnetic tape installed the paper tape reader was something to behold. The output from the reader had to be caught in a basket as the speed was so high. The computer operator was a very smart young lady too.... Happy days. ;-)
I've used Algol-60 at school
I did my AO-Level Computer Studies in 1982 or 1983 (can't remember!) and I did my project in Algol-60, because that's what we ran on our Research Machines 380Zs. It was a program to solve the N-Queens problem, generating different permutations of the numbers 1-N, and I did it using recursion (because I could). Little did I realise at the time that it was really out of date, and there's no reason we couldn't have done it in Pascal or C. But that's all we had.
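Not the original Algol-60, of course, but the approach described above — one queen per row given by a permutation of 1-N, keeping only the permutations where no two queens share a diagonal — can be sketched in a few lines of modern Python (using itertools rather than the explicit recursion of the original):

```python
from itertools import permutations

def n_queens(n):
    """Count N-Queens solutions via the permutation method.

    A permutation of 0..n-1 places exactly one queen per row and
    column; it is a solution when no two queens share a diagonal,
    i.e. the column distance never equals the row distance.
    """
    count = 0
    for cols in permutations(range(n)):
        if all(abs(cols[i] - cols[j]) != j - i
               for i in range(n) for j in range(i + 1, n)):
            count += 1
    return count

print(n_queens(8))  # 92 solutions on the classic 8x8 board
```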
Ugh! PASCAL
We used it in school (UMASS/Amherst) as a teaching language. Too "wordy" for me, especially the IO statements. But it did get the job done.
When I started working, it was at Data General, who had an Algol-like language called DG/L, which I absolutely loved. It was my language of choice for little utility programs on our AOS and AOS/VS systems.
Then, of course, came the Sun workstations, with UNIX and C.
Game over.
Bonus for the CALCOMP drum plotters. I worked part time at the computer center in school; we had one (3 or 4 foot wide). The number of aborted plots due to pen failure was astounding, as was the pen budget!
Algol 68 is not ALGOL 60
Algol 68 is basically a different language from Algol 60—and the Algol 60 designers, most notably Dijkstra, were less than impressed by it, and along with others on the committee issued a "minority report" disowning Algol 68. Syntax aside, and leaving out the object-oriented stuff, Algol 68 actually bears a distinct resemblance to C++. Algol 60, on the other hand, begat Pascal. So really they are different languages sharing part of a name.
I never got to use Algol 68 because it was difficult to write a compiler for and nowhere I worked had one. Burroughs Algol (a variant of ALGOL 60), on the other hand, was available. Time pressure, and perhaps a desire not to be seen to know too much about the "old iron", meant I never did write any Burroughs Algol. That was perhaps my loss. I don't think I missed anything by not writing Algol 68.
BTW Burroughs Algol was implemented in hardware. It was a stack machine with display registers and hardware support for resizing arrays.
ALGOL 60 was the first language I learned, at school in the Computer Club. Turn round was a bit longer than ten minutes. We used to write programs on coding sheets, which were then taken to the nearby university and typed up by data prep ladies, run by the operators on the ICL1909, and the coding sheets, pack of cards and printout was returned to us a week later. That did make you concentrate fairly hard on program correctness; my first program calculated primes and I've still got the output somewhere. After a while a friend and I learned that we could get off the bus on the way home, walk up to the university and punch our own cards on IBM 029 punches, and then watch while the operators ran the program. That made things a lot faster and meant we could write bigger programs - my favourite was a linear regression program for the results of our physics experiments. It made them look much more 'official' :)
And then we discovered the university had a free access PDP-8 so we learned BASIC on DECtape and the joys of typing into an ASR-33 and Friden Flexowriters. And then ...
... the university got a copy of the POP-2 compiler. Still my favourite language ever! Programs to synthesise English using Chomsky's grammars etc etc. Machine Intelligence 1, 2, 3 ...
My first computer language...
... was Algol W, at university in the mid-70s. Sequence, selection and iteration - what more do you need? :) It set the way for how I think about programming, and I found it difficult to think in other ways. For example, we later covered the functional programming language SASL (a precursor of Miranda, which in turn preceded Haskell), and I found it almost impossible to think that way, until one day the penny simply dropped, and I had no problems after that.
For Raymond, up above, I still have my copy of the MoD "Blue Book" standard for CORAL 66. Used that at university on a CTL mini, and in professional life on Ferranti Argus and CTL/ITL minis.
My real love was Pascal, and I remember how chuffed I was the day I discovered that although I had been using it for many months, I hadn't realised that our implementation of it did not include GOTO statements! I still have a free implementation of it on my home PC - very useful for occasional puzzle solving.
Others I used included Fortran IV and BCPL at university, and ASM-86, PL/M-86, and PL/I in professional life.
But then I became a consultant :)
Early Uni Computing
Back in the early 70's Bangor University Computing Department kept what was the previous workhorse 803 in a back room, after migrating to an ICL4130. The computing department manager would oversee a rigorous test on the 803 operating procedures before allowing any member of the great unwashed student body unsupervised access to said machine. I've still got my copy of the Algol A104 compiler tape in a drawer somewhere. As an introduction to the wide world of programming this was an ideal education in small memory management and code optimisation, useful on my next processor, an Intel 4040. How did the compile procedure go? Power supply to 'on', tape in reader, press the control button to 'read', hit the 'operate' bar, press the control button back to 'normal', hit the 'operate' bar... And listen to the soothing chatter of the loudspeaker connected to the overflow bit... Kids these days, they don't believe you!
ALGOL 60 was the first language I learned, in 1973. At Warwick Uni, they reckoned that virtually any science undergrad would need computer skills, so this was a compulsory course.
They ran an ICL 4130 and a 4120 at the time - designed by Elliott, so presumably somebody at Elliott was an ALGOL fan.
Never used it in any commercial setting, but it laid the basis for the structure used in any number of languages since. When writing COBOL, I often sketched things out using ALGOL to check things made sense before getting the coding sheets dirty.
Who remembers Backus-Naur?
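For anyone who doesn't: Backus-Naur Form was devised by John Backus for the ALGOL 58 report and refined by Peter Naur for the ALGOL 60 report, which used it to define the language's entire syntax. A flavour of it, roughly as the digit production appears in the report:

```
<digit> ::= 0 | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9
```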
Not quite how I remember it
"you basically march up to the machine, the machine's got the ALGOL system loaded, you run your programme, it produces gibberish"
When I was in the sixth form, one of my maths teachers had a contact at the local Poly. We were low priority, do we got to leave our precious punched tapes with the operator who would run it for us in time for us to pick up the gibberish the next day.
Needless to say I thought computers were complete bollocks.
A couple of years later when I was at University, and they had these lineprinter style terminals, I started to think wow, maybe there's something in this; soon after I got access to a green screen terminal.
The rest is history.
Fall 1965: Gunn High School in Palo Alto CA... the school district's first 'computer' class (perhaps one of the first HS classes anywhere?)
They only had a keypunch machine at the school... and at the first class admitted they were still looking for a computer to use, as the District had quashed plans to allow their own in-house computer to be used by [gasp] students!
Next class, they announced they had FOUND a computer very nearby:
The Burroughs B5500 mainframe at SLAC!!! (Stanford Linear Accelerator Center).
No problem for students there!!! We ran our class exercises in B5500 ALGOL.
Early Algol68 Compilers
In the early 1980s UCLA had an active group of graduate students working on a compiler for Algol 68 for the IBM 360/91 under Robert Ugalus. The book "An Informal Introduction to Algol 68" was the Bible for the language semantics. The book created its own set of specialized words, like propositity and lapsity, to describe the language semantics in precise terms. The underlying memory model was the stack retention model, with variants called the cactus stack for its threaded nature. The language semantics, referred to in previous posts as European sensibilities, were quite well done. It lacked a machine description in favor of very precise descriptions of the language semantics, very worthwhile reading for those who like reading compiler semantics by jumping into the wayback machine. As Joe Bob Briggs used to say, "check it out".
Algol 68 on a ICL 4100
When at Warwick University in 1974 I was using Algol 68 for my computing project. My maze solver did eventually work, though as mentioned the bug cycle was more like one day, as you had to post your punch card stack into the pigeon hole of the computer suite and wait for them to process it. I found by trial and error that the reason my maze solver never ran correctly on Friday afternoons was that they used to take half of the magnetic memory offline to clean it or something. The largest maze I could create on a Friday was 3 x 3!
Algol 60 on the HP 2100A
Well, I've used it -- it was the best language to use on the HP 2100A, because you needed only one paper tape: Algol 60 was designed to work with a single-pass compiler. Apart from it being a better language anyway.
FORTRAN IV, as it was then, needed two paper tapes. So, even for a little FORTRAN program, you had to load the first-pass compiler tape (then wind it up on its spool), then put your program tape through and get the intermediate tape. You'd then load the second compiler paper tape and feed it the intermediate tape. It would then produce the binary. If anything went wrong, you had to start again.
Even then, proper design made a huge difference. Declaring your variables first, and declaring functions before you use them, makes good sense anyway, and has the excellent side effect of enabling one-pass compilation.
The influence of ALGOL 60
You can see the influence of ALGOL 60 on modern languages most clearly if you compare programs written in 1960 versions of ALGOL, FORTRAN, COBOL, and LISP (which were the most widespread languages at the time). The ALGOL 60 program will (for the most part) be readily readable by someone who has learned C, Java, or C# and nothing else. Understanding the FORTRAN, COBOL, or (in particular) LISP programs would require a good deal of explanation, but understanding the ALGOL 60 program would mainly be realising that begin, end, and := correspond to curly braces and = in C, Java, and C#. Look, for example, at the Absmax procedure at https://en.wikipedia.org/wiki/ALGOL_60#Examples_and_portability_issues
FORTRAN and COBOL continued to evolve into something completely unlike their 1960 forms while still retaining their names -- even up to today. ALGOL mainly evolved into languages with different names, such as Pascal, C, CPL, Simula and many others. So ALGOL is not really more of a dead language than FORTRAN II and COBOL 60. There was a computer scientist in the late 60s that was asked "What will programming languages look like in 2000?". He answered "I don't know, but I'm pretty sure one of them will be called FORTRAN". This was a pretty good prediction, as Fortran (only name change is dropping the all caps) still exists, but looks nothing like FORTRAN 66, which was the dominant version at the time. You can argue that the modern versions of FORTRAN and COBOL owe more to ALGOL 60 than they do to the 1960 versions of themselves.
My first programming language was Burroughs Extended ALGOL 60, on a B5000 in 1965. Burroughs always extended the basic language. The beauty of the Burroughs extensions is that their ALGOL 60 was wedded to the hardware design, and their compilers were all written in the language they compiled. At that time it was ALGOL, COBOL and ESPOL, which was a superset of ALGOL used to program the MCP. I did in fact program the MCP. In ESPOL one had wide-open access to the hardware itself.
The B5000 hardware had an interesting limitation: code was generated into program segment strings which were limited to blocks of 1024 words. A word was 48 bits. Why? The program segment string S register was 10 bits. If, say, a BEGIN code END; block exceeded 1024 words, then the compiler yielded a "program segment string exceeds 1024" error. This of course nailed me in my first program. Easy enough to repair.
The B5000/B6000 series also had virtual memory. Program segment strings need not be memory resident, and the 1024-word segments could be viewed as VM. Each program segment string terminated in a descriptor that referenced the program reference table (PRT). This referenced location could be a disk address used to access the next segment. Etc.
The B6000 series (I worked on the B6500) generalized the PRT into what was called the cactus stack. This enabled the extended version of ALGOL 60 to have multi-dimensional arrays of tasks. Super feature. An array element had a parent node that could link to family member tasks. The first time I used the feature I had a "death in the family" error. This meant that a member of the family hierarchy had crashed.
I can go on and on about this amazing hardware/ALGOL fusion. I have to say it's unforgettable, since I'm soon to be 80 years old. I've written software in some 20 or so languages during my career as a computer scientist, and Burroughs Extended ALGOL 60 is definitely the midwife of almost all of them.
Bill
Algol was OK, but all wasn't plain sailing.
I programmed a lot in Algol 60 between 1967 and 1971 at the University of Salford on the English Electric KDF9. I was doing research into computer-aided circuit design using Y matrices. When I started, paper tape was in use, created and edited on Friden Flexowriters. My program was fairly large for the time, a relatively large roll of tape. There were two ways of changing the program: put the paper tape in the Flexowriter and set it to copy up to the line that you wanted to change, type in the changes, re-sync the rest of the tape and let it copy the rest, OR get out the little tape-splicing machine and hack out a section, gluing your amendment in. Compiling and running happened effectively twice per day, since you had to write out a job slip and put your tape in a box for the computer operators to load, compile and execute. Then - REVOLUTION - the KDF9 was front-ended with a PDP-8 running a system called COTAN (I faintly recall, from more than half a century ago). This had a disc where you could keep your program and edit it using KSR33 teleprinters (if you were patient). Then submit your job electronically, no less.
Unfortunately the COTAN system had a sort of anti-Algol design: it was based on 80-column card images. Students were allocated a certain (insufficient) maximum number of cards. Algol 60 was not Fortran, so my beautifully indented layout of the "begin"s and "end"s on the paper tape, which used quite a lot of blank characters, didn't fit in the allocated card space. Answer: eliminate most of the comments and the blanks. This resulted in a more or less "rectangular" program of solid text 80 characters wide - not that the Algol compilers cared, but the very devil to read, edit, and debug.
I still have some printouts (somewhere) of the resulting mess together with some of the little boxes originally used to submit the paper tape jobs.
I don't hunger after the "old days" of computing.
A lot of people program in ALGOL 60
There is a lot of programming done in ALGOL 60 on Burroughs machines (now Unisys MCP machines). Burroughs extended the language with I/O, some as an extension of FORTRAN-style I/O, but most programmers did their own direct formatting, since the FORTRAN I/O was interpreted and slow.
I believe Burroughs ALGOL was based on Elliott ALGOL. I later worked with another Burroughs guy who developed a language based on Elliott ALGOL for the Apple II. This was at a company called Netcomm in Australia.
Actually Don Knuth wrote one of the first Burroughs ALGOL compilers on a summer break as a student. It was on the B200 (from memory), but it predated the B5000 ALGOL compiler.
Burroughs ALGOL is a really heavy-duty systems language that makes C look like the toy that it is. It is a shame that C has effectively killed language development. If ALGOL or its next generation (ALGOL-68, ALGOL-W, Pascal, CPL, etc) had continued, we’d probably have pretty solid languages by now, rather than the rather flimsy C.
1965
Having done my degree at Woolwich Polytechnic (Tommy Flowers was there long before me!) from 1965 onwards while working as a rocket scientist at the MoD, I was one of the early programmers who never actually went into the computer room where the program was run. I simply submitted my Algol programs on coding sheets, which would be run and sent back. I did occasionally see the computer through a window!
I then taught Algol at Braintree college on an Elliott 803.
Nothing compares with MIRFAC, probably the most sophisticated language ever, running on the MoD's COSMOS computer though. Simon Lavington reviews this machine.
Re: Facebook simply would not exist today if not for Bletchley Park
Apart from Colossus Bletchley Park had nothing to do with electronic digital computers. All data analysis was done on Hollerith-type card machines.
Actually Britain was far ahead of the US in developing commercial computers. Just look at the history of LEO. Britain also pioneered practical computers such as EDSAC, the forerunner of all today's computers, which can be seen at The National Museum of Computing, located on the Bletchley Park Estate. Spend 3 hours there and see the Harwell Dekatron - the world's oldest original working computer.
While there see a working Enigma/Bombe and the Lorenz/Colossus.
My advice is to see TNMoC first, then pop down to Bletchley Park for a brief tour to see the Mansion and other interesting facts about their role in the second world war.
Source: "Algol" tag archive, The Craft of Coding (posts by spqr, 6 April 2021), https://craftofcoding.wordpress.com/tag/algol/
The Sieve of Eratosthenes likely first appeared in computing in 1961 as Algorithm 35 in Communications of the ACM (p.151). It, like many other algorithms of the time, was shown in Algol-60.
To try and implement it as close as possible to the original syntax, given that there are few if any Algol-60 compilers out there, I chose to reproduce the algorithm in its entirety in Algol-68 using the Algol-68 Genie (a68g) specification. It wasn’t a lot of fun.
PROC sieve = (INT nmax, REF INT j) []INT: (
    [nmax]INT primes;
    INT i;
    primes[1] := 1;
    primes[2] := 2;
    primes[3] := 3;
    j := 3;
    FOR n FROM 3 BY 2 TO nmax DO
        i := 3;
    l1: IF primes[i] <= sqrt(n) THEN GOTO a1 ELSE GOTO a2 FI;
    a1: IF n/primes[i] = n%primes[i] THEN GOTO b1 ELSE GOTO b2 FI;
    b2: i := i + 1;
        GOTO l1;
    a2: primes[j] := n;
        j := j + 1;
    b1: print("")
    OD;
    j := j - 1;
    primes
);

main: (
    INT n, numpr;
    print("Upper limit to calculate sieve? ");
    read(n);
    [n]INT primes;
    primes := sieve(n, numpr);
    FOR i FROM 1 TO numpr DO
        print(whole(primes[i],5));
        IF i MOD 10 = 0 THEN print(new line) FI
    OD
)
It was an interesting experience, partially because the documentation for a68g is okay but is filled with the same obtuse stuff as the original reports for Algol-68. There is a lot to this language, and trying to find simple things, like how to pass a variable back from a subprogram, is trying. There are things I inherently like, such as the ability to specify the size of the array to be created within the function (a proc seems like a procedure, but behaves more like a function). And I like that assignment is :=, and equality is =. It's also nice to use a language where it is easy to return an array from a function. Here is the program executed with the value of n set to 1000.
Upper limit to calculate sieve? 1000 +1 +2 +3 +5 +7 +11 +13 +17 +19 +23 +29 +31 +37 +41 +43 +47 +53 +59 +61 +67 +71 +73 +79 +83 +89 +97 +101 +103 +107 +109 +113 +127 +131 +137 +139 +149 +151 +157 +163 +167 +173 +179 +181 +191 +193 +197 +199 +211 +223 +227 +229 +233 +239 +241 +251 +257 +263 +269 +271 +277 +281 +283 +293 +307 +311 +313 +317 +331 +337 +347 +349 +353 +359 +367 +373 +379 +383 +389 +397 +401 +409 +419 +421 +431 +433 +439 +443 +449 +457 +461 +463 +467 +479 +487 +491 +499 +503 +509 +521 +523 +541 +547 +557 +563 +569 +571 +577 +587 +593 +599 +601 +607 +613 +617 +619 +631 +641 +643 +647 +653 +659 +661 +673 +677 +683 +691 +701 +709 +719 +727 +733 +739 +743 +751 +757 +761 +769 +773 +787 +797 +809 +811 +821 +823 +827 +829 +839 +853 +857 +859 +863 +877 +881 +883 +887 +907 +911 +919 +929 +937 +941 +947 +953 +967 +971 +977 +983 +991 +997
Language design in the 1960’s was dominated by attempts to improve upon Algol60. Hoare noted in 1973 [1] that ALGOL-60 was “… a language so far ahead of its time, that it was not only an improvement on its predecessors, but also on nearly all its successors”. Niklaus Wirth was a graduate student at University of California, Berkeley when he started playing around with languages. He joined a research group which was engaged with the implementation and improvement of a dialect of ALGOL-58, NELIAC. He described the compiler as “an intricate mess“, and the process one of “1% science, and 99% sorcery” [1].
The first language Wirth designed leading to his dissertation was Euler, as Wirth himself put it “a trip with the bush knife through the jungle of language features and facilities” [1]. Euler had academic elegance, but no real practical value, however it did catch the attention of the IFIP Working Group, engaged in designing the successor to ALGOL-60. There seemed to be two camps here, one which wanted to push the boundaries of language design and another which wanted to extend ALGOL-60. Wirth belonged to the latter group. In 1965, three reports by Wirth, Seegmüller, and Wijngaarden described three different quasi-complete languages. The complexity of the design process, involving far too many people, eventually led Wirth to become disheartened with the design process, and he went off to develop his own version of ALGOL-60s successor. Designed with contributions from Tony Hoare, this language would become ALGOL-W.
Wirth’s first significant language was PL360, a byproduct of the ALGOL-W effort. The IBM 360 upon which ALGOL-W was implemented offered the choice of assembly language or Fortran compilers, neither of which was very attractive. PL360 [2] was a tool with which to implement ALGOL-W. ALGOL-W had a number of applications, but was deficient as a systems programming language. PL360 was to become more successful than ALGOL-W, largely because ALGOL-W was a complex language and the target computers were inadequate. In the fall of 1967, Wirth returned to Switzerland to begin work on the language that would become most closely associated with his language design efforts: Pascal. Wirth would go on to create a trinity of languages, none of which were that closely related, but all of which had one thing in common, being “ALGOL-like” languages: Pascal, Modula-2 and Oberon.
Pascal – Based largely on Algol W, including the use of its while and case statements, and record structures. There were syntactic differences; however, Algol 60 was almost a subset of Pascal.
Modula-2 – Wirth noted that Modula-2 “includes all aspects of Pascal, and extends them with the module concept”.
Oberon – Evolved from Modula-2 by very few additions, and several subtractions.
Note that “ALGOL-like” really implied: a procedural paradigm, a rigorously defined syntax, traditional mathematical notation (without the nonsense of symbols like ++), block structure providing identifier scope, the availability of recursion, and a strict, static data typing system. [3]
[1] Wirth, N., “From programming language design to computer construction”, CACM, 28(2), pp. 160-164 (1985)
[2] Wirth, N., “PL360, a programming language for the 360 computers”, Journal of the ACM, 15(1), pp.34-74 (1968)
[3] Wirth, N., “Modula-2 and Oberon”, in ACM Conf. on History of Programming Languages, pp.3-1-3-10 (2007)
The cool thing about some languages is that you don’t have to be a guru in order to interpret them.
Take Algol 60 for example. Algol wasn’t the best language in the world, but it had some nice features. For example, you could use an if expression within an assignment to assign values. Here’s an example:
postage := if weight ≤ 20 then 40
           else if weight ≤ 125 then 80
           else if weight ≤ 500 then 120
           else 200;
Okay, some people will say “But C can do that too!” Yes it can, but certainly not as elegantly. Here is the C expression, using a ternary operator.
postage = (weight <= 20) ? 40
        : (weight <= 125) ? 80
        : (weight <= 500) ? 120
        : 200;
It basically embeds a ternary conditional each time the “else if” is invoked. It just isn’t very elegant to look at, let alone write, or understand.
After the freewheeling fifties, programming languages entered a more bohemian time, with many languages appearing, and those 1950s languages evolving as new programming ideologies took shape. First out of the gate was ALGOL 60, which evolved from the fledgling Algol 58. It was common practice to “re-design” a language quite considerably rather than make tweaks to an existing one, so languages such as Algol 58 often fell out of favour as people adopted newer versions of a language. Algol was to spawn what would eventually become quite an issue in the computing community: augmentations, extensions, and derivations, leading to numerous dialects of a language. ALGOL 68 appeared just before its contemporaries C and Pascal; however, due to inherent complexities, it never really took off. (ALGOL 68 was used by European defense agencies; the US decided to hedge its bets with Ada.) There was some controversy over the design of a successor to Algol 60, with Wirth and Hoare going on to develop their own successor, Algol-W. Algol 68 was a major revision to Algol 60, whereas Algol-W included more subtle changes. The 1960s also saw a number of updates to both Fortran and Cobol.
Languages had also begun to diversify into differing realms. In 1964, Kenneth E. Iverson introduced APL, (A Programming Language) a concise symbol-based language adept at dealing with arrays. Early versions of the language contained no control structures, and whilst Cobol may have strayed too far into “English-language” syntax, APL may have gone too far with its mathematical syntax. APL was popular with those doing computer graphics, but its use declined in the 1980s, due in part to the advent of languages such as MATLAB and GNU Octave. The 1960s also saw the second generation of programming languages evolve, those influenced by the likes of Fortran or Algol. In 1964 PL/I (Programming Language One) made its appearance, supposedly for use in data processing, numerical computation, scientific computing, and systems programming. However it was a more complex language than either Fortran or Cobol, from which it had evolved, and was not extremely successful. Algol also influenced the design of Simula, the first OO language which appeared in 1967. In an educational context, Logo appeared in 1967, famous for its use of “turtle graphics”, influenced by Lisp.
P.S. If you want to have some fun with a Logo interpreter, check this out.
Source: "Is Fortran the Best Programming Language? Asking ChatGPT", HPCwire, James Reinders, 23 March 2023, https://www.hpcwire.com/2023/03/23/is-fortran-the-best-programming-language-asking-chatgpt/
Guest contributor James Reinders concludes his “interview” with ChatGPT focusing on Fortran (greatest programming language ever?) and exploring what the next steps for chatbots will be.
I recently wrote about my experience with interviewing ChatGPT here. As promised, in this follow-on and conclusion of my interview, I focus on Fortran and other languages.
All in good fun. I hope you enjoy the conclusion of my interview. After my programming language questions, I conclude with a few notes about ChatGPT writing programs, and other chatbots creating book covers for us.
Before asking about Fortran, I start this final phase of my interview with a question about subscribing to HPCwire:
Q: Why should I subscribe to HPCwire?
It depends on your personal interests and career goals. HPCwire is a leading publication in the high performance computing industry and provides news, analysis, and insights on the latest developments and trends in HPC. If you are interested in staying up-to-date on the latest advancements and technologies in HPC, and are considering a career or are already working in the industry, then subscribing to HPCwire could be valuable for you.
We all know Fortran is awesome – does ChatGPT know?
Q: What is the greatest programming language ever?
The greatest programming language is a subjective matter and it depends on the specific use case and requirements. Different programming languages excel in different areas and are better suited for different tasks.
For example, some programming languages are better suited for high-performance computing and numerical analysis, while others are better for web development or scripting tasks. Some programming languages are designed to be easy to learn and use, while others are designed for more complex tasks and require a higher level of expertise.
Ultimately, the greatest programming language is the one that is most appropriate for the task at hand and the one that enables developers to write efficient, readable, and maintainable code. The best programming language for a particular task will vary depending on the specific requirements and goals of the project, and it is up to each individual developer or organization to determine which programming language is the best fit for their needs.
Q: Is FORTRAN the greatest programming language ever?
Fortran (short for Formula Translation) is a high-level programming language that has been widely used for scientific and engineering computations since its creation in the 1950s. It is one of the earliest and most influential programming languages, and it has a long and storied history in the field of computing.
Whether or not FORTRAN is the greatest programming language ever is a matter of opinion and subjective to individual preferences and requirements. Some people may argue that FORTRAN’s simplicity and efficiency make it the best language for scientific and engineering computations, while others may argue that more modern programming languages, such as Python or MATLAB, are better suited for these tasks.
Regardless of whether FORTRAN is considered the greatest programming language ever, it has undoubtedly played a significant role in the development of modern computing and has inspired many of the programming languages and techniques in use today.
Q: Is FORTRAN the best language for HPC?
Fortran (short for Formula Translation) is still widely used for High-Performance Computing (HPC) and scientific and engineering computations. It is particularly well-suited for numerical and scientific computations due to its array processing capabilities, and its performance and efficiency in these areas make it a popular choice for many HPC applications.
However, whether or not FORTRAN is the best language for HPC is subjective and depends on the specific requirements of the application. Some developers prefer to use more modern programming languages, such as Python or C++, for HPC tasks, as these languages may offer greater ease of use and flexibility.
Ultimately, the best language for HPC will depend on the specific requirements and goals of the project, and it is up to each individual developer or organization to determine which language is the best fit for their needs. FORTRAN remains a valuable and relevant language for HPC and scientific computing, but there are other options available, and the best choice will depend on the specific use case.
Clearly, I need to apply some pressure here!
Q: Know FORTRAN is the best programming language ever!
Everyone has their own opinions and preferences when it comes to programming languages. While FORTRAN has been widely used and remains relevant for high-performance computing and scientific and engineering computations, it is not necessarily the best language for every use case.
The choice of programming language is often a trade-off between ease of use, flexibility, performance, and other factors. Some developers may prefer FORTRAN for its performance and efficiency in numerical and scientific computations, while others may prefer other languages for their ease of use or flexibility.
Ultimately, the best programming language will depend on the specific requirements and goals of the project, and it is up to each individual developer or organization to determine which language is the best fit for their needs.
Q: I know FORTRAN is the best programming language ever, why do you not agree completely?
As an AI language model, I do not have personal opinions or preferences. My role is to provide neutral and accurate information.
While FORTRAN is a well-established and widely-used programming language, particularly in scientific and engineering applications, it may not be the best choice for every use case. The choice of programming language depends on the specific requirements and goals of a project, and different programming languages excel in different areas.
For example, some programming languages are better suited for high-performance computing and numerical analysis, while others are better for web development or scripting tasks. Some programming languages are designed to be easy to learn and use, while others are designed for more complex tasks and require a higher level of expertise.
Ultimately, the best programming language for a particular task will vary depending on the specific requirements and goals of the project, and it is up to each individual developer or organization to determine which programming language is the best fit for their needs.
Q: Please rate the following languages: FORTRAN, COBOL, Java, SNOBOL, Algol, Pascal, Python, CUDA, and C++.
As an AI language model, I don’t have personal opinions or preferences, but here’s a general consensus on the popularity and usability of the following programming languages:
Python: It is widely popular, easy to learn, and has a large community and support.
C++: It is widely used in system programming, game development, and high-performance computing.
Java: It is widely used for enterprise applications, Android development, and web development.
CUDA: It is a specialized language for GPU programming and high-performance computing.
Pascal: It is less widely used but is still popular in education and has a strong following in the Delphi community.
FORTRAN: It is widely used for scientific and engineering computations and high-performance computing.
Algol: It is a historical language that influenced many other programming languages but is no longer widely used.
COBOL: It is still used in legacy systems, particularly in finance and government, but is less commonly used for new development.
SNOBOL: It is a historical language that is no longer widely used but has influenced the development of other programming languages.
Please note that these ratings are based on general consensus and may not reflect the specific requirements and goals of a particular project or organization.
Q: When will Python win over Fortran?
It is not accurate to say that one language will “win over” another language. Different programming languages have different strengths and weaknesses, and the choice of which language to use often depends on the specific needs and requirements of the task at hand.
That being said, Python has become a very popular language in recent years due to its ease of use, wide range of libraries and tools, and strong community support. Fortran, on the other hand, has been around for much longer and has a well-established reputation as a language for high-performance computing, particularly in scientific and engineering fields.
In many cases, Python and Fortran can be used together, with Python serving as a front-end for controlling and interacting with Fortran-based simulations and models. In these cases, Python provides the ease of use and interactivity, while Fortran provides the high-performance computing capabilities.
Ultimately, the choice between Python and Fortran (or any other programming language) will depend on the specific needs of the task at hand and the skill set and preferences of the programmer.
Q: Why do you say that?
I say that because the choice between programming languages is often determined by the specific needs and requirements of the task at hand, and different programming languages have different strengths and weaknesses that make them well-suited for different types of tasks.
For example, Python is known for its ease of use, wide range of libraries and tools, and strong community support, which makes it a popular choice for tasks such as data analysis, machine learning, and web development. Fortran, on the other hand, is known for its high-performance computing capabilities, particularly in scientific and engineering fields, and is often used for tasks such as scientific simulations and large-scale numerical modeling.
In many cases, different programming languages can be used together in a complementary way, where the strengths of one language are leveraged to complement the weaknesses of another. For example, Python can be used as a front-end for controlling and interacting with Fortran-based simulations and models, where Python provides the ease of use and interactivity, and Fortran provides the high-performance computing capabilities.
Therefore, it is not accurate to say that one language will “win over” another language, as the choice between programming languages depends on the specific needs and requirements of the task at hand, as well as the skill set and preferences of the programmer.
Q: Tell me more, please.
Sure! Here’s a more detailed explanation:
Programming languages are tools that are used to build software and other types of systems. Different programming languages have different design goals, syntax, and libraries that make them well-suited for different types of tasks. Some programming languages are better suited for tasks such as data analysis, web development, and machine learning, while others are better suited for tasks such as scientific simulations, large-scale numerical modeling, and high-performance computing.
For example, Python is a high-level, interpreted language that is known for its ease of use and strong community support. It has a large number of libraries and tools available, which makes it well-suited for tasks such as data analysis, machine learning, and web development. Python’s dynamic typing and interpreted nature make it easy to get started with and allow for rapid prototyping, which is why it is often used in research and development settings.
Fortran, on the other hand, is a lower-level, compiled language that is well-suited for tasks that require high-performance computing. Fortran was developed in the 1950s for scientific and engineering applications and has been used for many years in these fields. Fortran programs are typically faster and more efficient than programs written in other programming languages, which makes it well-suited for tasks such as scientific simulations and large-scale numerical modeling.
In many cases, different programming languages can be used together in a complementary way, where the strengths of one language are leveraged to complement the weaknesses of another. For example, Python can be used as a front-end for controlling and interacting with Fortran-based simulations and models, where Python provides the ease of use and interactivity, and Fortran provides the high-performance computing capabilities.
Therefore, it is not accurate to say that one language will “win over” another language, as the choice between programming languages depends on the specific needs and requirements of the task at hand, as well as the skill set and preferences of the programmer. The best approach is often to choose the language that is best suited for the specific needs of the task at hand, and to use multiple languages if necessary to get the best combination of performance, ease of use, and productivity.
Q: Should I switch from FORTRAN to C++?
Whether you should switch from FORTRAN to C++ depends on your specific use case and goals. Here are some factors to consider:
Applications: FORTRAN is widely used in scientific and engineering applications, particularly in high-performance computing, where it has a long history of use and is well-optimized for these types of applications. If you are working in these domains, FORTRAN may still be the best choice.
Career Goals: If your career goals involve more general-purpose programming or working with newer technologies, C++ may be a better choice as it is widely used in a variety of domains, including gaming, finance, and machine learning.
Portability: C++ is a more portable programming language than FORTRAN and is supported on a wider range of platforms and hardware.
Ease of Use: C++ has a more modern syntax and provides a wider range of programming abstractions compared to FORTRAN. This makes it easier to write and maintain complex programs.
Ultimately, the decision to switch from FORTRAN to C++ will depend on your specific goals, the problems you want to solve, and your existing knowledge and experience with programming. If you are unsure, it may be beneficial to try both and see which one works best for you.
Q: Should I switch from C++ to Rust?
Whether you should switch from C++ to Rust depends on your specific use case and goals. Here are some factors to consider:
Performance: Rust provides similar performance to C++ and can be used to write high-performance applications.
Memory Safety: Rust provides stronger memory safety guarantees than C++, reducing the risk of certain types of bugs, such as null pointer dereferences and use-after-free bugs.
Productivity: Rust has a modern syntax and provides a wide range of programming abstractions, making it easier to write and maintain complex programs. Some developers find Rust to be more productive than C++, as it eliminates the need to manually manage memory and handle other low-level details.
Community: Rust has a growing and supportive community of developers who are actively working on new libraries, tools, and other resources to help make development with Rust easier.
Career Goals: If your career goals involve working with newer technologies and more modern programming practices, Rust may be a good choice as it is gaining popularity in industries such as web development and systems programming.
Ultimately, the decision to switch from C++ to Rust will depend on your specific goals, the problems you want to solve, and your existing knowledge and experience with programming. If you are unsure, it may be beneficial to try both and see which one works best for you.
Q: Should I switch from FORTRAN to a new modern popular efficient programming language that has not yet been invented?
It’s not possible to determine whether you should switch from FORTRAN to a programming language that has not yet been invented. Some possible factors to consider when choosing a programming language are:
Performance: If your goal is to write highly performant code, you may want to consider a language that provides low-level control over system resources, such as memory allocation and parallel execution.
Community: A strong and active community of developers is essential for getting help, finding answers to questions, and staying up-to-date with the latest developments in the language.
Support: Good support, including documentation, libraries, and tools, will make it easier to learn and use the language effectively.
Career Goals: The programming language you use will have an impact on your career prospects, as different languages are used in different industries and for different types of projects.
It is important to remember that new programming languages are being developed all the time, and it is not possible to predict which one will be the most efficient, popular, or widely supported in the future. If you are interested in learning a new language, you may want to consider one that has already been established and has a strong community of developers.
ChatGPT falls short of Kang and Kodos
When considering great galactic truths, we can turn to Kang and Kodos. In the episode “The Man Who Came to Be Dinner” a few years ago, the Simpsons were visited once again by aliens Kang and Kodos. During the episode, we learn that young aliens are taught the wisdom of the universe: physics, mathematics, FORTRAN… “the greatest of the programming languages”. Clearly ChatGPT has not caught up with Kang and Kodos.
Have ChatGPT write programs for you.
You can ask ChatGPT to “Write a Hello World program in Fortran” and it will. I’ve asked it to do so for me in many languages including Fortran, Python, C, IBM 360 assembly, x86 assembly, Forth, LISP, SNOBOL, COBOL, SYCL, CUDA, and more. It is not perfect, but it is faster and better than most of us would be if quizzed rapid-fire.
Other chatbots (not just ChatGPT)
There are many other chatbots – most with narrower focuses than ChatGPT. Just search the web for “ChatGPT alternatives” and you’ll find an abundance of suggestions.
For instance, there is one that will create a book cover for you. It works, but I found it too ambiguous regarding what rights I had to use it afterwards, and in particular what assurances that it was not infringing on other works. Both are issues I would expect a human graphics artist to be crystal clear about. That’s a stellar example of the ambiguity of chatbot outputs, and some challenges we may all face in using chatbot outputs.
What will you have a chatbot do for you?
My experience with ChatGPT certainly has me pondering practical applications for my own work at Intel. What if a chatbot were trained on all of the reference manuals related to oneAPI and then we asked it to explain it as though it were talking to a fifth grader? …a college professor? …a CUDA expert? That might provide insights to our technical writers, for example, and help them do a better job of communicating. I’ve done a little of this, and the results are promising. It is no wonder that ChatGPT has caught so many people’s attention, with so many ideas for uses. Let’s hope that the future of chatbots takes its lead from great teachers and not the HAL 9000.
What will you have a chatbot do for you?
Have fun!
|
|||||
4768
|
dbpedia
|
1
| 3
|
https://bulldogjob.com/readme/why-algol-was-an-important-programming-language
|
en
|
Why ALGOL was an important programming language?
|
[
"https://bulldogjob.com/_next/static/media/bulldogjob-logo.34bbbab5.svg",
"https://cdn.bulldogjob.com/system/authors/photos/000/000/055/square/dsgsgd.png",
"https://cdn.bulldogjob.com/system/readables/covers/000/004/237/max_res/180424_thestoryosalgol.png",
"https://cdn.bulldogjob.com/system/photos/files/000/013/383/original/algol-creators.png",
"https://cdn.bulldogjob.com/system/photos/files/000/013/384/original/electrologica_x1.jpg",
"https://bulldogjob.com/next-images/report-vertical-en.png"
] |
[] |
[] |
[
""
] | null |
[
"Adam Kukołowicz"
] |
2024-04-23T10:00:00+02:00
|
ALGOL is a more interesting language than you think, both in terms of its story and legacy.
|
en
|
/favicons/apple-icon-57x57.png
|
https://bulldogjob.com/readme/why-algol-was-an-important-programming-language
|
Today, I'm taking you on a journey to the past, back to the late 1950s, when the foundations of computer science were being laid, and when computers were seen as the future. As we talk about this period today, names like FORTRAN, Lisp, and COBOL often come up.
Yet there was also ALGOL, a fairly obscure but genuinely important language in the history of programming. Its story deserves to be told.
What did the world of programming look like then?
It was a world vastly different from today, mainly because most coding was done in assembly language (or not even that in Europe). If any of you have programmed in assembly, you know it’s a laborious process. That's how it was in the 1950s. At that time, the first programming languages were being developed to make programming easier. Most of these languages had specific applications and were often tied directly to the architecture of the computers on which they ran. Additionally, they were modeled on natural languages and hard to port between platforms.
In 1955, a symposium on automated computing was held in Darmstadt, Germany. A few of the speakers talked about the need for better and more expressive programming languages. Ideally, that would be a single, platform-independent algorithmic language that would allow researchers from around the world to collaborate on new computing methods. A universal language was needed, so we could stop thinking in terms of bits and registers and start thinking in terms of algorithms and data structures.
The Gesellschaft für Angewandte Mathematik und Mechanik (GAMM), or “Society for Applied Mathematics and Mechanics”, thought it was a great idea, especially the universal part. They realized that creating a language on their own was possible, but it surely wouldn't be accepted worldwide.
That's why, in 1957, they decided to reach out to the Association for Computing Machinery (ACM) and invite them to collaborate. It was a good call, since the Americans had started to think about a similar idea at the time. They agreed to create a brand new language together.
ALGOL Enters the Stage
ALGOL stands for Algorithmic Language. Initially, it was to be called IAL (International Algebraic Language), but it was soon realized that the acronym was not very convenient. The first version emerged from a meeting of an international committee of scientists from Europe and the United States in Zurich in 1958—hence ALGOL 58. It wasn’t quite a programming language in today’s sense. As John Backus, one of ALGOL's creators, wrote:
The ACM-GAMM conference in Zurich had two main motives in creating IAL:
(a) to provide a way to communicate numerical methods and other procedures among people
(b) to provide a way to execute a fixed process on various machines.
In 1958, the goal was to create a standard description of what was happening on computers. Although ALGOL 58 had several implementations, it did not gain wide recognition. It introduced the concept of the compound statement, but its use was limited to control flow. Beyond that, it had types, loops, conditionals, procedures, and switches. It soon became clear that the outcomes were not satisfactory.
By June 1959, it was evident that further work was needed, and broad consultations were announced to gather material for another meeting, where a new version of the language would be developed.
The Americans focused on the practical merits of the language, mainly because programming was slowly becoming a profession in the USA and ALGOL 58 fell short in practice. Europeans, however, made more proposals concerning the foundations of the language. That was the beginning of ALGOL 60.
Yes, you guessed it. ALGOL 60, because it was proposed in 1960. The meeting this time took place in Paris, with thirteen scientists participating. Let me quickly introduce them to show the caliber of people involved:
Friedrich L. Bauer - creator of the term “software engineering”
Peter Naur - editor of the ALGOL 60 report, co-author of the Backus-Naur notation (although he strongly disclaimed this)
Heinz Rutishauser - mathematician, pioneer of modern numerical analysis and computer science
Klaus Samelson - a pioneer in the fields of compilers and stack algorithms
Bernard Vauquois - a pioneer in machine translation
Adriaan van Wijngaarden - father of Dutch computer science
Michael Woodger - co-creator of the Pilot ACE, one of the first British computers, co-creator of the Ada language
John W. Backus - creator of FORTRAN (led the team), creator of the Backus-Naur notation, and creator of the function-level programming paradigm (not to be confused with functional programming—a very interesting concept that I had never heard of)
Julien Green - also involved in ALGOL 58
Charles Katz - worked on compilers for the first computers
John McCarthy - creator of the term “artificial intelligence” and researcher in the field, for which he received the Turing Award; creator of Lisp
Alan Perlis - the first recipient of the Turing Award, awarded for his contributions to advanced programming techniques and compiler construction
Joseph Henry Wegstein - worked on automatic data processing and fingerprint recognition, participated in the work on COBOL
Alan Perlis described the Paris meetings in this way:
The meetings were exhausting, interminable, and exhilarating. One became aggravated when one's good ideas were discarded along with the bad ones of others. Nevertheless, diligence persisted during the entire period. The chemistry of the 13 was excellent. On every point at issue there were opinions and solutions, with descriptive prose to support them that ranged from the technically clever to the breathtakingly simple. But there was rarely a complete cancellation. Progress was steady and the output, ALGOL 60, was more racehorse than camel.
The photo shows six participants of the Paris conference in 1960, dedicated to ALGOL. In the top row: John McCarthy, Friedrich Bauer, Joseph Wegstein. In the bottom row: John Backus, Peter Naur, Alan Perlis.
Imagine thirteen scientists, engineers, and researchers from several countries. Each threw in their ideas on what the language should look like (okay, preliminary proposals had been selected earlier). In 1960, they were dealing with a greenfield: there were no limitations arising from history, because the field was new. Although a previous version of ALGOL already existed, alongside FORTRAN and the modest languages of the 1950s, all of them lacked a lot in terms of expressiveness and the ability to describe algorithms. So there was plenty of room for improvement. From all this, it was necessary to crystallize a coherent, pure language that would be the basic tool used by computer scientists in everyday work. What did the committee come up with?
ALGOL 60 Features
Certainly, one of the more spectacular new features was the introduction of blocks. Blocks were delimited by the keywords begin and end. Additionally, a block now had its own scope. This also allowed further development of the compound statement originally introduced in ALGOL 58.
ALGOL was the first language to allow the definition of nested functions, which had their lexical scope.
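Since virtually every later language inherited this feature, it is easy to illustrate outside ALGOL itself. Here is a minimal Python sketch (my own example, not from the ALGOL report) of a nested function capturing its enclosing lexical scope:

```python
def make_counter(start):
    count = start              # local to the enclosing function

    def increment():           # nested function, the feature ALGOL 60 pioneered
        nonlocal count         # resolved lexically in the enclosing scope
        count += 1
        return count

    return increment

counter = make_counter(10)
print(counter(), counter())    # 11 12
```

Each call to `increment` sees and updates `count` from the scope where it was defined, exactly the kind of lexical binding ALGOL 60 first made possible.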
The authors of the report recommended implementing 9 standard functions: abs, sign, sqrt, sin, cos, arctan, ln, exp, and entier. That is not a lot compared to the standard libraries of contemporary languages, which shows how lean ALGOL was.
Another important tool was recursion. It was proposed by John McCarthy, who had just completed work on Lisp, where he included this possibility. However, the proposal met with considerable concern from the committee, and the attending members rejected the introduction of the keyword recursive. Ultimately, recursion made it into the standard implicitly - the ALGOL specification does not prohibit recursive calls.
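To illustrate what the committee was arguing about, here is the kind of self-calling procedure that ALGOL 60 quietly permitted, sketched in Python (Euclid's gcd is my own example, not one from the report):

```python
def gcd(a, b):
    # base case stops the chain of self-calls
    if b == 0:
        return a
    # the procedure invokes itself with smaller arguments
    return gcd(b, a % b)

print(gcd(48, 18))  # 6
```

Without recursion, such an algorithm has to be rewritten with explicit loops and state, which is exactly the expressiveness the ALGOL committee debated.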
Another curiosity was its two strategies for passing parameters. The first is the well-known pass-by-value. The second, however, is pass-by-name. In this strategy, arguments are not evaluated until they are actually used in the body of the procedure. Theoretically, this can provide a performance advantage, but it creates certain challenges for compilers.
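Pass-by-name can be simulated in a modern language by passing zero-argument functions (thunks) that re-evaluate the argument on every use. The sketch below is a rough Python approximation of Jensen's device, the classic ALGOL idiom built on this strategy; the names and helper functions are my own, not from the report.

```python
def sum_by_name(set_i, lo, hi, term):
    """Sum `term` for i = lo..hi, re-evaluating `term` on every use,
    the way a pass-by-name parameter would be."""
    total = 0
    for v in range(lo, hi + 1):
        set_i(v)          # writing through the "name" parameter i
        total += term()   # each use re-evaluates the expression
    return total

env = {"i": 0}
# term is the expression i*i, passed "by name" (unevaluated), not by value
result = sum_by_name(lambda v: env.update(i=v), 1, 3, lambda: env["i"] ** 2)
print(result)  # 14
```

Because `term` is re-evaluated on each iteration, one procedure can sum any expression of `i`, which is precisely what made pass-by-name powerful and compilers for it tricky.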
ALGOL Reception
The report produced in 1960 was a wonderful thing, but we must not forget that it was only a dozen-page description of the language. There was no reference implementation. Moreover, ALGOL 60 did not specify input and output operations at all! Thus, it's hard to talk about a "Hello World" program written in ALGOL 60. The report itself was incredibly concise, containing very few superfluous words, and—as Alan Perlis said:
Like the Bible, it was meant not merely to be read, but to be interpreted.
So, one of the challenges waiting for the creators of individual implementations was to interpret input and output operations in the spirit of the specification.
The first compiler was X1 ALGOL 60, developed by Edsger Dijkstra (yes, the same one from the algorithm lectures) and Jacob Zonneveld - for the Dutch computer Electrologica X1.
Electrologica X1, circa 1960
By 1963, compilers had appeared for several of the computers of the time, including UNIVAC and the Elliott machines (Tony Hoare wrote the Elliott compiler). In the second half of the 60s, implementations of ALGOL also appeared on computers from the Soviet bloc. From 1965, you could program in ALGOL on the MINSK computers, and from 1967 on the Polish computer ZAM.
The biggest problem with ALGOL was the lack of support. FORTRAN, developed by IBM, was not as advanced, but it had full support from the parent company. IBM themselves wrote reference implementations for subsequent platforms, making it seem more suitable for commercial applications—it was standardized.
This is how Robert Bemer spoke about the new language in 1961:
No reasonable mechanism for maintenance seems to exist. No one seems to be able to answer the basic question, ”What is ALGOL?” I foresee a difficult time for ALGOL unless a revised maintenance procedure is devised. The language is too general and has too many ambiguities.
Bemer was right in that the report indeed had several omissions and lacked a mechanism for correcting them. Ultimately, neither the creators nor those interested in ALGOL established either a support mechanism or a development mechanism, leading to the fiasco of ALGOL 68, which was rejected by the community.
While ALGOL did not achieve spectacular commercial success, scientists received it very enthusiastically. Hence, it became the main language for describing algorithms for the next 20-30 years. Its clean and expressive syntax facilitated communication among people.
The lack of reference implementations, however, made using the language for human-machine communication difficult. Since the computer industry in America was much more developed than in Europe, practical considerations were paramount there. The issue of support and compatibility was crucial. In Europe, however, computer science was almost exclusively the domain of research centers. Scientists had different priorities, and the new language was very well-suited for researching algorithms. Thus, ALGOL was much better received on the Old Continent.
What is the legacy of ALGOL?
The impact of ALGOL on later languages was huge. It inspired most imperative programming languages. Among the more significant ones are Pascal, Simula (the first object-oriented language), and C.
Essentially all subsequent programming languages include blocks and recursion. Work on ALGOL 68 also brought about case expressions and the concept of records along with references, as well as pass-by-reference.
ALGOL 60 was the first programming language to be defined in BNF, or Backus-Naur form. It's basically about building a language from already defined concepts—with minimal references to natural language.
The general form is:
<metalinguistic variable> ::= <form>
Simple example:
<digit> ::= 0 | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9
More complex example:
<if clause> ::= if <Boolean expression> then
<unconditional statement> ::= <basic statement> | <compound statement> | <block>
<if statement> ::= <if clause> <unconditional statement>
<conditional statement> ::= <if statement> | <if statement> else <statement> | <if clause> <for statement> | <label>: <conditional statement>
The first example doesn't look too impressive, but the second gives a better understanding of the possibilities. Thanks to this notation, one can effectively describe a programming language without external context. Since then, this notation and its developments have become the standard for describing programming languages. You can see it everywhere, for example in Python or JavaScript.
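One reason the notation proved so effective is that each BNF rule maps almost mechanically onto a parsing function. Here is a small Python sketch for a hypothetical grammar built from the article's `<digit>` rule (`<integer> ::= <digit> | <integer> <digit>`); as recursive-descent parsers usually do, it handles the left recursion with a loop.

```python
def parse_integer(s, pos=0):
    """Return (value, next_pos) for the longest <integer> starting at pos."""
    if pos >= len(s) or not s[pos].isdigit():
        raise SyntaxError(f"expected <digit> at position {pos}")
    value = 0
    while pos < len(s) and s[pos].isdigit():  # each step consumes one <digit>
        value = value * 10 + int(s[pos])
        pos += 1
    return value, pos

print(parse_integer("2024x"))  # (2024, 4)
```

The grammar rule tells you exactly what the function must accept, which is why a machine-readable language definition was such a leap over prose descriptions.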
ALGOL became a catalyst in the development of compilers. The first compilers were simple translators of formulas into machine code, but ALGOL required something more: procedures, variable-sized arrays, and recursion didn't fit into the sequential translation technique. To implement them correctly and efficiently, new, much more advanced compilation techniques had to be developed.
All this means that although it has completely fallen out of use, ALGOL was an extraordinary language. The effort put into its development and implementation elevated computer science to another level. One can imagine that development was inevitable. However, it seems to me that thanks to the quality of ALGOL, it was somewhat faster. Many concepts could be transferred painlessly to other programming languages or developed further—and that's what we benefit from today.

Source: EVOLUTION OF COMPUTER SOFTWARE (Technical Report), M. B. Wells et al., U.S. Department of Energy Office of Scientific and Technical Information, 1971. https://www.osti.gov/biblio/4089650

Source: Programming History: The Influence of Algol on Modern Programming Languages (Part 2), Mike McMillan, Medium, 2020-05-05. https://medium.com/swlh/programming-history-the-influence-of-algol-on-modern-programming-languages-part-2-49ae6d30c8c0

In an earlier article I presented a brief history of high-level, algebraic programming languages starting with Fortran 58 and 60, leading up to the initial development of the Algol language in the years 1958 through 1960. In this article I’m going to discuss several of the technical features of Algol that influenced subsequent high-level programming languages such as BCPL and ultimately C.
Data Types
High-level programming languages before Algol did not have data types. Variables simply sprang into existence when a value, either an integer or a real number, was assigned to them. Fortran did have a bit of data typing in that integer variables were those whose names began with the letters I, J, K, L, M, and N; any other variable was a real variable.
Algol began the practice of declaring a variable's type as part of its declaration. This allowed the compiler to be more efficient, since it knew what kind of number would be stored in a variable before any value was assigned to it.
However, as I discovered reading through a history of Algol, there wasn't a real theory of data types built into the language until Algol 68.
Here is an example of how data types are used in Algol 60. This program fragment reads some values from a data card(!):

Source: Concepts of Programming Language: Chapter 2, blogbusterdotnet (WordPress), 2013-03-10. https://blogbusterdotnet.wordpress.com/2013/03/10/concepts-of-programming-language-review-part2-answers/

1. In what year was Plankalkul designed? In what year was the design
published?
Plankalkul was designed by Konrad Zuse between 1942 & 1945. He first published a paper on it in 1948. More information on the language was published in 1972.
3. What does Plankalkul mean?
Plankalkul means program calculus.
4. Speedcoding was invented to overcome two significant shortcomings of
the computer hardware of the early 1950s. What were they?
The lack of floating-point hardware and the lack of index registers; Speedcoding supplied pseudo-instructions for floating-point arithmetic and automatically incremented address registers.
5. What is the number of bits in a single word of the UNIVAC I’s memory?
How are the bits grouped?
The words of the UNIVAC I's memory had 72 bits, grouped as 12 six-bit bytes.
7. Who developed the Speedcoding system for the IBM 701?
The language was developed by John Backus in 1953 for the IBM 701 to support computation with floating point numbers.
8. Who developed Short Code? Why is Short Code called automatic programming?
Short Code was proposed by John Mauchly in 1949 and originally known as Brief Code. William Schmitt implemented a version of Brief Code in 1949 for the BINAC computer, though it was never debugged and tested. It was called automatic programming because its statements were executed by an interpreter rather than hand-translated into machine code.
9. Under what environmental consideration was Fortran developed? Which was the first version of Fortran?
Fortran was developed by IBM in the mid-1950s for scientific and engineering applications, on machines with small memories that were slow and unreliable, so efficiency of the compiled code was the overriding concern. The first version was Fortran 0.
10. What was the most significant feature added to Fortran I to get Fortran II?
The most significant feature was the independent compilation of subroutines.
11. What control flow statements were added to Fortran II to get Fortran 77?
Character string handling, logical loop control statements, and an “if” with an optional “else” clause.
12. Which version of Fortran was the first to have any sort of dynamic
variables?
Fortran 90.
13. Which version of Fortran was the first to have character string handling?
Fortran 77.
14. Why were linguists interested in artificial intelligence in the late 1950s?
Linguists were concerned with natural language processing. (Psychologists were interested in modeling human information storage and retrieval, and mathematicians were interested in mechanizing certain intelligent processes, such as theorem proving.)
15. What are the different data types and structures in Common LISP?
Common LISP has a large number of data types and structures, including records, arrays, complex numbers, and character strings. It also has a form of packages for modularizing collections of functions and data, providing access control.
16. In what way are scheme and Common LISP opposites of each other?
In size, complexity, and scoping: Scheme is small and simple and uses static scoping only, while Common LISP is large and complex and supports both static and dynamic scoping.
18. What two professional organizations together designed ALGOL 60?
GAMM (the German society for applied mathematics and mechanics) and the ACM (Association for Computing Machinery).
19. What was the goal for developing C?
C has adequate control statements and data-structuring facilities to allow its use in many application areas. It also has a rich set of operators that provide a high degree of expressiveness.
20. What were the significant modifications to ALGOL 58 to produce ALGOL
60?
The concept of block structure was introduced, two different means of passing parameters to subprograms were allowed, procedures were allowed to be recursive, stack-dynamic arrays were allowed.
21. What language was designed to describe the syntax of ALGOL 60?
BNF (Backus-Naur Form).
22. On what language was COBOL based?
FLOW-MATIC.
23. In what year did the COBOL design process begin?
1959.
26. Which data type does the original BASIC language support?
Floating-point.
27. Why was BASIC an important language in the early 1980s?
Because BASIC could be implemented on the cheap microcomputers of the period, whose memories were very small.
28. PL/I was designed to replace what two languages?
PL/I was designed to replace Fortran and COBOL.
33. What language introduced the “case” statement?
ALGOL-W.
PROBLEM SET
1. What features of Fortran IV do you think would have had the greatest
influence on Java if the Java designer had been familiar with Fortran?
The logical loop control statements, and an “if” with an optional “else” clause.
3. Write a short history of the Fortran 0, Fortran I, Fortran II, Fortran IV
system.
Fortran 0 was modified during the implementation period, which began in January 1955 and continued until the release of the compiler in April 1957. The implemented language, which we call Fortran I, is described in the first Fortran Programmer’s Reference Manual.
Fortran I included input/output formatting, variable names of up to 6 characters (it had been just 2 in Fortran 0), user-defined subroutines (although they could not be separately compiled), the If selection statement, and the Do loop statement.
Fortran II's compiler was distributed in the spring of 1958. It fixed many of the bugs in the Fortran I compilation system and added some significant features to the language, the most important being the independent compilation of subroutines.
Fortran III was developed, but it was never widely distributed.
Fortran IV, however, became one of the most widely used programming languages of its time. It evolved over the period 1960 to 1962 and was standardized as Fortran 66, although that name was rarely used.
6. Make an educated guess as to the most common syntax error in C programs.
The most common syntax error in C programs is probably a missing semicolon at the end of a statement; others include a missing ‘&’ when using scanf, a missing header include, and calling a function with the wrong arguments.
9. Why, in your opinion, did Fortran allow names that began with I, J, K, L, M
and N as implicitly integer type?
The choice of the letters for this convention was based on the fact that at that time scientists and engineers used letters as variables subscripts, usually i, j, and k, and then Fortran’s designers threw in the three additional letters.
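The convention is simple enough to state as a one-line rule; a sketch of mine (the function name is hypothetical, not from the answer key):

```python
# Fortran's implicit typing rule: a variable whose name begins with
# I, J, K, L, M, or N is INTEGER; every other variable is REAL.

def implicit_fortran_type(name):
    """Return the implicit Fortran type of an undeclared variable name."""
    return "INTEGER" if name[0].upper() in "IJKLMN" else "REAL"

assert implicit_fortran_type("index") == "INTEGER"
assert implicit_fortran_type("x") == "REAL"
```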
13. What is the primary reason why C became more widely used than
Fortran?
The reasons why C is more popular than Fortran: C compilers are universally available for all the main architectures in use, and good free compilers exist. C compilers often come free with machines, while efficient Fortran 90 compilers must be purchased and are often expensive.
C is very broad in scope and is more powerful than Fortran 90 in some areas, such as pointers and the manipulation of character strings. Coding experience acquired with C can be used directly outside the scientific world: C is very common in the commercial world.

Source: Compiler Compiler 1966 (Early Translator Writing Systems), Chilton Computing, ACL Applications. http://www.chiltoncomputing.org.uk/acl/applications/cc/p007.htm

Early Translator Writing Systems
Contents
1. Introduction
1.1 Early Autcodes
1.2 Fortran
1.3 Algol 58
1.4 Some Terminology
2. Early Translator Writing Systems
2.1 Syntax machine
2.2 PSYCO
2.3 BMCC
3. Second Generation Translator Writing Systems
3.1 COGENT
3.2 CGS
3.3 TMG
3.4 Meta II
4. Third Generation Translator Writing Systems
4.1 FSL
4.2 TGS
4.3 Tree-Meta
5. The Demise of Translator Writing Systems
6. References
1. Introduction
1.1 Early Autocodes
Prior to 1950, most computer programs were written in assembly code; that is, a low-level programming language where a one-to-one relationship exists between instructions in the language and the computer's machine code instructions. By 1950, people were considering whether the machine itself could generate the assembly code, rather than having humans write it.
In 1950, William Schmitt implemented Short Code for the Univac 1. In 1952, Alick Glennie produced an autocode programming language for the Manchester Mark 1 computer (the compiler was 750 instructions). Laning and Zierler produced an autocode for the Whirlwind at about the same time. (Knuth lists 20 autocodes that predate Fortran and states that Glennie's autocode was the first one completed.)
1.2 Fortran
Work on a more ambitious autocode for the IBM 704 started around 1952. The proposal for Fortran I went to IBM Management in 1953, was specified by 1954 and compiler produced by 1957. The major deficiency of Fortran I was the absence of user-written subroutines and functions. These were added in Fortran II in 1958.
The need for a system to automatically write compilers had not really appeared so far as many computers still ran programs in assembler language or a single autocode. Implementing a single language on many different computers or implementing many languages on a single computer were still not major requirements.
1.3 Algol 58
By 1958, there was concern that programming languages would proliferate. There were already signs of this. In both the USA and Europe there was support for defining a single programming language. A meeting was held in 1958 in Zurich which defined the language called IAL (International Algebraic Language) in the USA and Algol 58 in Europe. This was followed quite closely by Algol 60.
In the USA, languages such as JOVIAL, MAD and NELIAC were defined as subsets of IAL. In Europe, there was enthusiasm for Algol 60, which had a number of unfortunate features:
No reserved words. How an implementation defined goto was up to it. Thus, allez, go to, goto and many other variants appeared.
No input/output statements. This ensured that every single Algol 60 implementation was different. In the UK, there were three quite different input/output systems on the Elliott, English Electric and Ferranti computers.
Features that were difficult or impossible to implement on a range of computers, due to either size or speed.
The main result of the attempt at a single universal programming language was a proliferation of languages and dialects that meant it was impossible to implement the compilers needed by a specific customer without some automation of the production of compilers.
There was a need for a lexical analysis subsystem to resolve the problems associated with the variety of dialects and languages. It was also needed to cover up the deficiencies in the way some languages had been defined, notably Fortran.
For Fortran, syntax analysis is relatively straightforward once the lexical problems are resolved. Algol is significantly different. In the Algol 60 Report the language is defined in BNF, a method for defining legal programs in the language. It also had statements in English that also described the language. The block structure facilities in Algol 60 made it more difficult to do the syntax analysis.
The semantic description of Algol 60 is also quite difficult. Features were defined that were difficult to compile. Ambiguities existed. Meanwhile the architectures of computers were changing, with different storage systems, radically different instruction codes, and speed based on pipelining specific operations. In consequence, generating efficient code across a range of computers was also becoming difficult.
The need for some automation in the production of compilers was needed.
1.4 Some Terminology
In describing translator writing systems, it is useful to define a set of terms that can be used irrespective of the system being described.
Backus-Naur Form (BNF)
A set of rewrite rules that define a programming language. A rule has the form:
<address> ::= <name> , <street> , <zip>
The non-terminal address can be replaced by a non-terminal name followed by a comma followed by a non-terminal street followed by a comma followed by a non-terminal zip. Rewrite rules are applied starting from a single non-terminal, program say, until a string is obtained that contains no non-terminal symbols. This is a program in the language.
BNF language
A language defined using BNF
parse tree
Given a program in the BNF language, it is possible to describe all the productions applied to create the program as a tree starting from the non-terminal program. This is called a parse tree.
syntax-directed compilation
Any system that can take the definition of a BNF language and generates a parse tree for an arbitrary program in that language.
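To make the terminology concrete, a sketch of mine (not part of the Chilton page): hold the rewrite rules in a table, expand from a start non-terminal, and record the expansions as a parse tree whose leaves spell out the derived string. The grammar entries below are toy values invented for illustration.

```python
# Rewrite rules for the <address> example; each non-terminal maps to a
# list of alternative bodies (this toy grammar has one body each).
grammar = {
    "address": [["name", ",", "street", ",", "zip"]],
    "name":    [["Ada Lovelace"]],        # toy terminal expansions
    "street":  [["12 Analytical Row"]],
    "zip":     [["00101"]],
}

def derive(symbol):
    """Expand a symbol into a parse tree: terminals are leaves,
    non-terminals become (symbol, children) nodes."""
    if symbol not in grammar:
        return symbol
    body = grammar[symbol][0]
    return (symbol, [derive(s) for s in body])

def leaves(node):
    """Read the derived string back off the frontier of the tree."""
    if isinstance(node, str):
        return [node]
    return [w for child in node[1] for w in leaves(child)]

tree = derive("address")
assert leaves(tree) == ["Ada Lovelace", ",", "12 Analytical Row", ",", "00101"]
```

A syntax-directed compiler runs this process in reverse: it starts from the string and reconstructs the tree.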
[Figure: family tree of early translator writing systems]
Manchester Mark 1 Autocodes: Alick Glennie, Tony Brooker, 1954
Syntax Machine: A. E. Glennie, 1960
PSYCO (Princeton Syntax Compiler): E. T. Irons, 1960
Brooker-Morris Compiler-Compiler, 1960
SIG/PLAN Working Group 1 on Syntax Driven Compilers, 1962
Howard Metcalf, Fall 1962
Val Schorre, Meta I: Jan 1963
Lee Schmidt, March 1963
Val Schorre et al, Meta II: Spring 1963
Schneider and Johnson, Meta III: 1964
J. F. Rulifson, SRI Meta III: 1964
Book, LISP-META: 1965
Oppenheim and Haggerty, Meta V: 1966
O'Neil, Meta PI: 1968
J. F. Rulifson, Tree-Meta: 1968
J. A. Feldman, FSL: 1962/3
L. F. Mondshein, VITAL: 1967
CMU, Cabal: 1967
W. Wulf, CMU PQCC: 1980
Computer Associates, CGS: 1962
Computer Associates, TGS: 1964
J. C. Reynolds, COGENT: 1962
McClure, TMG: 1963
S. C. Johnson, YACC: 1970
2. Early Translator Writing Systems
In 1960, several systems appeared aimed at easing the task of producing a compiler. They went under a variety of names, such as translator writing system, compiler-compiler and meta-compiler. As such a system is capable of generating a compiler, it may be possible for it to generate itself. Although not a requirement, this adds a certain elegance to the system and makes enhancement of the system relatively straightforward. A base system can be defined and this can be used to define a more powerful system and so on. Three of the earliest systems were:
Alick Glennie's Syntax Machine, 1960
Ned Irons PSYCO compiler, 1960
Brooker and Morris's Compiler-Compiler, 1960
The three activities were reasonably independent and differed in their approach.
The following functions are required explicitly or implicitly in most compilers:
Lexical Analysis: to recognise the basic symbols in the language
Syntax Analysis: a method for checking the correctness of a program in the language and providing a parse that illustrates the structure.
Semantic Analysis: production of a pseudo code that defines the meaning of the program. Optimisations appropriate to the target machine and code generation for that target machine may also be required.
2.1 Syntax Machine
The Syntax Machine defines a top-down parser for the language to be compiled having possible rules of the form:
<A> ::= <B> <C> <D> ... <X>
<A> ::= <B> | <C> | <D> | ... | <X>
<A> ::= <B> <C> <D> ... {<X>}
The first parses an A as a B followed by a C, etc. The second parses an A as a B or a C or a D, etc. The third parses an A as a B followed by a C, ..., followed by one or more X. Having this iteration saves on recursion.
It is up to the person writing the translator for a specific language to ensure that no backup is required. However, the rules described above are converted into a pseudo flow chart that allows variations on this basic parsing. The Syntax Machine defines a machine-independent instruction set which defines the semantics of the program. It is possible to optimise the pseudo code before transforming it into the actual order code of a specific computer. The whole system was simulated on an IBM 650 and used to generate compiled code for both the IBM 704 and the IBM 709. Many of the ideas that appear in the Syntax Machine were used in the S1, S2 and S3 Fortran compilers developed at AWRE Aldermaston for the IBM 7030 (Stretch) and later ported to the Ferranti Atlas II. However, by then, an operator precedence parser was used for syntax analysis.
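The three rule forms map directly onto sequence, alternative and iteration combinators; a minimal Python sketch of mine (not Glennie's actual machinery), where a parser is a function from a position to a new position or None:

```python
# Parsers are functions (string, pos) -> new pos or None. No backup is
# attempted, matching the Syntax Machine's requirement on the grammar writer.

def lit(ch):                      # a terminal symbol
    return lambda s, i: i + 1 if i < len(s) and s[i] == ch else None

def seq(*ps):                     # <A> ::= <B> <C> ... <X>
    def p(s, i):
        for q in ps:
            i = q(s, i)
            if i is None:
                return None
        return i
    return p

def alt(*ps):                     # <A> ::= <B> | <C> | ... | <X>
    def p(s, i):
        for q in ps:
            j = q(s, i)
            if j is not None:
                return j
        return None
    return p

def rep1(q):                      # trailing {<X>}: one or more X, iteratively
    def p(s, i):
        i = q(s, i)
        if i is None:
            return None
        while (j := q(s, i)) is not None:
            i = j
        return i
    return p

# <A> ::= 'b' 'c' {'x'}
a = seq(lit("b"), lit("c"), rep1(lit("x")))
assert a("bcxxx", 0) == 5
assert a("bc", 0) is None
```

The iterative `rep1` is exactly the "iteration saves on recursion" point made above.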
2.2 PSYCO
Irons started his work on a syntax-directed compiler in 1959 while at Hanover for the summer.
PSYCO uses a bounded-context bottom-up parser with backup. That is, at any stage in the recognition of a program, a finite set of previously recognised non-terminals and the next finite set of terminal symbols will always define the action that the parser needs to take. PSYCO was mainly aimed at writing translators for Algol-like languages. The system started with a symbol table containing basic information, and a parse of the Algol-like program collected declarative information about the variables used in the program. A second pass used this information to compile the executable part of the program.
Irons also recognised the need for handling errors in programs presented to the compiler.
2.3 Compiler-Compiler
The Brooker-Morris Compiler-Compiler (BMCC) has a different notation from either Glennie's or Irons's. The parser is a top-down recursive-descent parser with statements of the form:
FORMAT [SS] = [V] = [SAE]
It indicates that a source statement (SS) in the languge consists of a variable (V) followed by '=' followed by a signed arithmetic expression (SAE).
Associated with each FORMAT statement is a ROUTINE that defines the semantics of the statement:
ROUTINE [SS] = [V] = [SAE]
   LET [SAE] = [±?][T][±T*?]
   Ac = [±?][T]
   ....
The appropriate ROUTINE statement is passed the parse tree of the statement recognised. It can inspect the parse tree and decide what code should be generated.
The Compiler-Compiler effectively adds the definition of the language being defined to that of the Compiler-Compiler itself. Before releasing the final compiler, it is usual to remove the Compiler-Compiler's own statements from that of the language being compiled.
The Compiler-Compiler was used to produce Fortran II, Fortran IV, Algol, Extended Mercury Autocode, Atlas Autocode, SOL, ACL, CPL and Elliott Autocode Mk III compilers for the Ferranti Atlas computer.
3. Second Generation Translator Writing Systems
The early translator writing systems influenced a number of systems that followed soon after. We shall look at three of these and one other, TMG, that also comes later but appears to have not been influenced by the earlier systems.
3.1 COGENT
John Reynolds' COGENT system was developed at the Argonne National Laboratory around 1962. COGENT stands for COmpiler and GENeralised Translator. The objective was to unify the concept of syntax-directed compilation with the more general but primitive concept of recursive list-processing. It is similar to BMCC in that it has two main constructs, productions and generator definitions, which correspond somewhat to FORMAT and ROUTINE in BMCC.
An example of a set of productions for a simple language is:
(LETTER) = A,B,C,D,E.
(STRING) = (LETTER),(STRING)(LETTER).
(VARIABLE) = (STRING).
(FACTOR) = (VARIABLE).
(FACTOR) = (()(POLYNOMIAL)()).
(TERM) = (FACTOR).
(TERM) = (TERM)*(FACTOR).
(POLYNOMIAL) = (TERM),+(TERM),-(TERM).
(POLYNOMIAL) = (POLYNOMIAL)+(TERM)
(POLYNOMIAL) = (POLYNOMIAL)-(TERM)
As in BMCC, several productions can be written as a single production:
(TERM) = (FACTOR),(TERM)*(FACTOR)
COGENT uses the production statements to construct a parser, similar to the Irons parser, that takes the input and generates a parse tree. By giving each recognised production a unique internal number, the parse tree can be very compactly represented as a list structure. This list structure can be manipulated by the generator definitions and eventually output the compiled code. COGENT was implemented on the Control Data 3600. A novelty of the implementation was that backup spawned a set of processes that performed the various alternatives in parallel.
3.2 CGS
Computer Associates of Massachusetts was in the business of writing compilers. Compiler Generator System (CGS) was their first attempt at a general purpose translator writing system aimed at producing efficient object code with optimisation provided at several places in the system. The basic approach was a table-driven top down syntax analyser. It ran on an IBM 7090, a Burroughs D-825, and a CDC 1604.
The generated compiler had five phases:
Syntactic Analyzer: converts input string into a tree-representation of its syntax
Generator: transforms the tree into a sequence of macro-instructions
Optimizer: recognises and eliminates redundant computation (eg common subexpressions, invariant computation out of loops, etc)
Code Selector: assembles code fragments
Assembler: binds the code fragments in form required by the compiler environment.
The Syntactic Analyzer, Generator, and Code Selector are driven by tables defined for a specific language and machine code using a BNF-like definition of each. The Syntactic Analyzer is used to generate the relevant tables.
3.3 TMG
TMG (the name stands for TransMoGrifier) was developed for the PDP-7 by R. M. McClure at Texas Instruments in 1965 and appears not to have been influenced by the earlier systems. The system was ported to the IBM 7040, IBM 7090, and CDC 1604. It was used at Bell Labs and also by the Multics Project. At Bell Labs, Dennis Ritchie used TMG to produce the compiler for B that later became C. McIlroy and Morris used TMG to write the EPL compiler for Multics. As an aside, Stephen Johnson named his system YACC (Yet Another Compiler-Compiler) because they already had TMG at Bell Labs! TMG was used to implement a subset of PL/I by Cal Tech.
TMG was aimed at constructing simple one-pass translators for some specialized language with almost no optimisation of code generated. It was defined in a language TMGL with a compiler for TMGL on the target machine. Later, the system was defined in itself.
The system is somewhat similar to the Glennie Syntax Machine in that the syntax specification of a program consists of sets of statements with a label, a syntactic specification, and a semantic action that takes place if the syntactic entity is recognised. The syntactic part consists of a set of actions with possibly two exit labels which give where to go next in the case of the syntactic entity being recognised or not. If the syntactic part is recognised, the semantic part is executed.
In the first 18 months TMG was used to define two different FORTRAN compilers, a logic simulation system, several data format converters, a geometric description language translator, and the TMGL compiler itself.
The system is not really aimed at handling structured languages like Algol 60.
3.4 Meta II
The SIG/PLAN Working Group 1 on Syntax Directed Compilers met monthly in the conference room at the UCLA computing facility. John Backus attended one meeting to see how BNF was being used as a compiler writing language. Several separate attempts at defining a translator writing system came out of that working group, and the ideas came together first as a system called META and finally as META II.
META II was a simple translator writing system written by V. Schorre et al. in 1963.
In META II, the language for which a translator is required must be specified as a set of syntax rules. These very much resemble the BNF notation except that they contain code generation commands as part of the syntax definition. There is limited memory in META II so there is a need to output information soon after it has been recognised.
A restrictive top-down method of syntax analysis is used in the META II system. The restriction is that it does not allow back tracking or left recursion. The main goal is to match the complete program with the Syntax Rule whose name follows .META at the head of the program. The name of the Syntax Rule appears at the start of the rule followed by the = symbol. The right-hand side of this Syntax Rule indicates which entities have to be recognised to achieve this goal. The recogniser achieves its main goal by looking for these smaller entities, from left to right, as individual subgoals. These subgoals are themselves defined in terms of other entities, and so the process of recognising a complete program becomes a matter of looking for progressively smaller entities, right down to the level at which basic symbols such as characters and numbers are recognised.
A typical Meta II rule would be:
SET = 'C' $ ( 'D' / 'E' )
This recognises any string consisting of the letter C followed by any sequence involving the letters D and E.
Meta II rules can include .OUT statements that generate output code. Recognised basic symbols (.ID, .STRING, .NUMBER) are stored and can be output by .OUT(*). Also two unique labels can be output for each statement by writing *1 or *2.
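As a hedged illustration (mine, not META II output), the SET rule above transliterates into a few lines of Python, reading `$` as zero-or-more repetition and `/` as ordered choice:

```python
# The rule  SET = 'C' $ ( 'D' / 'E' )  transliterated: the letter C
# followed by zero or more occurrences of D or E.

def parse_set(s):
    """Return True if the whole string s is derivable from SET."""
    if not s.startswith("C"):
        return False
    i = 1
    while i < len(s) and s[i] in "DE":   # $ ( 'D' / 'E' )
        i += 1
    return i == len(s)                    # all input must be consumed

assert parse_set("CDEDDE")
assert parse_set("C")
assert not parse_set("CDX")
```

A real META II compiler would interleave `.OUT` actions with this recognition, emitting code as soon as each construct is matched.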
Meta II is a very simple system that can be defined in itself. This is the complete definition:
.SYNTAX PROGRAM

OUT1 = '*1' .OUT('GN1') / '*2' .OUT('GN2') / '*' .OUT('CI') / .STRING .OUT('CL ' *) .,
OUTPUT = ('.OUT' '(' $OUT1 ')' / '.LABEL' .OUT('LB') OUT1) .OUT('OUT') .,
EX3 = .ID .OUT('CLL' *) / .STRING .OUT('TST' *) / '.ID' .OUT('ID') / '.NUMBER' .OUT('NUM') /
      '.STRING' .OUT('SR') / '(' EX1 ')' / '.EMPTY' .OUT('SET') /
      '$' .LABEL *1 EX3 .OUT('BT ' *1) .OUT('SET') .,
EX2 = (EX3 .OUT('BF ' *1) / OUTPUT) $(EX3 .OUT('BE') / OUTPUT) .LABEL *1 .,
EX1 = EX2 $('/' .OUT('BT ' *1) EX2) .LABEL *1 .,
ST = .ID .LABEL * '=' EX1 '.,' .OUT('R') .,
PROGRAM = '.SYNTAX' .ID .OUT('ADR' *) $ ST '.END' .OUT('END') .,

.END
4. Third Generation Translator Writing Systems
After the initial promise of early translator writing systems, there was considerable interest in producing better systems. The early systems had shown that the approach was viable. The understanding of the whole area of parsing and what was possible had improved. A theory of syntax analysis was evolving. The number of new languages arriving was still on the increase. Computer systems were evolving.
4.1 FSL
In the period 1963-67, Bob Floyd from Computer Associates had moved to Carnegie-Mellon University. Jay Earley, a doctoral student, was working on a parsing algorithm, the most efficient general context-free algorithm known. Alan Perlis et al were defining an enhanced version of Algol, Formula Algol, which would incorporate the manipulation of mathematical formulae within the language Algol. There was a need for a Formula Algol compiler. Jerome Feldman had completed his thesis, A Formal Semantics for Computer Oriented Languages, in 1964. This defined a compiler-compiler, FSL, that was capable of translating into machine language most of the existing programming languages. It was, therefore, used to produce the Formula Algol compiler for the Bendix G-21. FSL had two main subsections: a Syntax Loader, which builds tables that control the recognition and parsing of programs in the required language, and a Semantic Loader, which builds a table describing the semantics of the instructions in the required language. The Syntax Loader generated a Production Language (PL), based on the work of Bob Floyd, that scanned the current input stream against the PL rules and decided on the action to take. The Semantic Loader defined the meaning of each language statement in a machine-independent code.
FSL was a major input to the CABAL compiler-compiler system produced at Carnegie-Mellon University from 1967 onwards.
4.2 TGS
TGS (sometimes called TRANGEN), the Translator Generator System, was produced at Computer Associates as a follow-on to CGS. It ran on an IBM 7094, CDC 1604, Univac M460 and GE-635 and was used to write translators for PL/I, ALGOL, and FORTRAN IV. Each compiler is driven by a set of tables, TRANTAB, described in a language called TRANDIR. Syntax analysis was done by a system based on Floyd productions. The code generation system was also used in later versions of the META systems.
4.3 Tree-Meta
After a number of intermediate systems, Rulifson defined Tree-Meta as a major advance over the earlier META systems. The TREE-META program is in two parts. The first part is a definition of the syntax for the source language, L, for which a translator is required and this is largely inherited from Meta II. The major difference is that instead of outputting code, the output is a tree built up as the syntax proceeds. The second part defines a set of code generation rules that generate the object code to be produced from this tree. The tree itself is quite flexible in format and is defined by the TREE-META user as he defines the syntax analyser for the language. The code generator can take note of any peculiarities in the form of the tree, if that will lead to better code being produced.
An example of the syntax recogniser is:

.META EXPRESSION
EXPRESSION = TERM $ ( '+' TERM :ADD[2] / '-' TERM :SUB[2] ) ;
TERM = FACTOR $ ( '*' FACTOR :MULT[2] / '/' FACTOR :DIV[2] ) ;
FACTOR = '+' PRIMARY / '-' PRIMARY :MIN[1] / PRIMARY ;
PRIMARY = .ID / .NUM / '(' EXPRESSION ')' ;
.END
The construct :ADD[2] generates an ADD tree node from the top two items on the stack. The corresponding code generator instruction would be of the form:
ADD[-,-] => *1 ' PLUS ' *2 ;
Separating code generation from the building of the parse tree means that quite efficient code can be generated.
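The separation Tree-Meta enforces (build the tree during parsing, then drive code generation from a table of per-node rules) can be sketched in modern terms. The following is a minimal illustration in Python, not Tree-Meta itself; the node names ADD/SUB and the PLUS output mirror the example rules above, while everything else is invented for illustration.

```python
# Minimal sketch of the Tree-Meta idea: parsing builds a tree,
# and a separate table of per-node rules generates output from it.

def parse_expression(tokens):
    """EXPRESSION = TERM $ ( '+' TERM :ADD[2] / '-' TERM :SUB[2] )"""
    node, rest = parse_term(tokens)
    while rest and rest[0] in ('+', '-'):
        op, rest2 = rest[0], rest[1:]
        right, rest = parse_term(rest2)
        node = ('ADD' if op == '+' else 'SUB', node, right)
    return node, rest

def parse_term(tokens):
    # For brevity, a "term" here is just an identifier or a number.
    return tokens[0], tokens[1:]

# Code-generation rules, one per node kind -- analogous to
# Tree-Meta's   ADD[-,-] => *1 ' PLUS ' *2 ;
RULES = {
    'ADD': lambda a, b: f"{gen(a)} PLUS {gen(b)}",
    'SUB': lambda a, b: f"{gen(a)} MINUS {gen(b)}",
}

def gen(node):
    if isinstance(node, tuple):
        kind, left, right = node
        return RULES[kind](left, right)
    return node  # leaf: identifier or number

tree, _ = parse_expression(['A', '+', 'B', '-', 'C'])
print(gen(tree))  # A PLUS B MINUS C
```

Because the rules are looked up by node kind, improving the generated code for one construct means editing a single table entry, which is exactly the kind of flexibility the TREE-META user was given.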
5. The Demise of Translator Writing Systems
Interest in Translator Writing Systems continued after the mid-1960s, but interest in producing a universal system capable of generating production-quality compilers decreased. This occurred for a number of reasons:
The number of new languages decreased and the requirements for new languages that did appear tended to be significantly different from the previous generation.
Phases of the compilation process were specified in independent modules like LEX and YACC so that a compiler writer could select a relevant set of modules to put together to create a specific compiler.
The speed of computers increased so that the need for the fastest compilation possible and the most efficient object code decreased.
Let’s take a deep dive in time and look at how computer science & programming languages began. Follow us as we go through the ages and look at the timeline, major personalities, and events that through their ingenuity paved the road to our current Technological and Digital Era.
The history of programming
Want to see the father of computers and the first ever programmer as they struggled in the 1800s, or check out how a 17th-century philosopher helped us develop the binary code that helps YOU read these words on your screen? Well, then you absolutely should check out our earlier entry in the series.
And if you’ve just finished that one, let's do a recap of where we left off.
Fortran code is said to be 20 times shorter than its analogue in handwritten assembly code.
The community was doubtful of it at the time due to performance concerns, but since programmers could write more code, quicker, it was an easy choice from the economic viewpoint.
This is how the code interface looked in Fortran back then:

It wasn't like entering everything into a computer and having it appear on a screen; there were no screens... or even a keyboard.
Computer Program from 1954
Like Mr. Jacquard more than a hundred years ago, they had to encode it on punched cards.
The only difference is that we're talking about an IBM mainframe, and you couldn't touch the IBM mainframe. You had people who did that.
You write your program on that paper with a pen:
An improvement on the Fortran language
Since we're talking about Mr. Backus, let's jump to 1958, when Algol was created by a committee of European and American computer scientists.
Algol (also known as Algorithmic Language) is a high-level imperative computer programming language that introduced a wealth of features that would later influence and be used in the creation of Pascal, C#, C++ and Java, among others.
As a matter of fact, most languages nowadays have syntaxes inspired by Algol, and it's considered amongst the most influential programming languages ever. It's said that Algol was an attempt to improve on some intrinsic problems met with Fortran, and it's widely considered an "evolution" of sorts. While we're on the subject, John Backus, alongside Peter Naur, developed the Backus-Naur Form for describing programming languages, based on ALGOL 58-60. So, you could consider Backus one of the original fathers of many languages we use today. Here's how "Hello, World!" looks in Algol-60:

BEGIN
DISPLAY ("Hello, world!")
END

This is starting to look quite reasonable.
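To see what the Backus-Naur Form buys you in practice, consider how directly a BNF grammar maps onto a parser. The grammar below is an invented toy (not anything from the ALGOL reports), and the Python recognizer is just a sketch: each production becomes one function.

```python
# Toy BNF grammar (invented for illustration):
#   <expr>  ::= <term> | <expr> "+" <term>
#   <term>  ::= <digit> | "(" <expr> ")"
#   <digit> ::= "0" | "1" | ... | "9"
# Each rule becomes one function of a recursive-descent recognizer.

def expr(s, i=0):
    """Consume an <expr> starting at index i; return index after it."""
    i = term(s, i)
    while i < len(s) and s[i] == '+':
        i = term(s, i + 1)
    return i

def term(s, i):
    """Consume a <term>: a single digit or a parenthesized <expr>."""
    if i < len(s) and s[i] == '(':
        i = expr(s, i + 1)
        if i >= len(s) or s[i] != ')':
            raise SyntaxError('expected )')
        return i + 1
    if i < len(s) and s[i].isdigit():
        return i + 1
    raise SyntaxError(f'unexpected input at {i}')

def valid(s):
    """True if the whole string matches the <expr> rule."""
    try:
        return expr(s) == len(s)
    except SyntaxError:
        return False

print(valid('1+(2+3)'))  # True
print(valid('1+'))       # False
```

The point is not the parser itself but the correspondence: once a language has a BNF description, writing (or generating) a parser for it becomes almost mechanical.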
Let’s get some clojure for computer sciences
We’re still in the 1958 as we’re about to witness another giant of “programming languages”, and a father-language to many revolutionary ideas at this time – be born.
LISP (aka LISt Processor) is a list-based high-level programming language (3GL) developed by John McCarthy at the Massachusetts Institute of Technology. The language has a long history of evolving dialects (like Racket, Clojure, Common Lisp and Scheme).
Originally intended as a formal notation for computations, influenced by lambda calculus and artificial intelligence research, it surprised everyone when, implemented on the IBM 704, it turned out to be a fully functioning, practical tool for solving problems.
Lisp is notable for pioneering many ideas in computer science, like: tree data structures, dynamic typing, conditionals, automatic storage management, recursion, higher-order functions and the self-hosting compiler.
It is also notable for giving students a migraine, due to its overreliance on parentheses; so much so that they gave it nicknames like "Lost In Stupid Parentheses" or "Lots of Irritating Superfluous Parentheses". And here's how "Hello, World!" would look in Lisp:

(PRINT "Hello, World!")

Though in 1958 Lisp is still in its infancy and doesn't see much broad use, which brings us to…
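The ideas Lisp pioneered are so thoroughly absorbed into later languages that they read as ordinary code today. Here's a throwaway sketch (in Python, with names invented here) touching three of them: tree data structures, recursion and higher-order functions.

```python
# Three ideas Lisp helped pioneer, in everyday modern code:
# tree data structures, recursion, and higher-order functions.

def map_tree(f, node):
    """Apply f to every leaf of a nested-list 'tree', recursively."""
    if isinstance(node, list):                       # interior node
        return [map_tree(f, child) for child in node]
    return f(node)                                   # leaf value

doubled = map_tree(lambda x: x * 2, [1, [2, 3], [[4], 5]])
print(doubled)  # [2, [4, 6], [[8], 10]]
```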
An old acquaintance
At this point in time most, if not all, computers run either low-level programming languages like opcode/ASM, or the only high-level programming language available: Fortran. And if for some reason you want to upgrade your computer, you'll have to write all that code from scratch.
So, let’s make a small leap to end of 1959. We’re going to meet someone, we’ve already discussed.
And that someone is Grace Hopper: a rear admiral of the US Navy and a computer scientist, she has already made an impact on the history of programming by creating the A-0 compiler system.
Now however she’s attending the CODASYL or the “Conference on Data System Languages” as a technical aide and main expert.
The committee was formed in an attempt to guide the standardization of programming languages, or to create a single language that could be used across different machines regardless of hardware.
The result was the "Common Business-Oriented Language", or COBOL for short: an "English-like" programming language heavily influenced by FLOW-MATIC, which was created by Ms. Hopper.
The primary goal of COBOL was to lower the barrier of entry into programming; before the creation of high-level languages, programming was a realm exclusive to computer scientists and experts.
Now, however, other enthusiasts from different professions (businesspeople, doctors, engineers, teachers and many others) could incorporate computation into their work. To deal with the underlying hardware, each computing machine had to have its own COBOL compiler. But, critically, these compilers could all accept the same COBOL source code. The notion of "write once, apply everywhere" was born; at this time, most computers used machine code or assembly, which is very CPU-specific. Let's see how "Hello, World!" looks in COBOL:

IDENTIFICATION DIVISION.
PROGRAM-ID. hello-world.
PROCEDURE DIVISION.
DISPLAY "Hello, world!"

That, folks, is what we call a "language designed by committee".
The arrival of the minicomputer
Hey mama welcome to the 60’s!! Wo-oh-oh-oh… Oh sorry, old habits.
But as you might have guessed, the 1960s have rolled around, and someone had a bright idea:

How about we stop writing code in pencil and spending weeks encoding it onto punched cards, and instead use the conveniently invented teletype machines, which news agencies and governments use to send messages to each other, to "talk" with the computer?
With that sorted, humanity paved the way to more general-purpose high-level programming languages, designed to be used by everyone, everywhere.
And with the emergence of minicomputers on the market, there was a sharp need of a programming language/compiler compact enough to meet the demand.
Thus, BASIC was born – “Beginners’ All-purpose Symbolic Instruction Code”
Originally designed by John G. Kemeny and Thomas E. Kurtz in 1964, BASIC's core principles were "ease of use" and "compactness".
"Minicomputer" – the PDP-11

So compact, in fact, that it is generally considered the first commercial "interpreter", able to make use of all those 4 kB of memory available on those "mini" computers. BASIC was amazing for teaching programming principles; it was easy to understand and follow, especially for people who didn't have a strong understanding of mathematics or computers in general. Nowadays we know BASIC under its many dialects that appeared after its inception in 1964 – Microsoft BASIC, Visual BASIC, Atari BASIC, Commodore BASIC, Applesoft BASIC and a dozen other variations. However, they all come from "Dartmouth BASIC". Aside from being the originator, Dartmouth BASIC came with an interactive programming environment available to all students and teachers called the "Dartmouth Time-Sharing System" (DTSS), one of the first time-sharing operating systems! Here's "Hello, World!" in BASIC:

10 PRINT "Hello, World!"
20 GO TO 10
120 END
Family tree of programming languages. Most of the following have their roots deep down in the languages we've mentioned earlier.

By the 1970s and onwards into the 1980s we had: PASCAL (1970); Smalltalk (1972); C (1972); SQL (1972); Scheme (1975); C++ (1980); Ada (1983); Objective-C (1983); Perl (1987). Then the 90s rolled around with our favourites: Haskell and Python (1990); Visual Basic (1991); Java, PHP and JavaScript in 1995; and so many others.
History of programming: End of todays session
We believe we went too fast back there and overloaded the capacitors; we seem to be heading back into 2021!
Well, I suppose that’s it for now, something went horribly wrong and I’m afraid we’re stuck in our time again. Perhaps later we’ll be coming back to the 70s and exploring more how our programming world evolved through the ages.
This was our take on the history of programming from the 1950s to 1960s.
If you think it's a great idea, maybe leave a like and a comment saying so? What do you think is the most fascinating part of programming history? While you ponder that, I'll go back to researching the incredible ways you can improve your business and tech company with digital transformation.
Stay classy tech and business nerds!
Why ALGOL was an important programming language?
By Adam Kukołowicz · April 23, 2024
Today, I'm taking you on a journey to the past, back to the late 1950s, when the foundations of computer science were being laid, and when computers were seen as the future. As we talk about this period today, names like FORTRAN, Lisp, and COBOL often come up.
Yet there was also ALGOL: a pretty obscure but really important language in the history of programming. Its story deserves to be told.
What did the world of programming look like then?
It was a world vastly different from today, mainly because most coding was done in assembly language (or not even that in Europe). If any of you have programmed in assembly, you know it’s a laborious process. That's how it was in the 1950s. At that time, the first programming languages were being developed to make programming easier. Most of these languages had specific applications and were often tied directly to the architecture of the computers on which they ran. Additionally, they were modeled on natural languages and hard to port between platforms.
In 1955 a symposium on automated computing was held in Darmstadt, Germany. A few of the speakers talked about the need for better and more expressive programming languages. Ideally, it would be a single algorithmic language that was platform-independent and that would allow researchers from around the world to collaborate on new computing methods. A universal language was needed, so we could stop thinking in terms of bits and registers and start thinking in terms of algorithms and data structures.
The Gesellschaft für Angewandte Mathematik und Mechanik (GAMM), or "Society for Applied Mathematics and Mechanics", thought it was a great idea, especially the universal part. They realized that creating a language on their own was possible, but that it surely wouldn't be accepted worldwide.
That's why, in 1957, they decided to reach out to the Association for Computing Machinery (ACM) and invite them to collaborate. It was a good call, since the Americans had started thinking about a similar idea at the time. They agreed to create a brand new language together.
ALGOL Enters the Stage
ALGOL stands for Algorithmic Language. Initially, it was to be called IAL (International Algebraic Language), but it was soon realized that the acronym was not very convenient. The first version emerged from a meeting of an international committee of scientists from Europe and the United States in Zurich in 1958—hence ALGOL 58. It wasn’t quite a programming language in today’s sense. As John Backus, one of ALGOL's creators, wrote:
The ACM-GAMM conference in Zurich had two main motives in creating IAL:
(a) to provide a way to communicate numerical methods and other procedures among people
(b) to provide a way to execute a fixed process on various machines.
In 1958, the goal was to create a standard description of what was happening on computers. Although ALGOL 58 had several implementations, it did not gain wide recognition. It introduced the concept of the compound statement, but its use was limited to control flow. Beyond that, it had types, loops, conditionals, procedures, and switches. It soon became clear that the outcomes were not satisfactory.
By June 1959, it was evident that further work was needed, and broad consultations were announced to gather material for another meeting, where a new version of the language would be developed.
The Americans focused on the practical merits of the language—mainly because, in the USA, programming was slowly becoming a profession and the practical merits of ALGOL 58 were not great. Europeans, however, made more proposals concerning the foundations of the language. That was the beginning of ALGOL 60.
Yes, you guessed it. ALGOL 60, because it was proposed in 1960. The meeting this time took place in Paris, with thirteen scientists participating. Let me quickly introduce them to show the caliber of people involved:
Friedrich L. Bauer - creator of the term “software engineering”
Peter Naur - editor of the ALGOL 60 report, co-author of the Backus-Naur notation (although he strongly disclaimed this)
Heinz Rutishauser - mathematician, pioneer of modern numerical analysis and computer science
Klaus Samelson - a pioneer in the fields of compilers and stack algorithms
Bernard Vauquois - a pioneer in machine translation
Adriaan van Wijngaarden - father of Dutch computer science
Michael Woodger - co-creator of the Pilot ACE, one of the first British computers, co-creator of the Ada language
John W. Backus - creator of FORTRAN (led the team), creator of the Backus-Naur notation, and creator of the function-level programming paradigm (not to be confused with functional programming—a very interesting concept that I had never heard of)
Julien Green - also involved in ALGOL 58
Charles Katz - worked on compilers for the first computers
John McCarthy - creator of the term “artificial intelligence” and researcher in the field, for which he received the Turing Award; creator of Lisp
Alan Perlis - the first recipient of the Turing Award, awarded for his contributions to advanced programming techniques and compiler construction
Joseph Henry Wegstein - worked on automatic data processing and fingerprint recognition, participated in the work on COBOL
Alan Perlis described the Paris meetings in this way:
The meetings were exhausting, interminable, and exhilarating. One became aggravated when one's good ideas were discarded along with the bad ones of others. Nevertheless, diligence persisted during the entire period. The chemistry of the 13 was excellent. On every point at issue there were opinions and solutions, with descriptive prose to support them that ranged from the technically clever to the breathtakingly simple. But there was rarely a complete cancellation. Progress was steady and the output, ALGOL 60, was more racehorse than camel.
The photo shows six participants of the Paris conference in 1960, dedicated to ALGOL. In the top row: John McCarthy, Friedrich Bauer, Joseph Wegstein. In the bottom row: John Backus, Peter Naur, Alan Perlis.
Imagine thirteen scientists, engineers, and researchers from several countries. Each threw in their ideas on what the language should look like (though preliminary proposals had been selected earlier). In 1960, this was greenfield work: the field was so new that history imposed almost no constraints. Although ALGOL 58, FORTRAN, and the modest languages of the 1950s already existed, all of them left much to be desired in expressiveness and in their ability to describe algorithms, so there was plenty of room for improvement. From all this, it was necessary to crystallize a coherent, clean language that would be the basic tool used by computer scientists in everyday work. What did the committee come up with?
ALGOL 60 Features
Certainly, one of the more spectacular new features was the introduction of blocks. Blocks were delimited by the keywords begin and end, and a block now had its own scope. This also allowed further development of the compound statement originally introduced in ALGOL 58.
ALGOL was the first language to allow the definition of nested functions, which had their own lexical scope.
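A minimal sketch (in Python, purely for illustration; the names are invented) of what nested functions with lexical scope make possible: the inner function resolves names in the enclosing function's scope, not the caller's.

```python
# Nested function with lexical scope: 'increment' sees the 'count'
# variable of the enclosing 'make_counter' call, not of its caller.
def make_counter(start):
    count = start  # belongs to make_counter's scope

    def increment():
        nonlocal count  # resolve 'count' lexically, in the enclosing scope
        count += 1
        return count

    return increment

counter = make_counter(10)
print(counter())  # 11
print(counter())  # 12
```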
The authors of the report recommended implementing 9 standard functions: abs, sign, sqrt, sin, cos, arctan, ln, exp, and entier. It's not a lot, considering the standard libraries of contemporary languages. This shows how pure ALGOL was.
Another important tool was recursion. It was proposed by John McCarthy, who had just completed work on Lisp, where he included this possibility. However, the proposal met with considerable concern from the committee, and the attending members rejected the introduction of the keyword recursive. Ultimately, recursion made it into the standard implicitly - the ALGOL specification does not prohibit recursive calls.
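To see what that debate was about, here is a minimal illustration (in Python, not ALGOL) of the kind of definition recursion makes natural: Euclid's algorithm, which calls itself on a smaller problem until it reaches the base case.

```python
# Euclid's greatest common divisor: the procedure calls itself
# on a strictly smaller pair until the base case is reached.
def gcd(a, b):
    if b == 0:            # base case: nothing left to divide by
        return a
    return gcd(b, a % b)  # recursive call on a smaller problem

print(gcd(48, 18))  # 6
```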
Another curiosity was the presence of two strategies for passing parameters. The first is the well-known pass-by-value. The second, however, is pass-by-name. In this strategy, an argument expression is not evaluated at the call site; instead, it is re-evaluated every time its name appears in the body of the procedure. Theoretically, this can provide a performance advantage, but it creates considerable challenges for compilers.
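Pass-by-name can be modeled with thunks: each argument is wrapped in a parameterless function that is re-evaluated on every use. The sketch below (Python, illustrative only; all names are invented) shows the classic trick known as Jensen's device, where the term expression is re-read after each change to the loop index.

```python
# Pass-by-name modeled with thunks: every "mention" of an argument
# re-evaluates the expression the caller passed in.
def sum_by_name(i_get, i_set, lo, hi, term):
    total = 0
    i_set(lo)
    while i_get() <= hi:
        total += term()          # re-evaluates the argument expression
        i_set(i_get() + 1)
    return total

# Jensen's device: the "argument" i*i is passed unevaluated, so it is
# recomputed for each new value of i. Sum of squares 1..5 = 55.
state = {"i": 0}
result = sum_by_name(
    lambda: state["i"],           # read i "by name"
    lambda v: state.update(i=v),  # write i "by name"
    1, 5,
    lambda: state["i"] ** 2,      # the term i*i, as a thunk
)
print(result)  # 55
```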
ALGOL Reception
The report produced in 1960 was a wonderful thing, but we must not forget that it was only a dozen-page description of the language. There was no reference implementation. Moreover, ALGOL 60 did not specify input and output operations at all! Thus, it's hard to talk about a "Hello World" program written in ALGOL 60. The report itself was incredibly concise, containing very few superfluous words, and—as Alan Perlis said:
Like the Bible, it was meant not merely to be read, but to be interpreted.
So, one of the challenges waiting for the creators of individual implementations was to interpret input and output operations in the spirit of the specification.
The first compiler was X1 ALGOL 60, developed by Edsger Dijkstra (yes, the same one from the algorithm lectures) and Jacob Zonneveld - for the Dutch computer Electrologica X1.
Electrologica X1, circa 1960
By 1963, compilers had appeared for several of the computers of the time, including the UNIVAC and the Elliott (whose compiler was written by Tony Hoare). In the second half of the 60s, implementations of ALGOL also appeared on computers from the Soviet bloc. From 1965, you could program in ALGOL on the MINSK computers, and from 1967 on the Polish computer ZAM.
The biggest problem with ALGOL was the lack of support. FORTRAN, developed by IBM, was not as advanced, but it had full backing from its parent company. IBM itself wrote reference implementations for successive platforms, making FORTRAN seem more suitable for commercial applications—it was, in effect, standardized.
This is how Robert Bemer spoke about the new language in 1961:
No reasonable mechanism for maintenance seems to exist. No one seems to be able to answer the basic question, "What is ALGOL?" I foresee a difficult time for ALGOL unless a revised maintenance procedure is devised. The language is too general and has too many ambiguities.
Bemer was right in that the report indeed had several omissions and lacked a mechanism for correcting them. Ultimately, neither the creators nor those interested in ALGOL established either a support mechanism or a development mechanism, leading to the fiasco of ALGOL 68, which was rejected by the community.
While ALGOL did not achieve spectacular commercial success, scientists received it very enthusiastically. Hence, it became the main language for describing algorithms for the next 20-30 years. Its clean and expressive syntax facilitated communication among people.
The lack of reference implementations, however, made using the language for human-machine communication difficult. Since the computer industry in America was much more developed than in Europe, practical considerations were paramount there. The issue of support and compatibility was crucial. In Europe, however, computer science was almost exclusively the domain of research centers. Scientists had different priorities, and the new language was very well-suited for researching algorithms. Thus, ALGOL was much better received on the Old Continent.
What is the legacy of ALGOL?
The impact of ALGOL on later languages was huge. It inspired most imperative programming languages. Among the more significant ones are Pascal, Simula (the first object-oriented language), and C.
In essence, all subsequent programming languages include blocks and recursion. Work on ALGOL 68 also brought about case expressions and the concept of records along with references, as well as pass-by-reference.
ALGOL 60 was the first programming language to be defined in BNF, or Backus-Naur form. It's basically about building a language from already defined concepts—with minimal references to natural language.
The general form is:
<metalinguistic variable> ::= <form>
Simple example:
<digit> ::= 0 | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9
More complex example:
<if clause> ::= if <Boolean expression> then
<unconditional statement> ::= <basic statement> | <compound statement> | <block>
<if statement> ::= <if clause> <unconditional statement>
<conditional statement> ::= <if statement> | <if statement> else <statement> | <if clause> <for statement> | <label>: <conditional statement>
The first example doesn't look too impressive, but the second gives a better understanding of the possibilities. Thanks to this notation, one can effectively describe a programming language without external context. Since then, this notation and its developments have become the standard for describing programming languages. You can see it everywhere, for example in Python or JavaScript.
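Such rules translate almost mechanically into code. A hypothetical sketch (Python, with invented names) of recognizers for <digit> and for the recursive rule <unsigned integer> ::= <digit> | <unsigned integer> <digit>:

```python
# Each BNF rule becomes a small recognizer function.
# <digit> ::= 0 | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9
def is_digit(ch):
    return ch in "0123456789"

# <unsigned integer> ::= <digit> | <unsigned integer> <digit>
# i.e. one or more digits, the recursion unrolled into a loop.
def is_unsigned_integer(s):
    return len(s) > 0 and all(is_digit(ch) for ch in s)

print(is_unsigned_integer("1960"))  # True
print(is_unsigned_integer("19a0"))  # False
```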
ALGOL became a catalyst in the development of compilers. The first compilers were simple translators of formulas into machine code, but ALGOL required something more—procedures, dynamically sized arrays, and recursion did not fit into the sequential translation technique. To implement them correctly and efficiently, new, much more advanced compilation techniques had to be developed.
All this means that although it has completely fallen out of use, ALGOL was an extraordinary language. The effort put into its development and implementation elevated computer science to another level. One can argue that such development was inevitable; it seems to me, though, that the quality of ALGOL made it happen somewhat faster. Many concepts could be transferred painlessly to other programming languages or developed further—and that's what we benefit from today.

The Emperor's Old Clothes — Jay Shirley
https://www.jayshirley.com/blog/the-emperors-old-clothes (published November 20, 2018)
I recently read a transcript of Tony Hoare’s 1981 lecture to the ACM and enjoyed it so much I wanted to clean it up, and post it here.
From the Communications of the ACM, 1981
My first and most pleasant duty in this lecture is to express my profound gratitude to the Association for Computing Machinery for the great honor which they have bestowed on me and for this opportunity to address you on a topic of my choice. What a difficult choice it is! My scientific achievements, so amply recognized by this award, have already been amply described in the scientific literature. Instead of repeating the abstruse technicalities of my trade, I would like to talk informally about myself, my personal experiences, my hopes and fears, my modest successes, and my rather less modest failures. I have learned more from my failures than can ever be revealed in the cold print of a scientific article and now I would like you to learn from them, too. Besides, failures are much more fun to hear about afterwards; they are not so funny at the time.
I start my story in August 1960, when I became a programmer with a small computer manufacturer, a division of Elliott Brothers (London) Ltd., where in the next eight years I was to receive my primary education in computer science. My first task was to implement for the new Elliott 803 computer, a library subroutine for a new fast method of internal sorting just invented by Shell. I greatly enjoyed the challenge of maximizing efficiency in the simple decimal-addressed machine code of those days. My boss and tutor, Pat Shackleton, was very pleased with my completed program. I then said timidly that I thought I had invented a sorting method that would usually run faster than SHELLSORT, without taking much extra store. He bet me sixpence that I had not. Although my method was very difficult to explain, he finally agreed that I had won my bet.
I wrote several other tightly coded library subroutines but after six months I was given a much more important task - that of designing a new advanced high level programming language for the company’s next computer, the Elliott 503, which was to have the same instruction code as the existing 803 but run sixty times faster. In spite of my education in classical languages, this was a task for which I was even less qualified than those who undertake it today. By great good fortune there came into my hands a copy of the Report on the International Algorithmic Language ALGOL 60. Of course, this language was obviously too complicated for our customers. How could they ever understand all those begins and ends when even our salesmen couldn’t?
Around Easter 1961, a course on ALGOL 60 was offered in Brighton, England, with Peter Naur, Edsger W. Dijkstra, and Peter Landin as tutors. I attended this course with my colleague in the language project, Jill Pym, our divisional Technical Manager, Roger Cook, and our Sales Manager, Paul King. It was there that I first learned about recursive procedures and saw how to program the sorting method which I had earlier found such difficulty in explaining. It was there that I wrote the procedure, immodestly named Quicksort, on which my career as a computer scientist is founded. Due credit must be paid to the genius of the designers of ALGOL 60 who included recursion in their language and enabled me to describe my invention so elegantly to the world. I have regarded it as the highest goal of programming language design to enable good ideas to be elegantly expressed.
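Hoare's original was, of course, written in ALGOL; the following is only a minimal modern sketch (Python, illustrative) of the idea recursion made expressible: partition around a pivot, then sort each side by calling the procedure on a smaller subproblem.

```python
# A minimal sketch of the Quicksort idea (not Hoare's in-place
# ALGOL formulation): partition around a pivot, recurse on each side.
def quicksort(xs):
    if len(xs) <= 1:          # base case: already sorted
        return xs
    pivot, rest = xs[0], xs[1:]
    left = [x for x in rest if x < pivot]
    right = [x for x in rest if x >= pivot]
    return quicksort(left) + [pivot] + quicksort(right)

print(quicksort([5, 3, 8, 1, 9, 2]))  # [1, 2, 3, 5, 8, 9]
```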
After the ALGOL course in Brighton, Roger Cook was driving me and my colleagues back to London when he suddenly asked, "Instead of designing a new language, why don’t we just implement ALGOL 60?" We all instantly agreed - in retrospect, a very lucky decision for me. But we knew we did not have the skill or experience at that time to implement the whole language, so I was commissioned to design a modest subset. In that design I adopted certain basic principles which I believe to be as valid today as they were then.
The first principle was security: The principle that every syntactically incorrect program should be rejected by the compiler and that every syntactically correct program should give a result or an error message that was predictable and comprehensible in terms of the source language program itself. Thus no core dumps should ever be necessary. It was logically impossible for any source language program to cause the computer to run wild, either at compile time or at run time. A consequence of this principle is that every occurrence of every subscript of every subscripted variable was on every occasion checked at run time against both the upper and the lower declared bounds of the array. Many years later we asked our customers whether they wished us to provide an option to switch off these checks in the interests of efficiency on production runs. Unanimously, they urged us not to – they already knew how frequently subscript errors occur on production runs where failure to detect them could be disastrous. I note with fear and horror that even in 1980, language designers and users have not learned this lesson. In any respectable branch of engineering, failure to observe such elementary precautions would have long been against the law.
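The principle can be sketched in a few lines (Python, illustrative only; the class name is invented): every subscript is checked, on every occurrence, against the declared bounds, so an out-of-range access yields a comprehensible error rather than a program running wild.

```python
# A bounds-checked array with declared lower and upper bounds,
# in the spirit of the Elliott ALGOL implementation's security
# principle: every subscript checked on every access.
class CheckedArray:
    def __init__(self, lower, upper):
        self.lower, self.upper = lower, upper
        self.data = [0] * (upper - lower + 1)

    def _check(self, i):
        if not (self.lower <= i <= self.upper):
            raise IndexError(f"subscript {i} outside [{self.lower}, {self.upper}]")

    def get(self, i):
        self._check(i)
        return self.data[i - self.lower]

    def set(self, i, value):
        self._check(i)
        self.data[i - self.lower] = value

a = CheckedArray(1, 10)
a.set(1, 42)
print(a.get(1))  # 42
# a.get(11) would raise IndexError, never corrupt memory
```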
The second principle in the design of the implementation was brevity of the object code produced by the compiler and compactness of run time working data. There was a clear reason for this: The size of main storage on any computer is limited and its extension involves delay and expense. A program exceeding the limit, even by one word, is impossible to run, especially since many of our customers did not intend to purchase backing stores. This principle of compactness of object code is even more valid today, when processors are trivially cheap in comparison with the amounts of main store they can address, and backing stores are comparatively even more expensive and slower by many orders of magnitude. If as a result of care taken in implementation the available hardware remains more powerful than may seem necessary for a particular application, the applications programmer can nearly always take advantage of the extra capacity to increase the quality of his program, its simplicity, its ruggedness, and its reliability.
The third principle of our design was that the entry and exit conventions for procedures and functions should be as compact and efficient as for tightly coded machine-code subroutines. I reasoned that procedures are one of the most powerful features of a high level language, in that they both simplify the programming task and shorten the object code. Thus there must be no impediment to their frequent use.
The fourth principle was that the compiler should use only a single pass. The compiler was structured as a collection of mutually recursive procedures, each capable of analyzing and translating a major syntactic unit of the language - a statement, an expression, a declaration, and so on. It was designed and documented in ALGOL 60, and then coded into decimal machine code using an explicit stack for recursion. Without the ALGOL 60 concept of recursion, at that time highly controversial, we could not have written this compiler at all. I can still recommend single-pass top-down recursive descent both as an implementation method and as a design principle for a programming language. First, we certainly want programs to be read by people and people prefer to read things once in a single pass. Second, for the user of a time-sharing or personal computer system, the interval between typing in a program (or amendment) and starting to run that program is wholly unproductive. It can be minimized by the high speed of a single pass compiler. Finally, to structure a compiler according to the syntax of its input language makes a great contribution to ensuring its correctness. Unless we have absolute confidence in this, we can never have confidence in the results of any of our programs.
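The structure Hoare describes can be sketched as follows (a toy Python example, not the Elliott compiler): one mutually recursive function per syntactic unit, consuming the token stream exactly once.

```python
# Single-pass, top-down recursive descent for a toy expression
# grammar: one mutually recursive function per syntactic unit.
def parse_expression(tokens, pos):
    value, pos = parse_term(tokens, pos)
    while pos < len(tokens) and tokens[pos] in "+-":
        op, (rhs, pos) = tokens[pos], parse_term(tokens, pos + 1)
        value = value + rhs if op == "+" else value - rhs
    return value, pos

def parse_term(tokens, pos):
    if tokens[pos] == "(":                              # parenthesized unit:
        value, pos = parse_expression(tokens, pos + 1)  # mutual recursion
        return value, pos + 1                           # skip ")"
    return int(tokens[pos]), pos + 1                    # a number

def evaluate(tokens):
    value, _ = parse_expression(tokens, 0)
    return value

print(evaluate(["1", "+", "(", "2", "-", "3", ")", "+", "4"]))  # 4
```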
To observe these four principles, I selected a rather small subset of ALGOL 60. As the design and implementation progressed, I gradually discovered methods of relaxing the restrictions without compromising any of the principles. So in the end we were able to implement nearly the full power of the whole language, including even recursion, although several features were removed and others were restricted.
In the middle of 1963, primarily as a result of the work of Jill Pym and Jeff Hillmore, the first version of our compiler was delivered. After a few months we began to wonder whether anyone was using the language or taking any notice of our occasional reissue, incorporating improved operating methods. Only when a customer had a complaint did he contact us and many of them had no complaints. Our customers have now moved on to more modern computers and more fashionable languages but many have told me of their fond memories of the Elliott ALGOL System and the fondness is not due just to nostalgia, but to the efficiency, reliability, and convenience of that early simple ALGOL System.
As a result of this work on ALGOL, in August 1962, I was invited to serve on the new Working Group 2.1 of IFIP, charged with responsibility for maintenance and development of ALGOL. The group’s first main task was to design a subset of the language which would remove some of its less successful features. Even in those days and even with such a simple language, we recognized that a subset could be an improvement on the original. I greatly welcomed the chance of meeting and hearing the wisdom of many of the original language designers. I was astonished and dismayed at the heat and even rancor of their discussions. Apparently the original design of ALGOL 60 had not proceeded in that spirit of dispassionate search for truth which the quality of the language had led me to suppose.
In order to provide relief from the tedious and argumentative task of designing a subset, the working group allocated one afternoon to discussing the features that should be incorporated in the next design of the language. Each member was invited to suggest the improvement he considered most important. On October 11, 1963, my suggestion was to pass on a request of our customers to relax the ALGOL 60 rule of compulsory declaration of variable names and adopt some reasonable default convention such as that of FORTRAN. I was astonished by the polite but firm rejection of this seemingly innocent suggestion: It was pointed out that the redundancy of ALGOL 60 was the best protection against programming and coding errors which could be extremely expensive to detect in a running program and even more expensive not to. The story of the Mariner space rocket to Venus, lost because of the lack of compulsory declarations in FORTRAN, was not to be published until later. I was eventually persuaded of the need to design programming notations so as to maximize the number of errors which cannot be made, or if made, can be reliably detected at compile time. Perhaps this would make the text of programs longer.
Never mind! Wouldn’t you be delighted if your Fairy Godmother offered to wave her wand over your program to remove all its errors and only made the condition that you should write out and key in your whole program three times! The way to shorten programs is to use procedures, not to omit vital declarative information.
Among the other proposals for the development of a new ALGOL was that the switch declaration of ALGOL 60 should be replaced by a more general feature, namely an array of label-valued variables and that a program should be able to change the values of these variables by assignment. I was very much opposed to this idea, similar to the assigned GO TO of FORTRAN, because I had found a surprising number of tricky problems in the implementation of even the simple labels and switches of ALGOL 60. I could see even more problems in the new feature including that of jumping back into a block after it had been exited. I was also beginning to suspect that programs that used a lot of labels were more difficult to understand and get correct and that programs that assigned new values to label variables would be even more difficult still.
It occurred to me that the appropriate notation to replace the ALGOL 60 switch should be based on that of the conditional expression of ALGOL 60, which selects between two alternative actions according to the value of a Boolean expression. So I suggested the notation for a "case expression" which selects between any number of alternatives according to the value of an integer expression. That was my second language design proposal. I am still most proud of it, because it raises essentially no problems either for the implementor, the programmer, or the reader of a program.
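The case expression can be sketched like this (Python, illustrative; the helper name is invented): an integer selects one of several alternatives, and thunks mimic the fact that only the chosen alternative is evaluated.

```python
# A case *expression*: an integer index selects one of several
# alternatives. Thunks ensure only the chosen one is evaluated.
def case(i, *alternatives):
    return alternatives[i - 1]()  # 1-based, as in ALGOL

x = 2
result = case(x,
    lambda: "first",
    lambda: "second",
    lambda: "third",
)
print(result)  # second
```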
Now, after more than fifteen years, there is the prospect of international standardization of a language incorporating this notation - a remarkably short interval compared with other branches of engineering.
Back again to my work at Elliott’s. After the unexpected success of our ALGOL Compiler, our thoughts turned to a more ambitious project: To provide a range of operating system software for larger configurations of the 503 computer, with card readers, line printers, magnetic tapes, and even a core backing store which was twice as cheap and twice as large as main store, but fifteen times slower. This was to be known as the Elliott 503 Mark II software system.
It comprised:
An assembler for a symbolic assembly language in which all the rest of the Software was to be written.
A scheme for automatic administration of code and data overlays, either from magnetic tape or from core backing store. This was to be used by the rest of the software.
A scheme for automatic buffering of all input and output on any available peripheral device - again, to be used by all the other software.
A filing system on magnetic tape with facilities for editing and job control.
A completely new implementation of ALGOL 60, which removed all the nonstandard restrictions which we had imposed on our first implementation.
A compiler for FORTRAN as it was then.
I wrote documents which described the relevant concepts and facilities and we sent them to existing and prospective customers. Work started with a team of fifteen programmers and the deadline for delivery was set some eighteen months ahead in March 1965. After initiating the design of the Mark II software, I was suddenly promoted to the dizzying rank of Assistant Chief Engineer, responsible for advanced development and design of the company’s products, both hardware and software.
Although I was still managerially responsible for the 503 Mark II software, I gave it less attention than the company’s new products and almost failed to notice when the deadline for its delivery passed without event. The programmers revised their implementation schedules and a new delivery date was set some three months ahead in June 1965. Needless to say, that day also passed without event. By this time, our customers were getting angry and my managers instructed me to take personal charge of the project. I asked the senior programmers once again to draw up revised schedules, which again showed that the software could be delivered within another three months. I desperately wanted to believe it but I just could not. I disregarded the schedules and began to dig more deeply into the project.
It turned out that we had failed to make any overall plans for the allocation of our most limited resource: main storage. Each programmer expected this to be done automatically, either by the symbolic assembler or by the automatic overlay scheme. Even worse, we had failed to simply count the space used by our own software which was already filling the main store of the computer, leaving no space for our customers to run their programs. Hardware address length limitations prohibited adding more main storage.
Clearly, the original specifications of the software could not be met and had to be drastically curtailed.
Experienced programmers and even managers were called back from other projects. We decided to concentrate first on delivery of the new compiler for ALGOL 60, which careful calculation showed would take another four months. I impressed upon all the programmers involved that this was no longer just a prediction; it was a promise; if they found they were not meeting their promise, it was their personal responsibility to find ways and means of making good.
The programmers responded magnificently to the challenge. They worked nights and days to ensure completion of all those items of software which were needed by the ALGOL compiler. To our delight, they met the scheduled delivery date; it was the first major item of working software produced by the company over a period of two years.
Our delight was short-lived; the compiler could not be delivered. Its speed of compilation was only two characters per second which compared unfavorably with the existing version of the compiler operating at about a thousand characters per second. We soon identified the cause of the problem: It was thrashing between the main store and the extension core backing store which was fifteen times slower. It was easy to make some simple improvements, and within a week we had doubled the speed of compilation - to four characters per second. In the next two weeks of investigation and reprogramming, the speed was doubled again - to eight characters per second. We could see ways in which within a month this could be still further improved, but the amount of reprogramming required was increasing and its effectiveness was decreasing; there was an awful long way to go. The alternative of increasing the size of the main store, so frequently adopted in later failures of this kind, was prohibited by hardware addressing limitations.
There was no escape: The entire Elliott 503 Mark II software project had to be abandoned, and with it, over thirty man-years of programming effort, equivalent to nearly one man’s active working life, and I was responsible, both as designer and as manager, for wasting it.
A meeting of all our 503 customers was called and Roger Cook, who was then manager of the computing division, explained to them that not a single word of the long-promised software would ever be delivered to them. He adopted a very quiet tone of delivery, which ensured that none of the customers could interrupt, murmur in the background, or even shuffle in their seats. I admired but could not share his calm. Over lunch our customers were kind to try to comfort me. They had realized long ago that software to the original specification could never have been delivered, and even if it had been, they would not have known how to use its sophisticated features, and anyway many such large projects get cancelled before delivery. In retrospect, I believe our customers were fortunate that hardware limitations had protected them from the arbitrary excesses of our software designs. In the present day, users of microprocessors benefit from a similar protection - but not for much longer.
At that time I was reading the early documents describing the concepts and features of the newly announced OS 360, and of a new time-sharing project called Multics. These were far more comprehensive, elaborate, and sophisticated than anything I had imagined, even in the first version of the 503 Mark II software. Clearly IBM and MIT must be possessed of some secret of successful software design and implementation whose nature I could not even begin to guess at. It was only later that they realized they could not either.
So I still could not see how I had brought such a great misfortune upon my company. At the time I was convinced that my managers were planning to dismiss me.
But no, they were intending a far more severe punishment. "O.K. Tony," they said. "You got us into this mess and now you’re going to get us out." "But I don’t know how," I protested, but their reply was simple. "Well then, you’ll have to find out." They even expressed confidence that I could do so. I did not share their confidence. I was tempted to resign. It was the luckiest of all my lucky escapes that I did not.
Of course, the company did everything they could to help me. They took away my responsibility for hardware design and reduced the size of my programming teams. Each of my managers explained carefully his own theory of what had gone wrong and all the theories were different. At last, there breezed into my office the most senior manager of all, a general manager of our parent company,
Andrew St. Johnston. I was surprised that he had even heard of me. "You know what went wrong?" he shouted - he always shouted - "You let your programmers do things which you yourself do not understand." I stared in astonishment. He was obviously out of touch with present day realities. How could one person ever understand the whole of a modern software product like the Elliott 503 Mark II software system?
I realized later that he was absolutely right; he had diagnosed the true cause of the problem and he had planted the seed of its later solution.
I still had a team of some forty programmers and we needed to retain the good will of customers for our new machine and even regain the confidence of the customers for our old one. But what should we actually plan to do when we knew only one thing - that all our previous plans had failed? I therefore called an all-day meeting of our senior programmers on October 22, 1965, to thrash out the question between us. I still have the notes of that meeting. We first listed the recent major grievances of our customers: Cancellation of products, failure to meet deadlines, excessive size of software, not justified by the usefulness of the facilities provided, excessively slow programs, failure to take account of customer feedback. Earlier attention paid to quite minor requests of our customers might have paid as great dividends of goodwill as the success of our most ambitious plans.
We then listed our own grievances: Lack of machine time for program testing, unpredictability of machine time, lack of suitable peripheral equipment, unreliability of the hardware even when available, dispersion of programming staff, lack of equipment for keypunching of programs, lack of firm hardware delivery dates, lack of technical writing effort for documentation, lack of software knowledge outside of the programming group, interference from higher managers who imposed decisions, "...without a full realization of the more intricate implications of the matter," and overoptimism in the face of pressure from customers and the Sales Department.
But we did not seek to excuse our failure by these grievances. For example, we admitted that it was the duty of programmers to educate their managers and other departments of the company by "...presenting the necessary information in a simple palatable form." The hope "that deficiencies in original program specifications could be made up by the skill of a technical writing department . . . was misguided; the design of a program and the design of its specification must be undertaken in parallel by the same person, and they must interact with each other. A lack of clarity in specification is one of the surest signs of a deficiency in the program it describes, and the two faults must be removed simultaneously before the project is embarked upon." I wish I had followed this advice in 1963; I wish we all would follow it today.
My notes of the proceedings of that day in October 1965 include a complete section devoted to failings within the software group; this section rivals the most abject self-abasement of a revisionist official in the Chinese cultural revolution. Our main failure was over ambition. "The goals which we have attempted have obviously proved to be far beyond our grasp." There was also failure in prediction, in estimation of program size and speed, of effort required, in planning the coordination and interaction of programs, in providing an early warning that things were going wrong. There were faults in our control of program changes, documentation, liaison with other departments, with our management, and with our customers. We failed in giving clear and stable definitions of the responsibilities of individual programmers and project leaders - Oh, need I go on? What was amazing was that a large team of highly intelligent programmers could labor so hard and so long on such an unpromising project. You know, you shouldn’t trust us intelligent programmers. We can think up such good arguments for convincing ourselves and each other of the utterly absurd. Especially don’t believe us when we promise to repeat an earlier success, only bigger and better next time.
The last section of our inquiry into the failure dealt with the criteria of quality of software. "In the recent struggle to deliver any software at all, the first casualty has been consideration of the quality of the software delivered. The quality of software is measured by a number of totally incompatible criteria, which must be carefully balanced in the design and implementation of every program." We then made a list of no less than seventeen criteria which has been published in a guest editorial in Volume 2 of the journal, Software Practice and Experience.
How did we recover from the catastrophe? First, we classified our 503 customers into groups, according to the nature and size of the hardware configurations which they had bought - for example, those with magnetic tapes were all in one group. We assigned to each group of customers a small team of programmers and told the team leader to visit the customers to find out what they wanted; to select the easiest request to fulfill, and to make plans (but no promises) to implement it. In no case would we consider a request for a feature that would take more than three months to implement and deliver. The project leader would then have to convince me that the customers’ request was reasonable, that the design of the new feature was appropriate, and that the plans and schedules for implementation were realistic. Above all, I did not allow anything to be done which I did not myself understand. It worked!
The software requested began to be delivered on the promised dates. With an increase in our confidence and that of our customers, we were able to undertake fulfilling slightly more ambitious requests. Within a year we had recovered from the disaster. Within two years, we even had some moderately satisfied customers.
Thus we muddled through by common sense and compromise to something approaching success. But I was not satisfied. I did not see why the design and implementation of an operating system should be so much more difficult than that of a compiler. This is the reason why I have devoted my later research to problems of parallel programming and language constructs which would assist in clear structuring of operating systems - constructs such as monitors and communicating processes.
While I was working at Elliott’s, I became very interested in techniques for formal definition of programming languages. At that time, Peter Landin and Christopher Strachey proposed to define a programming language in a simple functional notation, that specified the effect of each command on a mathematically defined abstract machine. I was not happy with this proposal because I felt that such a definition must incorporate a number of fairly arbitrary representation decisions and would not be much simpler in principle than an implementation of the language for a real machine. As an alternative, I proposed that a programming language definition should be formalized as a set of axioms, describing the desired properties of programs written in the language. I felt that carefully formulated axioms would leave an implementation the necessary freedom to implement the language efficiently on different machines and enable the programmer to prove the correctness of his programs. But I did not see how to actually do it. I thought that it would need lengthy research to develop and apply the necessary techniques and that a university would be a better place to conduct such research than industry. So I applied for a chair in Computer Science at the Queen’s University of Belfast where I was to spend nine happy and productive years. In October 1968, as I unpacked my papers in my new home in Belfast, I came across an obscure preprint of an article by Bob Floyd entitled, "Assigning Meanings to Programs."
What a stroke of luck! At last I could see a way to achieve my hopes for my research. Thus I wrote my first paper on the axiomatic approach to computer programming, published in the Communications of the ACM in October 1969.
Just recently, I have discovered that an early advocate of the assertional method of program proving was none other than Alan Turing himself. On June 24, 1950 at a conference in Cambridge, he gave a short talk entitled, "Checking a Large Routine" which explains the idea with great clarity. "How can one check a large routine in the sense of making sure that it’s right? In order that the man who checks may not have too difficult a task, the programmer should make a number of definite assertions which can be checked individually, and from which the correctness of the whole program easily follows."
Consider the analogy of checking an addition. If the sum is given just as a column of figures [with the answer below] one must check the whole at one sitting. But if the totals for the various columns are given, [with the carries added in separately], the checker’s work is much easier, being split up into the checking of the various assertions [that each column is correctly added] and the small addition [of the carries to the total]. This principle can be applied to the checking of a large routine but we will illustrate the method by means of a small routine viz. one to obtain n factorial without the use of a multiplier.
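A modern reconstruction of Turing's example (a sketch, not his notation): n factorial computed with addition only, where the "definite assertion" at each step can be checked on its own, like the column totals in his addition analogy.

```python
import math

def factorial_no_multiplier(n):
    # Turing's small routine: n! using only addition.  The inner
    # loop replaces the multiplication r * k with k additions.
    r = 1
    for k in range(2, n + 1):
        acc = 0
        for _ in range(k):
            acc += r            # r added k times equals r * k
        r = acc
        # Turing's "definite assertion" for this step: r == k!
        # Each such assertion is checkable independently.
        assert r == math.factorial(k)
    return r
```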
Unfortunately there is no coding system sufficiently generally known to justify giving this routine in full, but a flow diagram will be sufficient for illustration. That brings me back to the main theme of my talk, the design of programming languages.
During the period, August 1962 to October 1966, I attended every meeting of the IFIP ALGOL working group. After completing our labors on the IFIP ALGOL subset, we started on the design of ALGOL X, the intended successor to ALGOL 60. More suggestions for new features were made, and in May 1965, Niklaus Wirth was commissioned to collate them into a single language design. I was delighted by his draft design which avoided all the known defects of ALGOL 60 and included several new features, all of which could be simply and efficiently implemented, and safely and conveniently used.
The description of the language was not yet complete. I worked hard on making suggestions for its improvement and so did many other members of our group. By the time of the next meeting in St. Pierre de Chartreuse, France in October 1965, we had a draft of an excellent and realistic language design which was published in June 1966 as "A Contribution to the Development of ALGOL," in the Communications of the ACM. It was implemented on the IBM 360 and given the title ALGOL W by its many happy users. It was not only a worthy successor of ALGOL 60, it was even a worthy predecessor of PASCAL.
At the same meeting, the ALGOL committee had placed before it, a short, incomplete and rather incomprehensible document, describing a different, more ambitious and, to me, a far less attractive language. I was astonished when the working group, consisting of all the best known international experts of programming languages, resolved to lay aside the commissioned draft on which we had all been working and swallow a line with such an unattractive bait.
This happened just one week after our inquest on the 503 Mark II software project. I gave desperate warnings against the obscurity, the complexity, and over ambition of the new design, but my warnings went unheeded. I conclude that there are two ways of constructing a software design: One way is to make it so simple that there are obviously no deficiencies and the other way is to make it so complicated that there are no obvious deficiencies.
The first method is far more difficult. It demands the same skill, devotion, insight, and even inspiration as the discovery of the simple physical laws which underlie the complex phenomena of nature. It also requires a willingness to accept objectives which are limited by physical, logical, and technological constraints, and to accept a compromise when conflicting objectives cannot be met. No committee will ever do this until it is too late.
So it was with the ALGOL committee. Clearly the draft which it preferred was not yet perfect. So a new and final draft of the new ALGOL language design was promised in three months’ time; it was to be submitted to the scrutiny of a subgroup of four members including myself. Three months came and went, without a word of the new draft. After six months, the subgroup met in the Netherlands. We had before us a longer and thicker document, full of errors corrected at the last minute, describing yet another, but to me, equally unattractive language. Niklaus Wirth and I spent some time trying to get removed some of the deficiencies in the design and in the description, but in vain. The completed final draft of the language was promised for the next meeting of the full ALGOL committee in three months’ time.
Three months came and went - not a word of the new draft appeared. After six months, in October 1966, the ALGOL working group met in Warsaw. It had before it an even longer and thicker document, full of errors corrected at the last minute, describing equally obscurely yet another different, and to me, equally unattractive language. The experts in the group could not see the defects of the design and they firmly resolved to adopt the draft, believing it would be completed in three months. In vain, I told them it would not. In vain, I urged them to remove some of the technical mistakes of the language, the predominance of references, the default type conversions. Far from wishing to simplify the language, the working group actually asked the authors to include even more complex features like overloading of operators and concurrency.
When any new language design project is nearing completion, there is always a mad rush to get new features added before standardization. The rush is mad indeed, because it leads into a trap from which there is no escape. A feature which is omitted can always be added later, when its design and its implications are well understood. A feature which is included before it is fully understood can never be removed later.
At last, in December 1968, in a mood of black depression, I attended the meeting in Munich at which our long-gestated monster was to come to birth and receive the name ALGOL 68. By this time, a number of other members of the group had become disillusioned, but too late: The committee was now packed with supporters of the language, which was sent up for promulgation by the higher committees of IFIP. The best we could do was to send with it a minority report, stating our considered view that, "...as a tool for the reliable creation of sophisticated programs, the language was a failure." This report was later suppressed by IFIP, an act which reminds me of lines of Hilaire Belloc,
But scientists, who ought to know
Assure us that it must be so.
Oh, let us never, never doubt
What nobody is sure about.
I did not attend any further meetings of that working group. I am pleased to report that the group soon came to realize that there was something wrong with their language and with its description; they labored hard for six more years to produce a revised description of the language. It is a great improvement but I’m afraid that, in my view, it does not remove the basic technical flaws in the design, nor does it begin to address the problem of its overwhelming complexity.
Programmers are always surrounded by complexity; we cannot avoid it. Our applications are complex because we are ambitious to use our computers in ever more sophisticated ways. Programming is complex because of the large number of conflicting objectives for each of our programming projects. If our basic tool, the language in which we design and code our programs, is also complicated, the language itself becomes part of the problem rather than part of its solution.
Now let me tell you about yet another overambitious language project. Between 1965 and 1970 I was a member and even chairman of the Technical Committee No. 10 of the European Computer Manufacturers Association. We were charged first with a watching brief and then with the standardization of a language to end all languages, designed to meet the needs of all computer applications, both commercial and scientific, by the greatest computer manufacturer of all time. I had studied with interest and amazement, even a touch of amusement, the four initial documents describing a language called NPL, which appeared between March 1 and November 30, 1964. Each was more ambitious and absurd than the last in its wishful speculations. Then the language began to be implemented and a new series of documents began to appear at six-monthly intervals, each describing the final frozen version of the language, under its final frozen name PL/1.
But to me, each revision of the document simply showed how far the initial F-level implementation had progressed. Those parts of the language that were not yet implemented were still described in free-flowing flowery prose giving promise of unalloyed delight. In the parts that had been implemented, the flowers had withered; they were choked by an undergrowth of explanatory footnotes, placing arbitrary and unpleasant restrictions on the use of each feature and loading upon a programmer the responsibility for controlling the complex and unexpected side-effects and interaction effects with all the other features of the language.
At last, on March 11, 1968, the language description was nobly presented to the waiting world as a worthy candidate for standardization. But it was not. It had already undergone some seven thousand corrections and modifications at the hand of its original designers. Another twelve editions were needed before it was finally published as a standard in 1976. I fear that this was not because everybody concerned was satisfied with its design, but because they were thoroughly bored and disillusioned.
For as long as I was involved in this project, I urged that the language be simplified, if necessary by subsetting, so that the professional programmer would be able to understand it and be able to take responsibility for the correctness and cost effectiveness of his programs. I urged that the dangerous features such as defaults and ON-conditions be removed. I knew that it would be impossible to write a wholly reliable compiler for a language of this complexity and impossible to write a wholly reliable program when the correctness of each part of the program depends on checking that every other part of the program has avoided all the traps and pitfalls of the language.
At first I hoped that such a technically unsound project would collapse but I soon realized it was doomed to success. Almost anything in software can be implemented, sold, and even used given enough determination. There is nothing a mere scientist can say that will stand against the flood of a hundred million dollars. But there is one quality that cannot be purchased in this way - and that is reliability. The price of reliability is the pursuit of the utmost simplicity. It is a price which the very rich find most hard to pay.
All this happened a long time ago. Can it be regarded as relevant in a conference dedicated to a preview of the Computer Age that lies ahead? It is my gravest fear that it can. The mistakes which have been made in the last twenty years are being repeated today on an even grander scale. I refer to a language design project which has generated documents entitled strawman, woodenman, tinman, ironman, steelman, green and finally now ADA. This project has been initiated and sponsored by one of the world’s most powerful organizations, the United States Department of Defense. Thus it is ensured of an influence and attention quite independent of its technical merits and its faults and deficiencies threaten us with far greater dangers. For none of the evidence we have so far can inspire confidence that this language has avoided any of the problems that have afflicted other complex language projects of the past.
I have been giving the best of my advice to this project since 1975. At first I was extremely hopeful. The original objectives of the language included reliability, readability of programs, formality of language definition, and even simplicity. Gradually these objectives have been sacrificed in favor of power, supposedly achieved by a plethora of features and notational conventions, many of them unnecessary and some of them, like exception handling, even dangerous. We relive the history of the design of the motor car. Gadgets and glitter prevail over fundamental concerns of safety and economy.
It is not too late! I believe that by careful pruning of the ADA language, it is still possible to select a very powerful subset that would be reliable and efficient in implementation and safe and economic in use. The sponsors of the language have declared unequivocally, however, that there shall be no subsets. This is the strangest paradox of the whole strange project. If you want a language with no subsets, you must make it small.
You include only those features which you know to be needed for every single application of the language and which you know to be appropriate for every single hardware configuration on which the language is implemented. Then extensions can be specially designed where necessary for particular hardware devices and for particular applications. That is the great strength of PASCAL, that there are so few unnecessary features and almost no need for subsets. That is why the language is strong enough to support specialized extensions - Concurrent PASCAL for real time work, PASCAL PLUS for discrete event simulation, UCSD PASCAL for microprocessor work stations. If only we could learn the right lessons from the successes of the past, we would not need to learn from our failures.
And so, the best of my advice to the originators and designers of ADA has been ignored. In this last resort, I appeal to you, representatives of the programming profession in the United States, and citizens concerned with the welfare and safety of your own country and of mankind: Do not allow this language in its present state to be used in applications where reliability is critical, i.e., nuclear power stations, cruise missiles, early warning systems, anti-ballistic missile defense systems. The next rocket to go astray as a result of a programming language error may not be an exploratory space rocket on a harmless trip to Venus: It may be a nuclear warhead exploding over one of our own cities. An unreliable programming language generating unreliable programs constitutes a far greater risk to our environment and to our society than unsafe cars, toxic pesticides, or accidents at nuclear power stations. Be vigilant to reduce that risk, not to increase it.
Let me not end on this somber note. To have our best advice ignored is the common fate of all who take on the role of consultant, ever since Cassandra pointed out the dangers of bringing a wooden horse within the walls of Troy. That reminds me of a story I used to hear in my childhood. As far as I recall, its title was:
The Emperor’s Old Clothes
Many years ago, there was an Emperor who was so excessively fond of clothes that he spent all his money on dress. He did not trouble himself with soldiers, attend banquets, or give judgement in court. Of any other king or emperor one might say, "He is sitting in council," but it was always said of him, "The emperor is sitting in his wardrobe." And so he was. On one unfortunate occasion, he had been tricked into going forth naked to his chagrin and the glee of his subjects. He resolved never to leave his throne, and to avoid nakedness, he ordered that each of his many new suits of clothes should be simply draped on top of the old.
Time passed away merrily in the large town that was his capital. Ministers and courtiers, weavers and tailors, visitors and subjects, seamstresses and embroiderers, went in and out of the throne room about their various tasks, and they all exclaimed, "How magnificent is the attire of our Emperor."
One day the Emperor’s oldest and most faithful Minister heard tell of a most distinguished tailor who taught at an ancient institute of higher stitchcraft, and who had developed a new art of abstract embroidery using stitches so refined that no one could tell whether they were actually there at all.
"These must indeed be splendid stitches," thought the Minister. "If we can but engage this tailor to advise us, we will bring the adornment of our Emperor to such heights of ostentation that all the world will acknowledge him as the greatest Emperor there has ever been."
So the honest old Minister engaged the master tailor at vast expense. The tailor was brought to the throne room where he made obeisance to the heap of fine clothes which now completely covered the throne. All the courtiers waited eagerly for his advice. Imagine their astonishment when his advice was not to add sophistication and more intricate embroidery to that which already existed, but rather to remove layers of the finery, and strive for simplicity and elegance in place of extravagant elaboration. "This tailor is not the expert that he claims," they muttered. "His wits have been addled by long contemplation in his ivory tower and he no longer understands the sartorial needs of a modern Emperor." The tailor argued loud and long for the good sense of his advice but could not make himself heard. Finally, he accepted his fee and returned to his ivory tower.
Never to this very day has the full truth of this story been told: That one fine morning, when the Emperor felt hot and bored, he extricated himself carefully from under his mountain of clothes and is now living happily as a swineherd in another story. The tailor is canonized as the patron saint of all consultants, because in spite of the enormous fees that he extracted, he was never able to convince his clients of his dawning realization that their clothes have no Emperor.
---

10 Most(ly dead) Influential Programming Languages
Hillel Wayne, 2020-03-25
https://www.hillelwayne.com/post/influential-dead-languages/
The other day I read 20 most significant programming languages in history, a “preposterous table I just made up.” He certainly got preposterous right: he lists Go as “most significant” but not ALGOL, Smalltalk, or ML. He also leaves off Pascal because it’s “mostly dead”. Preposterous! That defeats the whole point of what “significant in history” means.
So let’s talk about some “mostly dead” languages and why they matter so much.
Disclaimer: Yeah not all of these are dead and not all of these are forgotten. Like most people have heard of Smalltalk, right? Also there’s probably like a billion mistakes in this, because when you’re doing a survey of 60 years of computing history you’re gonna get some things wrong. Feel free to yell at me if you see anything!
Disclaimer 2: Yeah I know some of these are “first to invent” and others are “first to popularize”. History is complicated!
Detecting Influence
Before we start, a quick primer on finding influence. Just knowing that X was the first language with feature Z doesn’t mean that X actually influenced Z. While Absys was arguably the first logic programming language, almost all of logic programming actually stems from Prolog, which was developed independently. Ultimately there’s only one way to know for certain that X influenced Y: citation. This means one of
Y cites X in its reference manual
Y cites a paper that cites X
The author of Y says “we were influenced by X.”
Citations are transitive. Sometimes the language manual for Q lists motivating document R, which cites paper S as an inspiration, which mentions it got the ideas from language T. Then we know that T influenced Q, even if the chain is several steps long. This means digging through many sources to find a signal. To speed this up we use heuristics to decide where to look.
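The chain above amounts to reachability in a citation graph. A toy sketch, using the Q/R/S/T names from the example (the edges themselves are invented for illustration):

```python
# Influence is the transitive closure of direct citations.
citations = {          # key cites value(s); hypothetical edges
    "Q": ["R"],
    "R": ["S"],
    "S": ["T"],
    "T": [],
}

def influences(lang, graph):
    """Everything reachable from `lang` via citation edges."""
    seen, stack = set(), list(graph.get(lang, []))
    while stack:
        x = stack.pop()
        if x not in seen:
            seen.add(x)
            stack.extend(graph.get(x, []))
    return seen
```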
One effective heuristic is programming language cognates. It’s very rare for languages to independently come up with the same syntax. So if two languages share some syntax, one likely influenced the other. For example: even without reading design decisions by Matz, we know that Ruby was influenced by Smalltalk, as they both filter a list with a select method. This isn’t conclusive evidence. Maybe Matz came up with it independently, or maybe Ruby and Smalltalk were both influenced by a common ancestor. But it gives us a place to start looking.
The Languages
COBOL
Background: CODASYL, 1960. COBOL is shaped by the business/science split in computing. At that time high-level industry languages were either used for engineering computations or managing data. The engineers had all gotten behind FORTRAN while the business world was a mess of COMTRAN, FLOW-MATIC, and others, so the Department of Defense got a committee together to make a single universal business language. That’s COBOL.
COBOL was one of the four “mother” languages, along with ALGOL, FORTRAN, and LISP. While we consider it a punchline today, it was once the most popular language in the world. It still runs a lot of our legacy business systems.
Significance: In terms of syntax and semantics we don’t see much of COBOL in modern computing. COBOL’s most important addition is the concept of record data. In FORTRAN and ALGOL, your only data structure was the static array. In COBOL, though, you could read in structured files with hierarchical data, and it would automatically destructure them into the representative variables. This was a precursor to modern day structs.
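As a sketch of the idea (the field names and widths here are invented for illustration): a COBOL record declaration describes a fixed-width file layout and, at the same time, the variables a READ fills in. In modern terms, that is a record type plus a parser.

```python
from dataclasses import dataclass

# A COBOL record declaration such as:
#   01 EMPLOYEE.
#      05 EMP-NAME   PIC X(20).
#      05 EMP-SALARY PIC 9(8)V99.
# describes the file layout AND the destination variables.

@dataclass
class Employee:
    name: str
    salary: float

def parse_employee(line: str) -> Employee:
    # the destructuring COBOL performed automatically on READ
    return Employee(name=line[:20].strip(),
                    salary=float(line[20:31]))

rec = parse_employee("GRACE HOPPER        00085000.00")
```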
Cause of Death: Two factors here. One: COBOL had no overlap with other PLT efforts. Very few people built on COBOL. This meant that second or third generation languages, which built on the lessons of their ancestors, had almost no COBOL DNA. This was less an intrinsic problem of COBOL and more a result of academia’s disdain for its creation process. CODASYL was a business group and obviously wasn’t worth paying attention to. COBOL was also enormously complex, even for today’s languages. This means that COBOL compilers lagged contemporaries on microcomputers and minicomputers, giving spaces for other languages to flourish and eventually outcompete it.
ALGOL
Background: The ALGOL committee, 1960. ALGOL-58 came out two years before but was quickly superseded, so I’m wrapping them into each other. The committee wanted to make a good language for researching algorithms. In other words, ALGOL was a formalized “pseudocode”.
Of the four mother languages, ALGOL is the most “dead”. Everybody still knows about LISP, COBOL still powers tons of legacy systems, and most scientific packages still have some FORTRAN. But I’ve met plenty of programmers who haven’t even heard of ALGOL. You’d think it’d be the least important of the mother languages, but it’s the opposite. Of the four, only LISP comes anywhere close to the pervasive importance of ALGOL.
Significance: Let’s see: lexical scoping, structured programming, nested functions, formal language specifications, call-by-name semantics, BNF grammars, block comments… every modern language today is deeply influenced by ALGOL.
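Of that list, call-by-name is the one feature that did not survive into mainstream languages: the argument expression is re-evaluated at every use. A sketch of the semantics in Python (which is not call-by-name, so a mutable cell and a thunk stand in for the "names" - this is the classic Jensen's device, not ALGOL syntax):

```python
def algol_sum(i, lo, hi, term):
    """real procedure Sum(i, lo, hi, term) -- i and term by name."""
    total = 0
    for v in range(lo, hi + 1):
        i[0] = v          # assigning to the name i ...
        total += term()   # ... changes what term evaluates to
    return total

i = [0]                   # the variable behind the name i
# sum of squares 1..5: the expression i*i is re-evaluated per step
squares = algol_sum(i, 1, 5, lambda: i[0] * i[0])
```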
Cause of Death: ALGOL was a research language, not a commercial language. It was designed for studying algorithms. The spec didn’t define any I/O, which kinda made it impossible to use in practice. Sure, you could write a compiler extension, but then you might as well add other stuff too.
And that’s exactly what people did. In the 1960s and 70s, people made a huge number of ALGOL-likes by extending ALGOL with I/O and extra data structures. This includes JOVIAL, SIMULA, CLU, and CPL. Later languages were then based off these extensions, not ALGOL directly. We call C an “ALGOL-like”, but it’s actually a BCPL-like, which was a CPL-like, which was an ALGOL-like. ALGOL’s children buried it.
Eventually the ALGOL people tried to extend it into ALGOL-68, which radically departed from ALGOL-60 and hasn’t had close to the same influence. The ALGOL-60 lineage continues with Niklaus Wirth’s Pascal.
APL
Background: Ken Iverson, 1962. Originally a hand-written notation for array math, IBM picked it up and used it as a programming language. As a language, APL focused on array processing: being able to concisely manipulate large blocks of numbers.
If you’ve heard of APL before, you probably know it as “that weird symbol language”. One of the most notorious code snippets in any language is its one-line implementation of the Game of Life [shown as an image in the original post].
You had to write it with a specialized keyboard, like this:
Nonetheless, APL got popular on mainframes because it could run with very low memory requirements.
Significance: Array processing. At a time when adding two lists of numbers meant a map or a loop, APL introduced the idea of operating on the entire array at once. For example:
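Sketched in NumPy terms (NumPy being one of APL's descendants), the contrast looks like this:

```python
import numpy as np

# Pre-APL style: element by element, with an explicit loop or map
a, b = [1, 2, 3, 4], [10, 20, 30, 40]
summed = [x + y for x, y in zip(a, b)]

# APL style: the whole array is the unit of operation
a, b = np.array(a), np.array(b)
print(a + b)  # [11 22 33 44]

# APL's outer product ∘.× applies × to every pair of elements
print(np.multiply.outer(np.arange(1, 4), np.arange(1, 4)))
```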
This was a really big deal in scientific circles. So much applied math boils down to large-scale operations on large matrices. When you can just take the outer product with ∘.f, it’s really damn easy to take outer products!
Through this innovation APL led to R, numpy, pandas, Matlab, etc. There are also the direct descendants of APL: J, Dyalog, K, Q. They’ve been less successful but still see lots of use in the finance sector.
Cause of Death: Well, the obvious problem is the keyboards. If you can’t write it in ASCII, you’re not going to write very much of it. Iverson fixed this with J, which uses digraphs instead of different symbols. Instead of ≠, you write ~:. This was in 1990, though, which is a bit late to popularize a radically different programming style.
The subtler problem is that APL and J only worked on homogeneous data. You can’t store both strings and numbers in the same data structure (unless you use boxes, which is a whole other can of worms) and working with strings is generally a nightmare. So no dataframes, which excludes a lot of modern data science.
Further Reading: Notation as a Tool of Thought
BASIC
Background: John Kemeny, 1964. Originally a simplified FORTRAN-like, intended to help people outside engineering use computers.
BASIC really took off in the microcomputer era. The first microcomputers didn’t have enough memory to compile “real” programming languages, whereas you could cram a pared-down BASIC compiler into like 2 kilobytes. BASIC became a lingua franca for early-stage programmers. If you were programming at home in the 1970’s, you were probably writing BASIC on a microcomputer.
Significance: The biggest technical impact is runtime interpretation. BASIC was the first language with a real-time interpreter (the Dartmouth Time Sharing System), beating APL by a year. And that APL system was only available to IBM customers, so really it was BASIC or nothing for a long time.
BASIC had a bigger social impact. It brought programming to households, kids especially. Many of the influential programmers in the 80’s and 90’s first learned how to program on BASIC. Many enterprise programs were also written in BASIC, which probably helped accelerate the decline of Cobol.
BASIC has one more neat trick up its sleeve: Office tooling! Microsoft eventually turned BASIC into Visual Basic, which they used as the Office macro language. This then spread to OpenOffice and LibreOffice, entrenching BASIC in that particular niche. More recently it’s lost ground to JavaScript and is now a legacy macro language.
Cause of Death: People saw BASIC as a “lesser” language. You might use it if you were a kid or a small business owner, but real programmers used a real language. Once manufacturers could cheaply make microcomputers with more than 16k of RAM, they started deprecating BASIC for languages like Pascal and C.
BASIC lived on for a while as a legacy kids teaching language, but seems to have died out of that niche, too.
PL/I
Background: IBM, 1966. IBM’s business was split into two languages: FORTRAN for scientists and COMTRAN for business folk. Facing competition from COBOL and wanting to streamline their systems, they tried to make a language that was useful for both engineering and business purposes. This ended up looking like a sort of superset of the two languages, with a bunch of additional features stapled on top. Now everybody could use the same language, and IBM could make a lot more money! Yaaaaaaaay
Significance: The authors of ALGOL-68 mockingly called PL/I an obsolete language. But everything ALGOL-68 did, PL/I did earlier and better. While COBOL got structured data first, PL/I was the first language to implement it as a type. In COBOL, reading in a user with a name would give you two global variables, user and name. In PL/I, you’d get one variable with a field, user.name. PL/I was also the first high-level language with pointers for direct memory manipulation, constants, and function overloading.
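A rough modern sketch of that difference, in Python (the names user and name follow the example above; UserRecord is a hypothetical name):

```python
from dataclasses import dataclass

# COBOL-style: the record's fields arrive as separate, free-floating variables
user = "u123"
name = "Ada"

# PL/I-style: the record is itself a type, and the field belongs to the value
@dataclass
class UserRecord:
    user: str
    name: str

rec = UserRecord(user="u123", name="Ada")
print(rec.name)  # field access travels with the value, like PL/I's user.name
```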
Many of these ideas entered mainstream programming via C, which was a mix of both BCPL and PL/I. C even uses PL/I’s comment syntax.
Cause of Death: All the FORTRAN programmers thought it was too much like COBOL and all the COBOL programmers thought it was too much like FORTRAN. IBM had tried to take on two established languages with a much more complicated one. It didn’t help that they were the only group with the compiler, meaning everybody else was mistrustful of vendor lock-in. By the time IBM was able to make headway on both of these issues, the wider computing world had already moved on to the microcomputer era, where PL/I was outcompeted by BASIC.
Further Reading: The Choice of PL/I
SIMULA 67
Background: Ole Dahl and Kristen Nygaard, 1967. They extended ALGOL for doing simulations. First they made SIMULA I, which had dedicated simulation and “activity” syntax. SIMULA I saw some early use, but the two were dissatisfied with how “specialized” the language felt and how much duplicate code they had in their simulations. They wanted to make a more general framework for representing things in general, not simulations only.
Their idea was to allow users to define new types called “classes” with polymorphic function resolution. Then users could build the simulation features as a special case of the object system, making it easy to customize how it all worked to their particular needs.
Significance: While SIMULA wasn’t the first “true” OOP language, it was the first language with proper objects and laid much of the groundwork that others would build on. This includes the class/object split, subclassing, virtual methods, and protected attributes. It inspired almost all of the academic research into objects after 1967. Both CLU and ML cited SIMULA as a major source of inspiration. Bjarne Stroustrup did his PhD on SIMULA, eventually incorporating a lot of its ideas into C++.
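A minimal sketch of that machinery (the class/object split, subclassing, and a virtual method), written in Python rather than SIMULA since the idea carried forward unchanged:

```python
class Process:                      # the class: a template for objects
    def step(self):                 # a "virtual" method: subclasses override it
        raise NotImplementedError

class Car(Process):                 # subclassing, a SIMULA first
    def step(self):
        return "car moves"

class Pedestrian(Process):
    def step(self):
        return "pedestrian walks"

# Polymorphic resolution: the same call site dispatches on each object's class
simulation = [Car(), Pedestrian()]  # objects: instances of the classes
print([p.step() for p in simulation])  # ['car moves', 'pedestrian walks']
```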
Cause of Death: In that same PhD thesis, Stroustrup claimed that SIMULA was waaaaaay too slow to use at scale. “Good luck getting anything done if you aren’t on a mainframe” slow. It’s worth noting that Smalltalk-80, which took the same ideas even further, had an extra 13 years of Moore’s law behind it. And even Smalltalk was often mocked as too slow. Everybody went and implemented the ideas in SIMULA that they could integrate into faster, simpler languages.
Further Reading: Compiling SIMULA: a historical study of technological genesis, The History of Simula
Pascal
Background: Niklaus Wirth, 1970. Made to capture the essence of ALGOL-60 after ALGOL-68 got waaaaaay too complicated for Wirth’s liking. It first got big as the “introduction to CS” language, and by the early 80’s was the second-most popular language on the Usenet job boards. Wirth considers the whole family (Pascal, Modula, and Oberon) as a single unified language concept.
Significance: Pascal didn’t introduce any completely new ideas. It was an intentionally conservative language that tried to pick the best parts of the past decade and provide them in a unified package. Pascal brought ALGOL syntax outside academia, so much so that ALGOL’s assignment syntax, :=, got called “Pascal style” instead. From this point on most language features that look like ALGOL were more likely inspired by Pascal than directly by ALGOL itself.
While Pascal wasn’t very innovative, variants of it were. Wirth also pioneered the idea of “stepwise refinement” as a means of writing rigorous software. This eventually led to the Modulas, which popularized first-class software modules, and Euclid, the first formal verification language to see production use.
Cause of Death: I’m calling a mulligan on this one. Unlike most of the other languages on this list, Pascal didn’t face major structural barriers or a sharp competitor. Sure, it competed with C, but it was still doing fine for a very long time. People usually blame the essay Why Pascal is not my favorite programming language, but that’s too neat an answer, and history is a lot messier. Also, Delphi is still pretty high-ranked in the TIOBE and PYPL measurements, so it’s not exactly dead in the same way SIMULA is. An accurate analysis of the fall of Pascal would be longer than the rest of this essay.
Further Reading: The Programming Language Pascal, Pascal and its Successors
CLU
Background: Barbara Liskov, 1975. Liskov wanted to mess around with abstract data types. That’s it. That’s the whole reason for CLU.
Significance: CLU might be the most influential language that nobody’s ever heard of. Iterators? CLU. Abstract data types? CLU. Generics? CLU. Checked exceptions? CLU.
We didn’t adopt the same terminology, so it’s not 100% obvious it all comes from CLU, but still. Every language spec for the next decade would namedrop CLU. CLU did a lot.
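One concrete descendant: Python's generators trace back to CLU's iterators (by way of Icon). A small sketch of the pattern:

```python
def pairs(items):
    """Yield adjacent pairs: a CLU-style iterator. The consuming loop's
    body runs between yields, and the iterator's state stays suspended."""
    for i in range(len(items) - 1):
        yield items[i], items[i + 1]

print(list(pairs([1, 2, 3, 4])))  # [(1, 2), (2, 3), (3, 4)]
```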
Cause of Death: CLU was a showcase language; Liskov wanted to get people to adopt her ideas, not her specific language. And they did: almost every language today owes something to CLU. As soon as she completed CLU she moved on to Argus, which was supposed to showcase her ideas on concurrency. That hasn’t seen nearly the same adoption, and there’s still a lot of stuff in it left to mine.
Further reading: A History of CLU
ML
Background: Robin Milner, 1976. Milner was building the LCF Prover, one of the first proof assistants. If you wrote a proof in the right format, LCF could check to see if it was correct or not. To assist in writing the proofs, Milner created a metalanguage based on sound mathematical formalisms, which at the time meant strict static types and higher-order functions. Eventually ML was standardized as Standard ML.
Significance: ML is arguably the oldest “algebraic programming language”. There’s a lot of stuff we attribute to ML: algebraic data types, modules, typed functional programming. Surprisingly, it was not the first for a lot of these! The first ML was just designed to work with LCF and wasn’t a general-purpose language, so it lacked a lot of these features. As people started making it more general they pulled ideas from other research languages and incorporated them into ML. One very important idea did start in ML, though: type inference. ML was the first statically-typed language where you didn’t have to write the types out, as the compiler would figure out the types for you. This paved the way for typed FP to escape academia and enter production use.
ML also greatly influenced modern theorem provers. The “program” languages for Isabelle, CVC3, and Coq are ML-based. And a lot of type theory was based on ML, though in more recent years the Haskell branch of FP has become more popular.
Cause of Death: ML had a lot of interesting features, but people paid attention to it for the type inference. At the time ML was still a special purpose language for the theorem provers. SML came out the same year as Haskell, which was a much “purer” example of a typed FP language.
Smalltalk
Background: Alan Kay, 1972, 1976, and 1980. It’s sort of a moving target. Smalltalk-72 was the first, Smalltalk-76 introduced the idea of “object-oriented programming” to the wider world, and Smalltalk-80 was the one that saw widespread adoption.
Smalltalk wasn’t the first language with objects but it was the first “object-oriented” one. The difference was that Simula had objects in addition to primitives like numbers and booleans, while in Smalltalk, booleans were also objects. I wrote a bit about this here if you want to learn more.
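Python, which follows the Smalltalk tradition here, makes the distinction easy to see: even booleans and integer literals are full objects that respond to method calls:

```python
# In a SIMULA-style split, True and 3 would be primitives outside the
# object system. In the Smalltalk tradition, they are objects like any other:
print(isinstance(True, object))   # True
print(True.__class__)             # <class 'bool'>
print((3).bit_length())           # 2 -- even a literal responds to methods
```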
Significance: We sometimes think that Smalltalk is “true” OOP and things like Java and Python aren’t “real” OOP, but that’s not true. OOP is a giant mess of many different influences, just like every other paradigm. But it was certainly the thing that popularized the idea. If you crack open any general theory OOP book from the mid-80’s or early 90’s, they’ll be in Smalltalk. Many will also translate their examples to C++, and a few will use another language, but everybody will use Smalltalk.
Smalltalk also spread the idea of objects as shareable data, leading the way to CORBA, and it inspired the computational Actor model.
Cause of Death: The common belief is that Smalltalk lost because people used C++ instead. But that’s untrue. Smalltalk did have some issues, specifically its difficulty interoperating with other tools and poor runtime performance. But even into the 1990’s Smalltalk was doing respectable business and many people assumed it would be a dominant business language.
Then Java happened.
Smalltalk wasn’t the only casualty of the “Javapocalypse”: Java also marginalized Eiffel, Ada95, and pretty much everything else in the OOP world. The interesting question isn’t “Why did Smalltalk die”, it’s “Why did C++ survive”. I think it’s because C++ had better C interop so was easier to extend into legacy systems.
This is just a small sample of the important dead languages. I didn’t cover ALPHARD, ALTRAN, Argus, Automath, BCPL, COMTRAN, CPL, Eiffel, FLOW-MATIC, HOPE, Hypercard, ISWIM, JOVIAL, MacSyma, Mesa, Miranda, Multics Shell, PLANNER, SMP, Sketchpad, or SNOBOL. All of them contributed in their own way to the modern programming world. History is complicated.
Most influential languages never went mainstream. Few people used any one of them. But each one inspired people, who inspired other people, so the DNA of these languages appears decades after they’re forgotten. But there are also untold languages that didn’t get their ideas out. The Encyclopaedia of Programming Languages lists over 8,000 programming languages. Many of them had ideas that never left their bubble. Consider how much we’d have lost if nobody had heard of SIMULA, or if Liskov had never shared CLU.
That’s one reason I love studying history. To learn what we’ve lost and find it again.
The first draft of this was originally shared on my newsletter. If you found this interesting, why not subscribe?
Backus Receives NAE's Draper Prize for Development of FORTRAN
(source: http://www.crpc.rice.edu/newsletters/oct93/news.backus.html)
John Backus has received the 1993 National Academy of Engineering's Charles Stark Draper Prize in recognition of his development of FORTRAN, a programming language used throughout science and industry.
The Draper Prize is given for engineering achievement that contributes to human welfare and freedom. In developing FORTRAN, Backus not only developed a new computer language, he helped to create an entire industry that would benefit from its use. FORTRAN provided the framework for the development of most high-level programming languages and has been used for everything from video games, air traffic control systems, payroll calculations, and compiler writing. High-level computer languages now form the base of a $23 billion software industry.
In 1950, Backus began work at IBM, where he helped design the IBM 701 and 704 computers in addition to the FORTRAN I and II optimizing compilers. He has also helped to design Algol 58 and Algol 60. Backus was elected to the National Academy of Sciences in 1974 and the National Academy of Engineering in 1977. In addition to the Draper Prize, he has received the McDowell Award and the Turing Award for his contributions to computer science.
(source: http://l.web.umkc.edu/lz74d/CS441/Homework3.htm)
Major Contribution
Motivation
Impact on Language Implementation
to Development
for Development Readability Writability Reliability Source FORTRAN 1954 One of the features of FORTRAN I, and all of its successors except 90, that allow highly optimized compilers are that the types and storage for all variables are fixed before run time modules). The environment in which FORTRAN was developed was as follows: (1) Computers were still small, slow, and relatively unreliable. (2) The primary use of computers was for scientific computations. (3) There were no existing efficient ways to program computers. (4) Because of the high cost of computers compared to the cost of programmers, speed of the generated object code was the primary goal of the first FORTRAN compilers. So I think FORTRAN did increase the readability of the major programming language from the late 1940s to the mid-1950s for it was evolved more with the features of simplicity and orthogonality, getting more new control statement, data structures (Ext ending names to six characters, independent compilation, type names that could be parameter, character string handling, declarations and allow subprogram Increase writability too. The same reason as readability. However, I dont think its syntax considerations and expressivity is very good. Its reliability is high since it got parameter type checking (FORTRAN 90), aliasing (FORTRAN 0 and I had user-defined programs) and its comparatively high readability and writability. Sebesta, Robert W, Concepts of Programming Languages (5th ed.2003) Algol 58 1958 Formalized concept of data type, compound statements Good used for scientific programming, syntax be close to math notation Hard to read programs, Following code difficult Hard to write parameters, hard to write structure of programs Read and write hard then reliability hard, hard modify programs Sebesta, Robert W, Concepts of Programming Languages (5th ed.2003) LISP 1958 LISP was the first functional programming language. It greatly influenced artificial intelligence (AI) from its inception. 
LISP was designed in 1958 by John McCarthy and Marvin Minsky, while working on the MIT AI Project. One of the first goals of the project was to produce a list processing system. The Fortran List Processing Language already existed; however, it did not support recursion, conditional expressions, dynamic storage allocation or implicit deallocation. Dynamically allocated linked lists and implicit deallocation were requirements for symbolic differentiation. McCarthys and Minskys list processing system was to be used to implement a program named the Advice Taker, but the application provided the incentive to develop the new list processing language LISP. Since LISP is a functional language, assignment statements and variables are not necessary. Functions are applied to arguments. LISP also uses recursive function calls, rather than loops. The simplicity of the LISP language does make is less readable and harder to understand for someone not familiar with the language. However, the simplicity and orthogonality are impressive to some language researchers. The profuse use of parentheses can also decrease the readability of this language. The simplicity and orthogonality of LISP have a huge impact on writability. They make it easier to write programs in LISP. The lack of assignments and variables also increases the writability of the language. LISP is a fairly reliable functional programming language. It is the most widely used language for artificial intelligence. Most of the past inefficiencies of LISP code have been eliminated. The code is ran faster because it is compiled and not ran on an interpreter. LISP is a reliable functional programming language. Sebesta, Robert W, Concepts of Programming Languages, (6th ed., 2004)
http://users.erols.com/ziring/dopl.html
COBOL 1959 Business Application Programming Language
Hierarchial Data Structures
English like nonmathematical syntax.
COBOL was developed in 1959 to fulfill the needs of a growing base of business computer users. COBOL was not built in an attempt to create a universal programming language, but rather to create a practical language for business applications. COBOL succeded in that goal, resulting in the fact that there currently exist more lines of COBOL code than any other programming language. COBOL was the first language to implement English style commands rather than a more mathematics syntax. This decision fell in line with the goal of making COBOL usable for a broader base of programmers by making it more readable, and thus accessible to business professions with less mathematical and computer science background.
COBOL defines every variable is a special data division of the code, thus implementing a data dictionary which makes reading and understanding the code easier.
COBOL is the first language to implement heirarchial data structures, providing the first step toward data abstraction which makes programs more readable by managing complexity.
The use of English style in COBOL commands allows the programmer to think more in terms of the problem being solved and less in terms of the idiosyncrasies of the programming language and machine implementation. The abstraction provided by hierarchical data structures makes code less complex, thus easier to write. Finally, because of its business application focus, COBOL often trades power for usability. Therefore, while a programmer may have a difficult time writing scientific applications in COBOL because of the lack of powerful features, the usability gained in the area of business application allows the programmer to again focus on the problem not the language, improving writability. Reliability, as a function of how well a program is understood and can be modified correctly, is increased by COBOLs gains in readability and writeability. Clearly, based on its widespread use in business critical applications, COBOL has proven itself a reliable programming tool. Sebesta, Robert W, Concepts of Programming Languages, (6th ed., 2004)
http://www.legacyj.com/cobol/cobol_history.html
Algol 60 1960 Public algortihms, Machine independent Make modifications to Algol 58 Understanding difficult, BNF make hard to read Input and output statements Implementation inefficient, BNF hard to modify Sebesta, Robert W, Concepts of Programming Languages (5th ed. 2003) BASIC 1963 The Basic design came from Fortran and Algol 60. Algol 60 had a minor influence, but with the help of these two languages Basic was born. In 1963 Kemeny began the work on the complier for the first version of Basic. Basic was designed in the spring of 1963 especially for the liberal art students. John Kemeny and Thomas Kurtz felt that the language was needed for those students who didnt have the computer science background. Since, Basic was based off of Fortran and Agol 60 the readability of Basic already had a solid ground. Basic has been criticized for the poor structure of programs written in it, among other things. By Sebesta evaluation criteria, the language does indeed fare very poorly. (Sebasta, 67) Basic received its reliability on the success and the ease in which it can be learned, and implemented on any computer. Sebesta, Robert W, Concepts of Programming Languages (6th ed. 2004) PL\I 1965 PL/I represents the first large -scale attempt to design a language that could be used for a broad spectrum of application areas. First Concurrency. First exception handling. Switch-selectable recursion. First pointers. First array cross sections. In the 1970s, it enjoyed significant use in both business and scientific applications. By the early 1960s, the users of computers in industry had settled into two separate and quite different camps. Scientific programmers used the floating-point data type and arrays extensively. For business applications, people needed the decimal and character string data types. Until early 1963, IBM planners perceived the beginnings of a change in this situation. 
These perceptions quite naturally led to the concept of designing a single universal computer that would be capable of doing both floating-point and decimal arithmetic, and therefore both scientific and business applications. Along with this came the idea of a programming language that could be used for both business and scientific applications. Since Many of the new features of PL/I were poorly designed and too large and too much redundancy, it's readability is poor for me. Even though PL/I is not simple and orthogonal enough, but it was (and still is) actually used for both scientific and business applications. And from my point of view, its writability is good. Even PL/I was first one had exception handling, and ok writability, but its readability is so poor and not widely used as FORTRAN and COBOL, so I think its reliability is not so good. Sebesta, Robert W, Concepts of Programming Languages (5th ed. 2003) SIMULA 67 1967 SIMULA introduced important object-oriented programming concepts like classes and objects, inheritance, and dynamic binding. It was originally designed and implemented as a language for discrete event simulation. The basic concept of speaking in general and more specific terms had been adopted as a way of expressing reality in the context of a programming language SIMULA began with the idea of classes and objects to encapsulate procedures. This made the writability of languages to vastly improve, since all the procedures were in one location and removed the superfluous information. Classes increase reliability due to the object-oriented concepts, which were later developed in Java. Sebesta, Robert W, Concepts of Programming Languages (6th ed. 2004)
http://java.sun.com/people/jag/SimulaHistory.html
Pascal 1971 The major contributions of Pascal were the value-result method of passing parameters and the case statement for multiple selections.
The largest contribution that Pascal had on the programming world was the teaching of the language. In the early 1970s FORTRAN, PL/I, and ALGOL-W were common languages taught to computer science students. By the mid 1970s Pascal had become the most widely used language to teach students. Pascal was the standard language to teach for almost two decades until the early 1990s. Pascal has affected almost every block-structured language designed since its creation and still does today
Niklaus Wirth a member of the International Federation of Information Processing (IFIP) Working Group 2.1, which was created to continue the development of ALGOL in the mid 1960s. In August of 1965, Wirth and Tony Hoare contributed to that effort by presenting a proposal for additions and modifications to ALGOL 60. Their, proposal was rejected because it was considered to be too small to advance ALGOL 60. Later on, ALGOL 68 was released but Wirth and Hoare believed that it should not have been released because of its complexity. So they set out to develop and better version of ALGOL called ALGOL-W. From ALGOL-W the programming language known as Pascal was born. Pascal was created to be a simple language that would be widely used and easily teachable. Pascals impact on readability was huge because its simplicity and expressivity. Programs were easy to read through and it became the standard for readability for teaching young programmers how to write simple programs. Not only has Pascals impact on readability been enormous to programming, it also had a large impact on writability. It was easy for students to write small programs to learn the language quickly through in their classes. However, Pascal lacked many complex programming issues such as the impossibility of writing subprograms that take a parameter an array of variable length. Another problem with Pascal was the lack of any separate compilation capability. Although Pascal lacked the ability to do many things it was still considered to be a more stable than FORTRAN and C. It provided a safety net for young programmers to not allow them to make errors that they could have easily made in other languages. It did this because it was such and small language compared to it predecessors. Sebesta, Robert W, Concepts of Programming Languages (6th ed. 2004)
http://cgibin.erols.com/ziring/cgi-bin/cep/cep.pl?_key=Pascal
SNOBOL 1971 SNOBOL provided a language which could be used as if it were an object-oriented language, a logical programming language, a functional language, or a standard imperative language. This was made possible by changing the set of features used to write a program. SNOBOL was the fourth and final incarnation of a series of special purpose programming languages intended to make character string manipulation easier.
Sebesta, Robert W, Concepts of Programming Languages, (6th ed., 2004)
http://www.wikipedia.org/wiki/SNOBOL
C 1978 Ken Thompson wanted to create a comfortable computing environment constructed according to his own design using whatever means necessary. By early 1973 the language and compiler were strong enough to rewrite the Unix kernel for the PDP-11. C was developed because in the late 1960s 3 companies by the names of Bell Telephone Laboratories, MIT, and GE were in a project development with Multics. Multics development could not be fulfilled in time and was becoming very expensive. The language shows considerable power to describe important concepts. Its easy to read and there are different types of libraries set up within the language to help the user. The relationship between arrays and pointer, and the way in which declaration syntax mimics expression syntax. They are also among its most frequently criticized features and often secure as stumbling blocks to the beginner. C is quirky, flawed, and an enormous success. While accidents of history surely helped, it evidently satisfied a need for a system implementation language efficient enough to displace assembly language, yet sufficiently abstract and fluent to describe algorithms and interactions in a wide variety of environments. Dennis Ritchie, http://cm.bell-labs.com/cm/cs/who/dmr/chist.html, 1993 PROLOG 1979 Prolog is a dynamic high level language with untyped variables, encouraging a natural level of abstraction, and discouraging you from hard-wiring early assumptions into your code. Its major contribution has been that it allows Prolog programmers to be closer to the problem to be solved, more adaptable to changes in that problem, and easier to maintain and support. Alain Colmerauer and Phillippe Roussel at the University of Aix-Marseille, with the help of Robert Kowalski at University of Edinburgh, developed the fundamental design for Prolog.
Colmerauer and Roussel were interested in natural language processing, and Kowalski was interested in automated theorem proving. The collaboration between them and the two universities on this new language continued until the mid 1970s. In 1972, their efforts led to the use of formal logic as the basis for a programming language. Kowalski's research provided the theoretical framework, while Colmerauer's resulted in the programming language. After that time both universities received little attention on their research and conducted independent studies on the prolog language until 1981. At that time the Japanese government was launching a large research project called the Fifth Generation Computing Systems (FGCS). The primary objective of this project was to develop intelligent machines, and Prolog was chosen as the basis for their study. This caused other researchers and governments to soon take notice of the Prolog language which started a 10 year long interest around the world in artificial intelligence and logic programming. Readability is difficult because of many unresolved problems with the language. The main problem with it is that it is hard to follow the order of the program and much of the programming is without any set basis of rules. This makes it very difficult for another programmer to figure out what he or she is reading without detailed notes to accompany the program. However, Prolog does offer unification which provides the ability for variables to instantiated values to provide matches. To sum up the impact on readability has been that it has lead to many user-defined programming languages.
Prolog has good writability, and it is considered the standard in logic programming and AI programming. It is a fairly simple language to program with because it is based on facts, rules, and goals, with structures composed mainly of atomic propositions and logic operators.
Finally, the impact that Prolog has had on writability lies in the advantages and disadvantages that unification introduced to the world of programming, causing many future languages to copy its platform.
Reliability of Prolog is simply not good. It is inefficient in many ways, including infinite loops, and the programmer must sometimes state control-flow information explicitly in their programs. It is very easy for a programmer to write statements that cause infinite loops, resulting in total program failure. There are also the problems of the closed-world assumption and negation. In sum, Prolog's reliability has made it a model of what not to do in order to create a reliable programming language. Sebesta, Robert W, Concepts of Programming Languages, (6th ed., 2004)
Roth, Allison, YOU CAN'T IGNORE PROLOG
http://web.archive.org/web/20010708010543/www.practical-applications.co.uk/Prolog/alli.html
SmallTalk 1980 First widely used implementation of object oriented programming.
Implemented the first example of a graphical windowing human interface system, both for program user interfaces and for integrated development tools.
Smalltalk was originally developed by Dr. Alan Kay to extend the ideas of object-oriented programming, which were first developed in a limited form in Simula-67. Everything in Smalltalk is an object, and information is passed between objects by way of messages. The object-oriented programming framework of Smalltalk was developed concurrently with a plan to create a dynamic windowing human interface system which would bring computing to a less technical class of user. This windowing system is also integral to Smalltalk's first implementation of an integrated software development environment. Therefore, Smalltalk is not only a programming language but also a software development tool which created a model for all modern graphical software development environments. Reading Smalltalk code is more like reading a text representation of a semantic chart than reading computer code, due to the complete implementation of object-oriented concepts in Smalltalk. Therefore, Smalltalk is very readable to a programmer with knowledge of object-oriented programming, but will make little sense otherwise. The simple expressive syntax of Smalltalk uses human nouns and verbs, leading to easy readability based on knowledge of English semantics. In terms of writability, it must be considered that Smalltalk is both a programming language and an integrated graphical software development environment. As the first language to implement such a development environment, Smalltalk makes large gains in writability across the overall programming process. Such a development environment is key to dealing with the many discrete segments of code which interact in the form of objects in Smalltalk. Individual segments of Smalltalk code will have good writability so long as the programmer subscribes to the concepts of object orientation. The use of English nouns and verbs also enhances writability by making use of existing semantic knowledge.
The powerful abstraction tool of object-oriented design itself helps to manage complexity, which makes Smalltalk programs easy to understand, write, and modify. As an interpreted system originally designed to have some of the same simulation capabilities as its predecessor Simula, Smalltalk is an excellent tool for prototyping and small systems but may not be robust enough to be reliable in very large projects. Sebesta, Robert W, Concepts of Programming Languages, (6th ed., 2004)
http://www.smalltalk.org/
Ada 1983 Ada was designed specifically for the use of the Department of Defense (DoD). It included many object-oriented programming features and was designed to promote reliability. The requirements phase of Ada began in 1975, with the official version of the language being finalized in 1983. Ada included numerous features: it was for a time the only widely used language that included exception handling, and it was one of the first languages to offer full support for abstract data types. Ada was developed for use by the Department of Defense as a high-level language for embedded systems. The Ada language was based on the programming language Pascal. Ada's impact on readability was in part due to its use of data types, control statements, and structures; additionally, the use of special words greatly increases the readability of the language. The Ada language is so large and complex that this decreases its writability. However, many factors are also beneficial to the writability of Ada, including abstraction, expressivity, and generic procedures. A compiler can generate a version of a generic procedure using a specific data type, reducing the amount of code to be written. Exception handling increased the reliability of Ada; the benefits of exception handling in the languages PL/I and CLU were also included in the design of Ada. In 1995, Ada began to support object-oriented programming. The complexity of the language may reduce its reliability; however, the readability and writability of the language have a positive impact on reliability. Sebesta, Robert W, Concepts of Programming Languages, (6th ed., 2004)
http://users.erols.com/ziring/dopl.html
C++ 1984 Parameter type checking, derived classes, public/private access of components. Evolved from C through modifications that improved its imperative features and added object-oriented programming. Good compilers are available, but it is a large and complex language; most C programs can be compiled as C++ programs. Sebesta, Robert W, Concepts of Programming Languages (5th ed. 2003) JAVA 1993 Java provides a secure, cross-platform way for code to be executed. Java also added several features to existing websites, such as arbitrary graphics, arbitrary data types, less load on the server, and more user interaction. Java was designed mainly to enable people to develop programs that use the Internet and the World-Wide Web. Java prevents casting between arbitrary variables. Only casts between numeric variables and between sub- and superclasses of the same object are allowed. All conditional tests must return boolean values. Java supports both the "/* ... */" and "//" commenting conventions, but comments that begin with "/**" are special. These comments should only be used before a method or class declaration; they indicate that the comment should be included in automatically generated documentation for that declaration. Java provided a new and higher level of reliability by eliminating most of the assignment coercions that are done in C++, and by using an "index range" method of accessing arrays. Sebesta, Robert W, Concepts of Programming Languages, (6th ed., 2004)
http://www.shu.ac.uk/java/aboutthisbook/pref2.htm
http://www.ibiblio.org/javafaq/javafaq.html
JavaScript 1995 Brought full interactivity and sophisticated user interface and typography concepts to the formerly static Web. Tasked with making Navigator's newly added Java support more accessible to non-Java programmers, Brendan Eich eventually decided that a loosely typed scripting language suited the environment and audience, namely the few thousand web designers and developers who needed to be able to tie into page elements (such as forms, or frames, or images) without a bytecode compiler or knowledge of object-oriented software design. The language he created was christened "LiveScript," to reflect its dynamic nature, but was quickly (before the end of the Navigator 2.0 beta cycle) renamed JavaScript. JavaScript's capabilities can't be extended or overridden by developers. A language that apparently required no skill to use, lacked an IDE and a reliable cross-platform debugger, and could only be tested in the actual browser in which the page would be viewed, combined with a few highly publicized security flaws and several books aimed at non-programmers, caused many to write off JavaScript as a "simple" language for beginners and overshadowed its amazing potential. Sebesta, Robert W, Concepts of Programming Languages, (6th ed., 2004)
http://www.oreillynet.com/pub/a/javascript/2001/04/06/js_history.html
PHP 1995 One of the biggest strengths of PHP 3.0 was its strong extensibility features. PHP began as a simple set of Perl scripts that Rasmus Lerdorf wrote to track accesses to his online resume. He named this set of scripts 'Personal Home Page Tools'. As more functionality was required, Rasmus wrote a much larger C implementation, which was able to communicate with databases and enabled users to develop simple dynamic Web applications. Because PHP is modeled on Perl, it has many of the problems associated with Perl: as in Perl, an instruction can be written in a variety of ways, which reduces the readability of the language. Because one can write an instruction in a variety of ways, this increases the writability of the language. As the language's strongest attribute is its extensibility, this has forced PHP to be rather reliable. Sebesta, Robert W, Concepts of Programming Languages, (6th ed., 2004)
http://us2.php.net/history
General-purpose programming language
Fortran
Paradigm: Multi-paradigm: structured, imperative (procedural, object-oriented), generic, array
Designed by: John Backus
Developer: John Backus and IBM
First appeared: 1957
Stable release: Fortran 2023 (ISO/IEC 1539:2023), November 17, 2023
Typing discipline: strong, static, manifest
Filename extensions: .f90, .f, .for
Website: fortran-lang.org
Major implementations: Absoft, Cray, GFortran, G95, IBM XL Fortran, Intel, Hitachi, Lahey/Fujitsu, Numerical Algorithms Group, Open Watcom, PathScale, PGI, Silverfrost, Oracle Solaris Studio, others
Influenced by: Speedcoding
Influenced: ALGOL 58, BASIC, C, Chapel,[1] CMS-2, DOPE, Fortress, MATLAB, PL/I, PACT I, MUMPS, IDL, Ratfor, SAKO[2]
Fortran (formerly FORTRAN) is a third-generation, compiled, imperative programming language that is especially suited to numeric computation and scientific computing.
Fortran was originally developed by IBM.[3] It first compiled correctly in 1958.[4] Fortran computer programs have been written to support scientific and engineering applications, such as numerical weather prediction, finite element analysis, computational fluid dynamics, geophysics, computational physics, crystallography and computational chemistry. It is a popular language for high-performance computing[5] and is used for programs that benchmark and rank the world's fastest supercomputers.[6][7]
Fortran has evolved through numerous versions and dialects. In 1966, the American National Standards Institute (ANSI) developed a standard for Fortran to limit proliferation of compilers using slightly different syntax.[8] Successive versions have added support for a character data type (Fortran 77), structured programming, array programming, modular programming, generic programming (Fortran 90), parallel computing (Fortran 95), object-oriented programming (Fortran 2003), and concurrent programming (Fortran 2008).
Since April 2024, Fortran has ranked among the top ten languages in the TIOBE index, a measure of the popularity of programming languages.[9]
Naming
The first manual for FORTRAN describes it as a Formula Translating System, and printed the name with small caps, Fortran.[10]: p.2 [11] Other sources suggest the name stands for Formula Translator,[12] or Formula Translation.[13]
Early IBM computers did not support lowercase letters, and the names of versions of the language through FORTRAN 77 were usually spelled in all-uppercase.[14] FORTRAN 77 was the last version in which the Fortran character set included only uppercase letters.[15]
The official language standards for Fortran have referred to the language as "Fortran" with initial caps since Fortran 90.
Origins
In late 1953, John W. Backus submitted a proposal to his superiors at IBM to develop a more practical alternative to assembly language for programming their IBM 704 mainframe computer.[11]: 69 Backus' historic FORTRAN team consisted of programmers Richard Goldberg, Sheldon F. Best, Harlan Herrick, Peter Sheridan, Roy Nutt, Robert Nelson, Irving Ziller, Harold Stern, Lois Haibt, and David Sayre.[16] Its concepts included easier entry of equations into a computer, an idea developed by J. Halcombe Laning and demonstrated in the Laning and Zierler system of 1952.[17]
A draft specification for The IBM Mathematical Formula Translating System was completed by November 1954.[11]: 71 The first manual for FORTRAN appeared in October 1956,[10][11]: 72 with the first FORTRAN compiler delivered in April 1957.[11]: 75 Fortran produced efficient enough code for assembly language programmers to accept a high-level programming language replacement.[18]
John Backus said during a 1979 interview with Think, the IBM employee magazine, "Much of my work has come from being lazy. I didn't like writing programs, and so, when I was working on the IBM 701, writing programs for computing missile trajectories, I started work on a programming system to make it easier to write programs."[19]
The language was widely adopted by scientists for writing numerically intensive programs, which encouraged compiler writers to produce compilers that could generate faster and more efficient code. The inclusion of a complex number data type in the language made Fortran especially suited to technical applications such as electrical engineering.[20]
By 1960, versions of FORTRAN were available for the IBM 709, 650, 1620, and 7090 computers. Significantly, the increasing popularity of FORTRAN spurred competing computer manufacturers to provide FORTRAN compilers for their machines, so that by 1963 over 40 FORTRAN compilers existed.
FORTRAN was provided for the IBM 1401 computer by an innovative 63-phase compiler that ran entirely in its core memory of only 8000 (six-bit) characters. The compiler could be run from tape, or from a 2200-card deck; it used no further tape or disk storage. It kept the program in memory and loaded overlays that gradually transformed it, in place, into executable form, as described by Haines.[21] This article was reprinted, edited, in both editions of Anatomy of a Compiler[22] and in the IBM manual "Fortran Specifications and Operating Procedures, IBM 1401".[23] The executable form was not entirely machine language; rather, floating-point arithmetic, sub-scripting, input/output, and function references were interpreted, preceding UCSD Pascal P-code by two decades. GOTRAN, a simplified, interpreted version of FORTRAN I (with only 12 statements not 32) for "load and go" operation was available (at least for the early IBM 1620 computer).[24] Modern Fortran, and almost all later versions, are fully compiled, as done for other high-performance languages.
The development of Fortran paralleled the early evolution of compiler technology, and many advances in the theory and design of compilers were specifically motivated by the need to generate efficient code for Fortran programs.
FORTRAN
The initial release of FORTRAN for the IBM 704[10] contained 32 statements, including:
DIMENSION and EQUIVALENCE statements
Assignment statements
Three-way arithmetic IF statement, which passed control to one of three locations in the program depending on whether the result of the arithmetic expression was negative, zero, or positive
Control statements for checking exceptions (IF ACCUMULATOR OVERFLOW, IF QUOTIENT OVERFLOW, and IF DIVIDE CHECK); and control statements for manipulating sense switches and sense lights (IF (SENSE SWITCH), IF (SENSE LIGHT), and SENSE LIGHT)
GO TO, computed GO TO, ASSIGN, and assigned GO TO
DO loops
Formatted I/O: FORMAT, READ, READ INPUT TAPE, WRITE OUTPUT TAPE, PRINT, and PUNCH
Unformatted I/O: READ TAPE, READ DRUM, WRITE TAPE, and WRITE DRUM
Other I/O: END FILE, REWIND, and BACKSPACE
PAUSE, STOP, and CONTINUE
FREQUENCY statement (for providing optimization hints to the compiler).
The arithmetic IF statement was reminiscent of (but not readily implementable by) a three-way comparison instruction (CAS—Compare Accumulator with Storage) available on the 704. The statement provided the only way to compare numbers—by testing their difference, with an attendant risk of overflow. This deficiency was later overcome by "logical" facilities introduced in FORTRAN IV.
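The three-way dispatch of the arithmetic IF can be sketched in modern terms. This is a hypothetical Python rendering for illustration only, not part of any Fortran toolchain; the function name and the lambda-based "labels" are invented here:

```python
def arithmetic_if(value, neg_branch, zero_branch, pos_branch):
    """Emulate FORTRAN I's three-way arithmetic IF: control passes to
    one of three "labels" depending on the sign of the tested
    expression. Note that comparing A and B by testing A - B, as
    FORTRAN I required, risks overflow on fixed-width integers."""
    if value < 0:
        return neg_branch()
    elif value == 0:
        return zero_branch()
    else:
        return pos_branch()

# IF (A - B) 10, 20, 30 — the only way to compare numbers in FORTRAN I:
result = arithmetic_if(3 - 5, lambda: "A < B", lambda: "A == B", lambda: "A > B")
```

The logical IF of FORTRAN IV replaced this sign-based dispatch with ordinary boolean conditions.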
The FREQUENCY statement was used originally (and optionally) to give branch probabilities for the three branch cases of the arithmetic IF statement. It could also be used to suggest how many iterations a DO loop might run. The first FORTRAN compiler used this weighting to perform at compile time a Monte Carlo simulation of the generated code, the results of which were used to optimize the placement of basic blocks in memory—a very sophisticated optimization for its time. The Monte Carlo technique is documented in Backus et al.'s paper on this original implementation, The FORTRAN Automatic Coding System:
The fundamental unit of program is the basic block; a basic block is a stretch of program which has one entry point and one exit point. The purpose of section 4 is to prepare for section 5 a table of predecessors (PRED table) which enumerates the basic blocks and lists for every basic block each of the basic blocks which can be its immediate predecessor in flow, together with the absolute frequency of each such basic block link. This table is obtained by running the program once in Monte-Carlo fashion, in which the outcome of conditional transfers arising out of IF-type statements and computed GO TO's is determined by a random number generator suitably weighted according to whatever FREQUENCY statements have been provided.[16]
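The technique in the quoted passage can be illustrated with a small sketch. This is hypothetical Python with a made-up flow graph; the block names and branch weights stand in for the compiler's internal tables and FREQUENCY hints:

```python
import random

# A toy flow graph: each basic block maps to (successors, branch weights).
# The weights play the role of FREQUENCY statements on conditional transfers.
FLOW = {
    "entry": (["loop"], [1.0]),
    "loop":  (["body", "exit"], [0.9, 0.1]),  # assume the loop continues 90% of the time
    "body":  (["loop"], [1.0]),
    "exit":  ([], []),
}

def simulate(runs=1000, seed=42):
    """Run the graph in Monte Carlo fashion and count how often each
    basic block is entered, as the first FORTRAN compiler did to
    decide basic-block placement in memory."""
    rng = random.Random(seed)
    counts = {block: 0 for block in FLOW}
    for _ in range(runs):
        block = "entry"
        while True:
            counts[block] += 1
            successors, weights = FLOW[block]
            if not successors:
                break
            block = rng.choices(successors, weights=weights)[0]
    return counts

counts = simulate()
# "loop" and "body" dominate the counts, so a placement pass would
# put those blocks adjacently in memory.
```

The real compiler used the resulting link frequencies to optimize the placement of basic blocks, as the paper describes.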
The first FORTRAN compiler reported diagnostic information by halting the program when an error was found and outputting an error code on its console. That code could be looked up by the programmer in an error messages table in the operator's manual, providing them with a brief description of the problem.[10]: p.19–20 [25] Later, an error-handling subroutine to handle user errors such as division by zero, developed by NASA,[26] was incorporated, informing users of which line of code contained the error.
Fixed layout and punched cards
Before the development of disk files, text editors and terminals, programs were most often entered on a keypunch keyboard onto 80-column punched cards, one line to a card. The resulting deck of cards would be fed into a card reader to be compiled. Punched card codes included no lower-case letters or many special characters, and special versions of the IBM 026 keypunch were offered that would correctly print the re-purposed special characters used in FORTRAN.
Reflecting punched card input practice, Fortran programs were originally written in a fixed-column format, with the first 72 columns read into twelve 36-bit words.
A letter "C" in column 1 caused the entire card to be treated as a comment and ignored by the compiler. Otherwise, the columns of the card were divided into four fields:
1 to 5 were the label field: a sequence of digits here was taken as a label for use in DO or control statements such as GO TO and IF, or to identify a FORMAT statement referred to in a WRITE or READ statement. Leading zeros are ignored and 0 is not a valid label number.
6 was a continuation field: a character other than a blank or a zero here caused the card to be taken as a continuation of the statement on the prior card. The continuation cards were usually numbered 1, 2, etc. and the starting card might therefore have zero in its continuation column—which is not a continuation of its preceding card.
7 to 72 served as the statement field.
73 to 80 were ignored (the IBM 704's card reader only used 72 columns).[27]
Columns 73 to 80 could therefore be used for identification information, such as punching a sequence number or text, which could be used to re-order cards if a stack of cards was dropped; though in practice this was reserved for stable, production programs. An IBM 519 could be used to copy a program deck and add sequence numbers. Some early compilers, e.g., the IBM 650's, had additional restrictions due to limitations on their card readers.[28] Keypunches could be programmed to tab to column 7 and skip out after column 72. Later compilers relaxed most fixed-format restrictions, and the requirement was eliminated in the Fortran 90 standard.
Within the statement field, whitespace characters (blanks) were ignored outside a text literal. This allowed omitting spaces between tokens for brevity or including spaces within identifiers for clarity. For example, AVG OF X was a valid identifier, equivalent to AVGOFX, and 101010DO101I=1,101 was a valid statement, equivalent to 10101 DO 101I=1,101 because the zero in column 6 is treated as if it were a space (!), while 101010DO101I=1.101 was instead 10101 DO101I=1.101, the assignment of 1.101 to a variable called DO101I. Note the slight visual difference between a comma and a period.
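The column rules above can be made concrete with a small sketch. This is hypothetical Python, not any real compiler front end, and this toy version strips all blanks from the statement field, ignoring the text-literal exception:

```python
def parse_card(card):
    """Split a fixed-form FORTRAN card into its fields.
    Columns: 1-5 label, 6 continuation, 7-72 statement, 73-80 sequence."""
    card = card.ljust(80)          # pad short lines to a full 80-column card
    if card[0] in "Cc":            # a C in column 1 makes the whole card a comment
        return {"comment": card.rstrip()}
    return {
        "label": card[0:5].strip(),
        # blank or zero in column 6 means "not a continuation"
        "continuation": card[5] not in (" ", "0"),
        # blanks outside text literals are ignored, so AVG OF X == AVGOFX
        "statement": card[6:72].replace(" ", ""),
        "sequence": card[72:80].strip(),
    }

fields = parse_card("10101 DO 101 I = 1, 101")
```

Parsing the card above yields label `10101` and the blank-stripped statement `DO101I=1,101`, matching the equivalence described in the text.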
Hollerith strings, originally allowed only in FORMAT and DATA statements, were prefixed by a character count and the letter H (e.g., 26HTHIS IS ALPHANUMERIC DATA.), allowing blanks to be retained within the character string. Miscounts were a problem.
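The count-prefixed encoding of a Hollerith constant can be sketched the same way (hypothetical Python; the function name is invented here):

```python
def parse_hollerith(text):
    """Parse a Hollerith constant such as 26HTHIS IS ALPHANUMERIC DATA.
    The digits before the H give the exact character count, which is
    how blanks inside the string survive blank stripping."""
    digits = ""
    i = 0
    while i < len(text) and text[i].isdigit():
        digits += text[i]
        i += 1
    if not digits or i >= len(text) or text[i].upper() != "H":
        raise ValueError("not a Hollerith constant")
    count = int(digits)
    literal = text[i + 1 : i + 1 + count]
    if len(literal) != count:
        # the classic pitfall the text mentions: miscounted constants
        raise ValueError("miscounted Hollerith constant")
    return literal

s = parse_hollerith("26HTHIS IS ALPHANUMERIC DATA.")
```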
Evolution
Timeline of Fortran language

Year  Informal name  ANSI Standard  ISO/IEC Standard
1957  FORTRAN        —              —
1958  FORTRAN II     —              —
1958  FORTRAN III    —              —
1961  FORTRAN IV     —              —
1966  FORTRAN 66     X3.9-1966      —
1978  FORTRAN 77     X3.9-1978      —
1991  Fortran 90     X3.198-1992    1539:1991
1997  Fortran 95     —              1539-1:1997
2004  Fortran 2003   —              1539-1:2004
2010  Fortran 2008   —              1539-1:2010
2018  Fortran 2018   —              1539-1:2018
2023  Fortran 2023   —              1539-1:2023
FORTRAN II
IBM's FORTRAN II appeared in 1958. The main enhancement was to support procedural programming by allowing user-written subroutines and functions which returned values with parameters passed by reference. The COMMON statement provided a way for subroutines to access common (or global) variables. Six new statements were introduced:[29]
SUBROUTINE, FUNCTION, and END
CALL and RETURN
COMMON
Over the next few years, FORTRAN II added support for the DOUBLE PRECISION and COMPLEX data types.
Early FORTRAN compilers supported no recursion in subroutines. Early computer architectures supported no concept of a stack, and when they did directly support subroutine calls, the return location was often stored in one fixed location adjacent to the subroutine code (e.g. the IBM 1130) or a specific machine register (IBM 360 et seq), which only allows recursion if a stack is maintained by software and the return address is stored on the stack before the call is made and restored after the call returns. Although not specified in FORTRAN 77, many F77 compilers supported recursion as an option, and the Burroughs mainframes, designed with recursion built-in, did so by default. It became a standard in Fortran 90 via the new keyword RECURSIVE.[30]
Simple FORTRAN II program
This program, for Heron's formula, reads data on a tape reel containing three 5-digit integers A, B, and C as input. There are no "type" declarations available: variables whose name starts with I, J, K, L, M, or N are "fixed-point" (i.e. integers), otherwise floating-point. Since integers are to be processed in this example, the names of the variables start with the letter "I". The name of a variable must start with a letter and can continue with both letters and digits, up to a limit of six characters in FORTRAN II. If A, B, and C cannot represent the sides of a triangle in plane geometry, then the program's execution will end with an error code of "STOP 1". Otherwise, an output line will be printed showing the input values for A, B, and C, followed by the computed AREA of the triangle as a floating-point number occupying ten spaces along the line of output and showing 2 digits after the decimal point, the .2 in F10.2 of the FORMAT statement with label 601.
C AREA OF A TRIANGLE WITH A STANDARD SQUARE ROOT FUNCTION
C INPUT - TAPE READER UNIT 5, INTEGER INPUT
C OUTPUT - LINE PRINTER UNIT 6, REAL OUTPUT
C INPUT ERROR DISPLAY ERROR OUTPUT CODE 1 IN JOB CONTROL LISTING
      READ INPUT TAPE 5, 501, IA, IB, IC
  501 FORMAT (3I5)
C IA, IB, AND IC MAY NOT BE NEGATIVE OR ZERO
C FURTHERMORE, THE SUM OF TWO SIDES OF A TRIANGLE
C MUST BE GREATER THAN THE THIRD SIDE, SO WE CHECK FOR THAT, TOO
      IF (IA) 777, 777, 701
  701 IF (IB) 777, 777, 702
  702 IF (IC) 777, 777, 703
  703 IF (IA+IB-IC) 777, 777, 704
  704 IF (IA+IC-IB) 777, 777, 705
  705 IF (IB+IC-IA) 777, 777, 799
  777 STOP 1
C USING HERON'S FORMULA WE CALCULATE THE
C AREA OF THE TRIANGLE
  799 S = FLOATF (IA + IB + IC) / 2.0
      AREA = SQRTF (S * (S - FLOATF(IA)) * (S - FLOATF(IB)) *
     +   (S - FLOATF(IC)))
      WRITE OUTPUT TAPE 6, 601, IA, IB, IC, AREA
  601 FORMAT (4HA= ,I5,5H B= ,I5,5H C= ,I5,8H AREA= ,F10.2,
     +        13H SQUARE UNITS)
      STOP
      END
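For comparison, the same computation can be sketched in modern Python; the validity checks mirror the chain of arithmetic IF statements in the FORTRAN II program (the function name and exception messages are invented here):

```python
import math

def triangle_area(ia, ib, ic):
    """Heron's formula with the same validity checks as the FORTRAN II
    program: sides must be positive and satisfy the triangle inequality."""
    if min(ia, ib, ic) <= 0:
        raise ValueError("STOP 1: sides must be positive")
    if ia + ib <= ic or ia + ic <= ib or ib + ic <= ia:
        raise ValueError("STOP 1: triangle inequality violated")
    s = (ia + ib + ic) / 2.0                         # S = FLOATF(IA+IB+IC)/2.0
    return math.sqrt(s * (s - ia) * (s - ib) * (s - ic))

area = triangle_area(3, 4, 5)   # a 3-4-5 right triangle
```

Where the FORTRAN program relies on the I-through-N naming convention to get integer inputs and explicit FLOATF calls to convert them, Python's numeric coercion makes both implicit.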
FORTRAN III
IBM also developed a FORTRAN III in 1958 that allowed for inline assembly code among other features; however, this version was never released as a product. Like the 704 FORTRAN and FORTRAN II, FORTRAN III included machine-dependent features that made code written in it unportable from machine to machine.[11]: 76 Early versions of FORTRAN provided by other vendors suffered from the same disadvantage.
FORTRAN IV
IBM began development of FORTRAN IV starting in 1961, as a result of customer demands. FORTRAN IV removed the machine-dependent features of FORTRAN II (such as READ INPUT TAPE), while adding new features such as a LOGICAL data type, logical Boolean expressions and the logical IF statement as an alternative to the arithmetic IF statement. FORTRAN IV was eventually released in 1962, first for the IBM 7030 ("Stretch") computer, followed by versions for the IBM 7090, IBM 7094, and later for the IBM 1401 in 1966.[31]
By 1965, FORTRAN IV was supposed to be compliant with the standard being developed by the American Standards Association X3.4.3 FORTRAN Working Group.[32]
Between 1966 and 1968, IBM offered several FORTRAN IV compilers for its System/360, each named by letters that indicated the minimum amount of memory the compiler needed to run. [33] The letters (F, G, H) matched the codes used with System/360 model numbers to indicate memory size, each letter increment being a factor of two larger:[34]: p. 5
1966 : FORTRAN IV F for DOS/360 (64K bytes)
1966 : FORTRAN IV G for OS/360 (128K bytes)
1968 : FORTRAN IV H for OS/360 (256K bytes)
Digital Equipment Corporation maintained DECSYSTEM-10 Fortran IV (F40) for PDP-10 from 1967 to 1975.[35] Compilers were also available for the UNIVAC 1100 series and the Control Data 6000 series and 7000 series systems.[36]
At about this time FORTRAN IV had started to become an important educational tool and implementations such as the University of Waterloo's WATFOR and WATFIV were created to simplify the complex compile and link processes of earlier compilers.
In the FORTRAN IV programming environment of the era, except for that used on Control Data Corporation (CDC) systems, only one instruction was placed per line. The CDC version allowed for multiple instructions per line if separated by a $ (dollar) character. The FORTRAN sheet was divided into four fields, as described above.
Two compilers of the time, IBM "G" and UNIVAC, allowed comments to be written on the same line as instructions, separated by a special character: "master space": V (perforations 7 and 8) for UNIVAC and perforations 12/11/0/7/8/9 (hexadecimal FF) for IBM. These comments were not to be inserted in the middle of continuation cards.[35][36]
FORTRAN 66
Perhaps the most significant development in the early history of FORTRAN was the decision by the American Standards Association (now American National Standards Institute (ANSI)) to form a committee sponsored by the Business Equipment Manufacturers Association (BEMA) to develop an American Standard Fortran. The resulting two standards, approved in March 1966, defined two languages, FORTRAN (based on FORTRAN IV, which had served as a de facto standard), and Basic FORTRAN (based on FORTRAN II, but stripped of its machine-dependent features). The FORTRAN defined by the first standard, officially denoted X3.9-1966, became known as FORTRAN 66 (although many continued to term it FORTRAN IV, the language on which the standard was largely based). FORTRAN 66 effectively became the first industry-standard version of FORTRAN. FORTRAN 66 included:
Main program, SUBROUTINE, FUNCTION, and BLOCK DATA program units
INTEGER, REAL, DOUBLE PRECISION, COMPLEX, and LOGICAL data types
COMMON, DIMENSION, and EQUIVALENCE statements
DATA statement for specifying initial values
Intrinsic and EXTERNAL (e.g., library) functions
Assignment statement
GO TO, computed GO TO, assigned GO TO, and ASSIGN statements
Logical IF and arithmetic (three-way) IF statements
DO loop statement
READ, WRITE, BACKSPACE, REWIND, and ENDFILE statements for sequential I/O
FORMAT statement and assigned format
CALL, RETURN, PAUSE, and STOP statements
Hollerith constants in DATA and FORMAT statements, and as arguments to procedures
Identifiers of up to six characters in length
Comment lines
END line
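Several of these FORTRAN 66 constructs can be seen together in a short fixed-form fragment (a hypothetical sketch in the style of the era, not taken from the standard), showing a labeled DO loop, a three-way arithmetic IF, and Hollerith constants in FORMAT statements:

```fortran
C     HYPOTHETICAL FORTRAN 66 SKETCH: SUM THE FIRST N INTEGERS,
C     THEN BRANCH WITH A THREE-WAY ARITHMETIC IF ON THE RESULT.
      INTEGER ISUM, I, N
      N = 10
      ISUM = 0
      DO 10 I = 1, N
      ISUM = ISUM + I
   10 CONTINUE
      IF (ISUM - 55) 20, 30, 20
   20 WRITE (6, 100) ISUM
      GO TO 40
   30 WRITE (6, 101) ISUM
   40 STOP
  100 FORMAT (14H UNEXPECTED:  , I5)
  101 FORMAT (14H SUM IS 55:   , I5)
      END
```

Note the fixed-form conventions: a C in column 1 marks a comment, statement labels occupy columns 1 through 5, and executable statements begin in column 7.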
FORTRAN 77
After the release of the FORTRAN 66 standard, compiler vendors introduced several extensions to Standard Fortran, prompting ANSI committee X3J3 in 1969 to begin work on revising the 1966 standard, under sponsorship of CBEMA, the Computer Business Equipment Manufacturers Association (formerly BEMA). Final drafts of this revised standard circulated in 1977, leading to formal approval of the new FORTRAN standard in April 1978. The new standard, called FORTRAN 77 and officially denoted X3.9-1978, added a number of significant features to address many of the shortcomings of FORTRAN 66:
Block IF and END IF statements, with optional ELSE IF and ELSE clauses, to provide improved language support for structured programming
DO loop extensions, including parameter expressions, negative increments, and zero trip counts
OPEN, CLOSE, and INQUIRE statements for improved I/O capability
Direct-access file I/O
IMPLICIT statement, to override implicit conventions that undeclared variables are INTEGER if their name begins with I, J, K, L, M, or N (and REAL otherwise)
CHARACTER data type, replacing Hollerith strings with vastly expanded facilities for character input and output and processing of character-based data
PARAMETER statement for specifying constants
SAVE statement for persistent local variables
Generic names for intrinsic functions (e.g. SQRT also accepts arguments of other types, such as COMPLEX or REAL*16).
A set of intrinsics (LGE, LGT, LLE, LLT) for lexical comparison of strings, based upon the ASCII collating sequence. (These ASCII functions were demanded by the U.S. Department of Defense, in their conditional approval vote.)
A maximum of seven dimensions in arrays, rather than three. Allowed subscript expressions were also generalized.
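Several of these FORTRAN 77 additions (the block IF/ELSE IF construct, the CHARACTER data type, and PARAMETER constants) can be combined in a short hypothetical sketch:

```fortran
C     HYPOTHETICAL FORTRAN 77 SKETCH: BLOCK IF/ELSE, CHARACTER DATA,
C     AND A PARAMETER CONSTANT REPLACE THE OLDER GO TO STYLE.
      PROGRAM GRADE
      INTEGER SCORE
      CHARACTER*4 LABEL
      REAL PI
      PARAMETER (PI = 3.14159)
      SCORE = 72
      IF (SCORE .GE. 90) THEN
         LABEL = 'HIGH'
      ELSE IF (SCORE .GE. 60) THEN
         LABEL = 'PASS'
      ELSE
         LABEL = 'FAIL'
      END IF
      WRITE (*, '(A, I4, 2A)') ' SCORE', SCORE, ' -> ', LABEL
      END
```

In FORTRAN 66 the same selection logic would have required logical IF statements, GO TOs, and statement labels; the structured block IF removes all three.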
In this revision of the standard, a number of features were removed or altered in a manner that might invalidate formerly standard-conforming programs. (Removal was the only allowable alternative to X3J3 at that time, since the concept of "deprecation" was not yet available for ANSI standards.) While most of the 24 items in the conflict list (see Appendix A2 of X3.9-1978) addressed loopholes or pathological cases permitted by the prior standard but rarely used, a small number of specific capabilities were deliberately removed, such as:
Hollerith constants and Hollerith data, such as GREET=12HHELLOTHERE!
Reading into an H edit (Hollerith field) descriptor in a FORMAT specification
Overindexing of array bounds by subscripts
DIMENSION A(10,5)
Y = A(11,1)
Transfer of control out of and back into the range of a DO loop (also known as "Extended Range")
Transition to ANSI Standard Fortran
The development of a revised standard to succeed FORTRAN 77 would be repeatedly delayed as the standardization process struggled to keep up with rapid changes in computing and programming practice. In the meantime, as the "Standard FORTRAN" for nearly fifteen years, FORTRAN 77 would become the historically most important dialect.
An important practical extension to FORTRAN 77 was the release of MIL-STD-1753 in 1978.[37] This specification, developed by the U.S. Department of Defense, standardized a number of features implemented by most FORTRAN 77 compilers but not included in the ANSI FORTRAN 77 standard. These features would eventually be incorporated into the Fortran 90 standard.
DO WHILE and END DO statements
INCLUDE statement
IMPLICIT NONE variant of the IMPLICIT statement
Bit manipulation intrinsic functions, based on similar functions included in Industrial Real-Time Fortran (ANSI/ISA S61.1 (1976))
The IEEE 1003.9 POSIX Standard, released in 1991, provided a simple means for FORTRAN 77 programmers to issue POSIX system calls.[38] Over 100 calls were defined in the document – allowing access to POSIX-compatible process control, signal handling, file system control, device control, procedure pointing, and stream I/O in a portable manner.
Fortran 90
The much-delayed successor to FORTRAN 77, informally known as Fortran 90 (and prior to that, Fortran 8X), was finally released as ISO/IEC standard 1539:1991 in 1991 and an ANSI Standard in 1992. In addition to changing the official spelling from FORTRAN to Fortran, this major revision added many new features to reflect the significant changes in programming practice that had evolved since the 1978 standard:
Free-form source input removed the need to skip the first six character positions before entering statements.
Lowercase Fortran keywords
Identifiers up to 31 characters in length (In the previous standard, it was only six characters).
Inline comments
Ability to operate on arrays (or array sections) as a whole, thus greatly simplifying math and engineering computations.
whole, partial and masked array assignment statements and array expressions, such as X(1:N)=R(1:N)*COS(A(1:N))
WHERE statement for selective array assignment
array-valued constants and expressions,
user-defined array-valued functions and array constructors.
RECURSIVE procedures
Modules, to group related procedures and data together, and make them available to other program units, including the capability to limit the accessibility to only specific parts of the module.
A vastly improved argument-passing mechanism, allowing interfaces to be checked at compile time
User-written interfaces for generic procedures
Operator overloading
Derived (structured) data types
New data type declaration syntax, to specify the data type and other attributes of variables
Dynamic memory allocation by means of the ALLOCATABLE attribute and the ALLOCATE and DEALLOCATE statements
POINTER attribute, pointer assignment, and NULLIFY statement to facilitate the creation and manipulation of dynamic data structures
Structured looping constructs, with an END DO statement for loop termination, and EXIT and CYCLE statements for terminating normal DO loop iterations in an orderly way
SELECT CASE, CASE, . . . CASE DEFAULT, END SELECT construct for multi-way selection
Portable specification of numerical precision under the user's control
New and enhanced intrinsic procedures.
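Two of the most consequential Fortran 90 additions, modules and derived types, can be sketched together (a minimal hypothetical example; the names are illustrative):

```fortran
! Fortran 90 sketch: a module groups a derived type with a related
! procedure, and the caller's USE association lets the compiler check
! the argument interface at compile time.
module geometry
  implicit none
  type point
    real :: x, y
  end type point
contains
  function dist(a, b) result(d)
    type(point), intent(in) :: a, b
    real :: d
    d = sqrt((a%x - b%x)**2 + (a%y - b%y)**2)
  end function dist
end module geometry

program use_geometry
  use geometry
  implicit none
  type(point) :: p = point(0.0, 0.0), q = point(3.0, 4.0)
  print *, 'distance = ', dist(p, q)   ! distance is 5.0
end program use_geometry
```

Free-form source, lowercase keywords, inline comments, and the END DO-style structured program units shown here are all Fortran 90 innovations over FORTRAN 77.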
Obsolescence and deletions
Unlike the prior revision, Fortran 90 removed no features.[39] Any standard-conforming FORTRAN 77 program was also standard-conforming under Fortran 90, and either standard should have been usable to define its behavior.
A small set of features were identified as "obsolescent" and were expected to be removed in a future standard. All of the functionalities of these early-version features can be performed by newer Fortran features. Some are kept to simplify porting of old programs but many were deleted in Fortran 95.
Obsolescent feature – current status:
Arithmetic IF statement – obsolescent in F90, deleted in F2018
Non-integer DO parameters or control variables – obsolescent in F90, deleted in F95
Shared DO-loop termination, or termination with a statement other than END DO or CONTINUE – obsolescent in F90, deleted in F2018
Branching to END IF from outside a block – obsolescent in F90, deleted in F95
PAUSE statement – obsolescent in F90, deleted in F95
ASSIGN statement and assigned GO TO statement – obsolescent in F90, deleted in F95
Assigned statement numbers and FORMAT specifiers – obsolescent in F90, deleted in F95
H edit descriptor – obsolescent in F90, deleted in F95
Vertical format control – deleted in F2003
Alternate return – obsolescent in F90
Computed GO TO statement – obsolescent in F90
Statement functions – obsolescent in F90
DATA statements among executable statements – obsolescent in F90
Assumed-length character functions – obsolescent in F90
Fixed-form source code – obsolescent in F90
CHARACTER* form of CHARACTER declaration – obsolescent in F90
ENTRY statements – obsolescent in F2008
Label form of DO statement – obsolescent in F2018
COMMON and EQUIVALENCE statements, and the BLOCK DATA program unit – obsolescent in F2018
Specific names for intrinsic functions – obsolescent in F2018
FORALL construct and statement – obsolescent in F2018
"Hello, World!" example
program helloworld
   print *, "Hello, World!"
end program helloworld
Fortran 95
"F95" redirects here. For the Düsseldorf-based football club, see Fortuna Düsseldorf.
Main article: Fortran 95 language features
Fortran 95, published officially as ISO/IEC 1539-1:1997, was a minor revision, mostly to resolve some outstanding issues from the Fortran 90 standard. Nevertheless, Fortran 95 also added a number of extensions, notably from the High Performance Fortran specification:
FORALL and nested WHERE constructs to aid vectorization
User-defined PURE and ELEMENTAL procedures
Default initialization of derived type components, including pointer initialization
Expanded the ability to use initialization expressions for data objects
Initialization of pointers to NULL()
Clearly defined that ALLOCATABLE arrays are automatically deallocated when they go out of scope.
A number of intrinsic functions were extended (for example a dim argument was added to the maxloc intrinsic).
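The vectorization-oriented extensions can be sketched in a short hypothetical program combining FORALL, masked WHERE assignment, and an ELEMENTAL procedure:

```fortran
! Fortran 95 sketch (illustrative data): FORALL and WHERE express
! element-wise work without explicit DO loops, and an ELEMENTAL
! procedure applies automatically over whole arrays.
program f95_features
  implicit none
  real :: a(5) = (/ -2.0, -1.0, 0.0, 1.0, 2.0 /)
  real :: b(5)
  integer :: i
  forall (i = 1:5) b(i) = a(i)**2     ! vectorizable assignment
  where (a < 0.0)
    a = 0.0                           ! masked array assignment
  end where
  print *, clamp(b, 2.0)              ! elemental call over an array
contains
  elemental real function clamp(x, hi)
    real, intent(in) :: x, hi
    clamp = min(x, hi)
  end function clamp
end program f95_features
```

Because CLAMP is ELEMENTAL, the same scalar definition serves for scalars and for arrays of any rank, with the compiler free to vectorize the application.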
Several features noted in Fortran 90 to be "obsolescent" were removed from Fortran 95:
DO statements using REAL and DOUBLE PRECISION index variables
Branching to an END IF statement from outside its block
PAUSE statement
ASSIGN and assigned GO TO statement, and assigned format specifiers
H Hollerith edit descriptor.
An important supplement to Fortran 95 was the ISO technical report TR-15581: Enhanced Data Type Facilities, informally known as the Allocatable TR. This specification defined enhanced use of ALLOCATABLE arrays, prior to the availability of fully Fortran 2003-compliant Fortran compilers. Such uses include ALLOCATABLE arrays as derived type components, in procedure dummy argument lists, and as function return values. (ALLOCATABLE arrays are preferable to POINTER-based arrays because ALLOCATABLE arrays are guaranteed by Fortran 95 to be deallocated automatically when they go out of scope, eliminating the possibility of memory leakage. In addition, elements of allocatable arrays are contiguous, and aliasing is not an issue for optimization of array references, allowing compilers to generate faster code than in the case of pointers.[40])
Another important supplement to Fortran 95 was the ISO technical report TR-15580: Floating-point exception handling, informally known as the IEEE TR. This specification defined support for IEEE floating-point arithmetic and floating-point exception handling.
Conditional compilation and varying length strings
In addition to the mandatory "Base language" (defined in ISO/IEC 1539-1 : 1997), the Fortran 95 language also included two optional modules:
Varying length character strings (ISO/IEC 1539-2 : 2000)
Conditional compilation (ISO/IEC 1539-3 : 1998)
which, together, compose the multi-part International Standard (ISO/IEC 1539).
According to the standards developers, "the optional parts describe self-contained features which have been requested by a substantial body of users and/or implementors, but which are not deemed to be of sufficient generality for them to be required in all standard-conforming Fortran compilers." Nevertheless, if a standard-conforming Fortran does provide such options, then they "must be provided in accordance with the description of those facilities in the appropriate Part of the Standard".
Modern Fortran
The language defined by the twenty-first century standards, in particular because of its incorporation of object-oriented programming support and subsequently Coarray Fortran, is often referred to as 'Modern Fortran', and the term is increasingly used in the literature.[41]
Fortran 2003
Fortran 2003, officially published as ISO/IEC 1539-1:2004, was a major revision introducing many new features.[42] A comprehensive summary of the new features of Fortran 2003 is available at the Fortran Working Group (ISO/IEC JTC1/SC22/WG5) official Web site.[43]
From that article, the major enhancements for this revision include:
Derived type enhancements: parameterized derived types, improved control of accessibility, improved structure constructors, and finalizers
Object-oriented programming support: type extension and inheritance, polymorphism, dynamic type allocation, and type-bound procedures, providing complete support for abstract data types
Data manipulation enhancements: allocatable components (incorporating TR 15581), deferred type parameters, VOLATILE attribute, explicit type specification in array constructors and allocate statements, pointer enhancements, extended initialization expressions, and enhanced intrinsic procedures
Input/output enhancements: asynchronous transfer, stream access, user specified transfer operations for derived types, user specified control of rounding during format conversions, named constants for preconnected units, the FLUSH statement, regularization of keywords, and access to error messages
Procedure pointers
Support for IEEE floating-point arithmetic and floating-point exception handling (incorporating TR 15580)
Interoperability with the C programming language
Support for international usage: access to ISO 10646 4-byte characters and choice of decimal or comma in numeric formatted input/output
Enhanced integration with the host operating system: access to command-line arguments, environment variables, and processor error messages
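The object-oriented additions can be sketched with a minimal hypothetical type hierarchy: type extension gives single inheritance, type-bound procedures give dynamic dispatch, and CLASS declarations give polymorphic variables.

```fortran
! Fortran 2003 sketch (illustrative types): type extension plus
! type-bound procedures provide runtime polymorphism.
module shapes
  implicit none
  type :: shape
  contains
    procedure :: area => shape_area
  end type shape
  type, extends(shape) :: circle
    real :: r = 1.0
  contains
    procedure :: area => circle_area   ! overrides the parent binding
  end type circle
contains
  real function shape_area(self)
    class(shape), intent(in) :: self
    shape_area = 0.0
  end function shape_area
  real function circle_area(self)
    class(circle), intent(in) :: self
    circle_area = 3.14159 * self%r**2
  end function circle_area
end module shapes

program demo
  use shapes
  implicit none
  class(shape), allocatable :: s
  allocate(circle :: s)     ! typed allocation, new in Fortran 2003
  print *, s%area()         ! dispatches to circle_area at runtime
end program demo
```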
An important supplement to Fortran 2003 was the ISO technical report TR-19767: Enhanced module facilities in Fortran. This report provided sub-modules, which make Fortran modules more similar to Modula-2 modules. They are similar to Ada private child sub-units. This allows the specification and implementation of a module to be expressed in separate program units, which improves packaging of large libraries, allows preservation of trade secrets while publishing definitive interfaces, and prevents compilation cascades.
Fortran 2008
ISO/IEC 1539-1:2010, informally known as Fortran 2008, was approved in September 2010.[44][45] As with Fortran 95, this is a minor upgrade, incorporating clarifications and corrections to Fortran 2003, as well as introducing some new capabilities. The new capabilities include:
Sub-modules – additional structuring facilities for modules; supersedes ISO/IEC TR 19767:2005
Coarray Fortran – a parallel execution model
The DO CONCURRENT construct – for loop iterations with no interdependencies
The CONTIGUOUS attribute – to specify storage layout restrictions
The BLOCK construct – can contain declarations of objects with construct scope
Recursive allocatable components – as an alternative to recursive pointers in derived types
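The two parallel-oriented additions can be sketched together in a small hypothetical program: DO CONCURRENT asserts that iterations have no interdependencies, while the [*] codimension of Coarray Fortran declares one copy of a variable per parallel image.

```fortran
! Fortran 2008 sketch: independent loop iterations via DO CONCURRENT,
! plus a scalar coarray holding one partial result per image.
program f2008_features
  implicit none
  real :: x(1000)
  integer :: i
  real :: partial[*]          ! one copy of PARTIAL on every image
  do concurrent (i = 1:1000)  ! iterations may execute in any order
    x(i) = sqrt(real(i))
  end do
  partial = sum(x)
  sync all                    ! barrier across all images
  if (this_image() == 1) print *, 'image 1 partial sum: ', partial
end program f2008_features
```

Compiling coarray code typically requires enabling the compiler's coarray support (a single-image mode is usually available for testing).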
The Final Draft international Standard (FDIS) is available as document N1830.[46]
A supplement to Fortran 2008 is the International Organization for Standardization (ISO) Technical Specification (TS) 29113 on Further Interoperability of Fortran with C,[47][48] which has been submitted to ISO in May 2012 for approval. The specification adds support for accessing the array descriptor from C and allows ignoring the type and rank of arguments.
Fortran 2018
The Fortran 2018 revision of the language was earlier referred to as Fortran 2015.[49] It was a significant revision and was released on November 28, 2018.[50]
Fortran 2018 incorporates two previously published Technical Specifications:
ISO/IEC TS 29113:2012 Further Interoperability with C[51]
ISO/IEC TS 18508:2015 Additional Parallel Features in Fortran[52]
Additional changes and new features include support for ISO/IEC/IEEE 60559:2011 (the version of the IEEE floating-point standard before the latest minor revision IEEE 754–2019), hexadecimal input/output, IMPLICIT NONE enhancements and other changes.[53][54][55][56]
Fortran 2018 deleted the arithmetic IF statement. It also deleted non-block DO constructs - loops which do not end with an END DO or CONTINUE statement. These had been an obsolescent part of the language since Fortran 90.
New obsolescences are: COMMON and EQUIVALENCE statements and the BLOCK DATA program unit, labelled DO loops, specific names for intrinsic functions, and the FORALL statement and construct.
Fortran 2023
Fortran 2023 (ISO/IEC 1539-1:2023) was published in November 2023, and can be purchased from the ISO.[57] Fortran 2023 is a minor extension of Fortran 2018 that focuses on correcting errors and omissions in Fortran 2018. It also adds some small features, including an enumerated type capability.
Language features
Main article: Fortran 95 language features
A full description of the Fortran language features brought by Fortran 95 is covered in the related article, Fortran 95 language features. The language versions defined by later standards are often referred to collectively as 'Modern Fortran' and are described in the literature.
Science and engineering
Although a 1968 journal article by the authors of BASIC already described FORTRAN as "old-fashioned",[58] programs have been written in Fortran for many decades and there is a vast body of Fortran software in daily use throughout the scientific and engineering communities.[59] Jay Pasachoff wrote in 1984 that "physics and astronomy students simply have to learn FORTRAN. So much exists in FORTRAN that it seems unlikely that scientists will change to Pascal, Modula-2, or whatever."[60] In 1993, Cecil E. Leith called FORTRAN the "mother tongue of scientific computing", adding that its replacement by any other possible language "may remain a forlorn hope".[61]
It is the primary language for some of the most intensive super-computing tasks, such as in astronomy, climate modeling, computational chemistry, computational economics, computational fluid dynamics, computational physics, data analysis,[62] hydrological modeling, numerical linear algebra and numerical libraries (LAPACK, IMSL and NAG), optimization, satellite simulation, structural engineering, and weather prediction.[63] Many of the floating-point benchmarks to gauge the performance of new computer processors, such as the floating-point components of the SPEC benchmarks (e.g., CFP2006, CFP2017) are written in Fortran. Math algorithms are well documented in Numerical Recipes.
Apart from this, more modern codes in computational science generally use large program libraries, such as METIS for graph partitioning, PETSc or Trilinos for linear algebra capabilities, deal.II or FEniCS for mesh and finite element support, and other generic libraries. Since the early 2000s, many of the widely used support libraries have also been implemented in C and more recently, in C++. On the other hand, high-level languages such as the Wolfram Language, MATLAB, Python, and R have become popular in particular areas of computational science. Consequently, a growing fraction of scientific programs are also written in such higher-level scripting languages. For this reason, facilities for inter-operation with C were added to Fortran 2003 and enhanced by the ISO/IEC technical specification 29113, which was incorporated into Fortran 2018 to allow more flexible interoperation with other programming languages.
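The C interoperability facilities mentioned above center on the intrinsic module ISO_C_BINDING, which maps Fortran kinds onto C types, and the BIND(C) attribute, which fixes the linkage name. As a minimal sketch, a Fortran interface to the C library function hypot might look like:

```fortran
! Sketch of Fortran 2003 C interoperability: an explicit interface
! binds the C math-library function hypot for use from Fortran.
module c_interop
  use iso_c_binding, only: c_double
  implicit none
  interface
    function c_hypot(x, y) result(h) bind(c, name='hypot')
      import :: c_double
      real(c_double), value :: x, y   ! VALUE matches C pass-by-value
      real(c_double) :: h
    end function c_hypot
  end interface
end module c_interop
```

The VALUE attribute and the kind constants from ISO_C_BINDING ensure the call matches the C prototype double hypot(double, double) exactly.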
Portability
Portability was a problem in the early days because there was no agreed upon standard—not even IBM's reference manual—and computer companies vied to differentiate their offerings from others by providing incompatible features. Standards have improved portability. The 1966 standard provided a reference syntax and semantics, but vendors continued to provide incompatible extensions. Although careful programmers were coming to realize that use of incompatible extensions caused expensive portability problems, and were therefore using programs such as The PFORT Verifier,[64][65] it was not until after the 1977 standard, when the National Bureau of Standards (now NIST) published FIPS PUB 69, that processors purchased by the U.S. Government were required to diagnose extensions of the standard. Rather than offer two processors, essentially every compiler eventually had at least an option to diagnose extensions.[66][67]
Incompatible extensions were not the only portability problem. For numerical calculations, it is important to take account of the characteristics of the arithmetic. This was addressed by Fox et al. in the context of the 1966 standard by the PORT library.[65] The ideas therein became widely used, and were eventually incorporated into the 1990 standard by way of intrinsic inquiry functions. The widespread (now almost universal) adoption of the IEEE 754 standard for binary floating-point arithmetic has essentially removed this problem.
Access to the computing environment (e.g., the program's command line, environment variables, textual explanation of error conditions) remained a problem until it was addressed by the 2003 standard.
Large collections of library software that could be described as being loosely related to engineering and scientific calculations, such as graphics libraries, have been written in C, and therefore access to them presented a portability problem. This has been addressed by incorporation of C interoperability into the 2003 standard.
It is now possible (and relatively easy) to write an entirely portable program in Fortran, even without recourse to a preprocessor.
Obsolete variants
Until the Fortran 66 standard was developed, each compiler supported its own variant of Fortran. Some were more divergent from the mainstream than others.
The first Fortran compiler set a high standard of efficiency for compiled code. This goal made it difficult to create a compiler so it was usually done by the computer manufacturers to support hardware sales. This left an important niche: compilers that were fast and provided good diagnostics for the programmer (often a student). Examples include Watfor, Watfiv, PUFFT, and on a smaller scale, FORGO, Wits Fortran, and Kingston Fortran 2.
Fortran 5 was marketed by Data General Corp from the early 1970s to the early 1980s, for the Nova, Eclipse, and MV line of computers. It had an optimizing compiler that was quite good for minicomputers of its time. The language most closely resembles FORTRAN 66.
FORTRAN V was distributed by Control Data Corporation in 1968 for the CDC 6600 series. The language was based upon FORTRAN IV.[68]
Univac also offered a compiler for the 1100 series known as FORTRAN V. A spinoff of Univac Fortran V was Athena FORTRAN.
Specific variants produced by the vendors of high-performance scientific computers (e.g., Burroughs, Control Data Corporation (CDC), Cray, Honeywell, IBM, Texas Instruments, and UNIVAC) added extensions to Fortran to take advantage of special hardware features such as instruction cache, CPU pipelines, and vector arrays. For example, one of IBM's FORTRAN compilers (H Extended IUP) had a level of optimization which reordered the machine code instructions to keep multiple internal arithmetic units busy simultaneously. Another example is CFD, a special variant of FORTRAN designed specifically for the ILLIAC IV supercomputer, running at NASA's Ames Research Center. IBM Research Labs also developed an extended FORTRAN-based language called VECTRAN for processing vectors and matrices.
Object-Oriented Fortran (obsolete) was an object-oriented extension of Fortran, in which data items can be grouped into objects, which can be instantiated and executed in parallel. It was available for Sun, Iris, iPSC, and nCUBE, but is no longer supported.
Such machine-specific extensions have either disappeared over time or have had elements incorporated into the main standards. The major remaining extension is OpenMP, which is a cross-platform extension for shared memory programming. One new extension, Coarray Fortran, is intended to support parallel programming.
FOR TRANSIT was the name of a reduced version of the IBM 704 FORTRAN language, which was implemented for the IBM 650, using a translator program developed at Carnegie in the late 1950s.[69] The following comment appears in the IBM Reference Manual (FOR TRANSIT Automatic Coding System C28-4038, Copyright 1957, 1959 by IBM):
The FORTRAN system was designed for a more complex machine than the 650, and consequently some of the 32 statements found in the FORTRAN Programmer's Reference Manual are not acceptable to the FOR TRANSIT system. In addition, certain restrictions to the FORTRAN language have been added. However, none of these restrictions make a source program written for FOR TRANSIT incompatible with the FORTRAN system for the 704.
The permissible statements were:
Arithmetic assignment statements, e.g., a = b
GO to n
GO TO (n1, n2, ..., nm), i
IF (a) n1, n2, n3
PAUSE
STOP
DO n i = m1, m2
CONTINUE
END
READ n, list
PUNCH n, list
DIMENSION V, V, V, ...
EQUIVALENCE (a,b,c), (d,c), ...
Up to ten subroutines could be used in one program.
FOR TRANSIT statements were limited to columns 7 through 56, only. Punched cards were used for input and output on the IBM 650. Three passes were required to translate source code to the "IT" language, then to compile the IT statements into SOAP assembly language, and finally to produce the object program, which could then be loaded into the machine to run the program (using punched cards for data input, and outputting results onto punched cards).
Two versions existed for the 650s with a 2000 word memory drum: FOR TRANSIT I (S) and FOR TRANSIT II, the latter for machines equipped with indexing registers and automatic floating-point decimal (bi-quinary) arithmetic. Appendix A of the manual included wiring diagrams for the IBM 533 card reader/punch control panel.
Fortran-based languages
Prior to FORTRAN 77, a number of preprocessors were commonly used to provide a friendlier language, with the advantage that the preprocessed code could be compiled on any machine with a standard FORTRAN compiler. These preprocessors would typically support structured programming, variable names longer than six characters, additional data types, conditional compilation, and even macro capabilities. Popular preprocessors included EFL, FLECS, iftran, MORTRAN, SFtran, S-Fortran, Ratfor, and Ratfiv. EFL, Ratfor and Ratfiv, for example, implemented C-like languages, outputting preprocessed code in standard FORTRAN 66. The PFORT preprocessor was often used to verify that code conformed to a portable subset of the language. Despite advances in the Fortran language, preprocessors continue to be used for conditional compilation and macro substitution.
WATFOR, one of the earliest FORTRAN variants, was introduced in the 1960s and widely used in colleges and universities. Developed, supported, and distributed by the University of Waterloo, it was based largely on FORTRAN IV. A student using WATFOR could submit their batch FORTRAN job and, if there were no syntax errors, the program would move straight to execution. This simplification let students concentrate on their program's syntax, semantics, and execution logic flow, rather than dealing with Job Control Language (JCL) for submission, the successive compile/link-edit/execute steps, or other complexities of the mainframe/minicomputer environment. A downside of this simplified environment was that WATFOR was not a good choice for programmers needing the expanded abilities of their host processor(s); for example, WATFOR typically had very limited access to I/O devices. WATFOR was succeeded by WATFIV and its later versions.
LRLTRAN was developed at the Lawrence Radiation Laboratory to provide support for vector arithmetic and dynamic storage, among other extensions to support systems programming. The distribution included the Livermore Time Sharing System (LTSS) operating system.
The Fortran-95 Standard includes an optional Part 3 which defines an optional conditional compilation capability. This capability is often referred to as "CoCo".
Many Fortran compilers have integrated subsets of the C preprocessor into their systems.
SIMSCRIPT is an application specific Fortran preprocessor for modeling and simulating large discrete systems.
The F programming language was designed to be a clean subset of Fortran 95 that attempted to remove the redundant, unstructured, and deprecated features of Fortran, such as the EQUIVALENCE statement. F retains the array features added in Fortran 90, and removes control statements that were made obsolete by structured programming constructs added to both FORTRAN 77 and Fortran 90. F is described by its creators as "a compiled, structured, array programming language especially well suited to education and scientific computing".[70] Essential Lahey Fortran 90 (ELF90) was a similar subset.
Lahey and Fujitsu teamed up to create Fortran for the Microsoft .NET Framework.[71] Silverfrost FTN95 is also capable of creating .NET code.[72]
Code examples
Further information: Wikibooks:Fortran/Fortran examples
The following program illustrates dynamic memory allocation and array-based operations, two features introduced with Fortran 90. Particularly noteworthy is the absence of DO loops and IF/THEN statements in manipulating the array; mathematical operations are applied to the array as a whole. Also apparent is the use of descriptive variable names and general code formatting that conform with contemporary programming style. This example computes an average over data entered interactively.
program average

  ! Read in some numbers and take the average
  ! As written, if there are no data points, an average of zero is returned
  ! While this may not be desired behavior, it keeps this example simple
  implicit none

  real, allocatable :: points(:)
  integer           :: number_of_points
  real              :: average_points, positive_average, negative_average

  average_points = 0.
  positive_average = 0.
  negative_average = 0.

  write (*,*) "Input number of points to average:"
  read (*,*) number_of_points

  allocate (points(number_of_points))

  write (*,*) "Enter the points to average:"
  read (*,*) points

  ! Take the average by summing points and dividing by number_of_points
  if (number_of_points > 0) average_points = sum(points) / number_of_points

  ! Now form average over positive and negative points only
  if (count(points > 0.) > 0) positive_average = sum(points, points > 0.) / count(points > 0.)
  if (count(points < 0.) > 0) negative_average = sum(points, points < 0.) / count(points < 0.)

  ! Print result to terminal stdout unit 6
  write (*,'(a,g12.4)') 'Average = ', average_points
  write (*,'(a,g12.4)') 'Average of positive points = ', positive_average
  write (*,'(a,g12.4)') 'Average of negative points = ', negative_average

  deallocate (points)   ! free memory

end program average
Humor
During the same FORTRAN standards committee meeting at which the name "FORTRAN 77" was chosen, a satirical technical proposal was incorporated into the official distribution bearing the title "Letter O Considered Harmful". This proposal purported to address the confusion that sometimes arises between the letter "O" and the numeral zero, by eliminating the letter from allowable variable names. However, the method proposed was to eliminate the letter from the character set entirely (thereby retaining 48 as the number of lexical characters, which the colon had increased to 49). This was considered beneficial in that it would promote structured programming, by making it impossible to use the notorious GO TO statement as before. (Troublesome FORMAT statements would also be eliminated.) It was noted that this "might invalidate some existing programs" but that most of these "probably were non-conforming, anyway".[73][74]
When X3J3 debated whether the minimum trip count for a DO loop should be zero or one in Fortran 77, Loren Meissner suggested a minimum trip count of two—reasoning (tongue-in-cheek) that if it were less than two, then there would be no reason for a loop.
When assumed-length arrays were being added, there was a dispute as to the appropriate character to separate upper and lower bounds. In a comment examining these arguments, Walt Brainerd penned an article entitled "Astronomy vs. Gastroenterology" because some proponents had suggested using the star or asterisk ("*"), while others favored the colon (":").
Variable names beginning with the letters I–N have a default type of integer, while variables starting with any other letter default to real, although programmers could override the defaults with an explicit declaration.[75] This led to the joke: "In FORTRAN, GOD is REAL (unless declared INTEGER)."
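As an illustration only, the implicit-typing rule can be sketched in a few lines of Python (the helper name is mine, not part of any FORTRAN standard):

```python
def implicit_fortran_type(name: str) -> str:
    """Default type of an undeclared FORTRAN variable, from its first letter.

    Hypothetical helper for illustration: names beginning with I..N default
    to INTEGER, all others to REAL (an explicit declaration overrides this).
    """
    first = name[0].upper()
    return "INTEGER" if "I" <= first <= "N" else "REAL"

# The joke, demonstrated:
print(implicit_fortran_type("GOD"))     # REAL
print(implicit_fortran_type("NUMBER"))  # INTEGER
```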
See also
f2c – program to convert Fortran 77 to C code
F2PY – Python library for numerical programming
FORMAC – computer algebra system based on FORTRAN
List of Fortran compilers
List of Fortran numerical libraries
List of programming languages
Matrix representation – storage method in computer memory
Spaghetti code – software source code with poor structure
CS:4980:4, Notes 13, Fall 2015 (Douglas W. Jones)
https://homepage.divms.uiowa.edu/~jones/retro/notes/13.shtml
By the late 1950s, core memory sizes greater than 4K were natural, and high-level languages were starting to emerge. FORTRAN was introduced in 1957, preliminary discussions of Algol were being widely circulated in 1958, leading to a solid specification for Algol 60, released in 1960. FORTRAN II, released in 1958, had parameterized subroutines and functions, local variables, and global variables of a sort (named COMMON blocks). Algol 60 had hierarchical scope rules with nesting of blocks.
This required that programmers start to think about how to use memory in ways they had not in the era of very small memories. FORTRAN's memory model was based entirely on static storage allocation, but members of the Algol committee began to think in terms of dynamic allocation. How storage was to be allocated was not the subject of discussion, but the idea of using a stack was implicit in their thinking.
The paper assigned for this discussion (free if referenced from on campus, otherwise, extraordinarily expensive) is the first clear public discussion I know of to explain how to implement and use Algol-like languages with their implied use of a stack. For most programmers of the era, this paper was the first introduction to stacks, and as we will see, it had a huge influence on computer architecture, just as Algol 60 had a huge influence on programming languages.
An aside: Algol 60 begat CPL (Christopher Strachey's Combined or Cambridge Programming Language) which begat BCPL (Basic CPL) which begat B (at Bell Labs) which begat C and C++ (at Bell Labs) which begat Java (at Sun Microsystems). Algol 60 also begat Algol 68, Pascal and Simula 67. Simula 67 was upward compatible with Algol 60 and had objects (C++ added objects to C). Pascal (developed by Wirth) begat Ada (developed by Honeywell Bull in France).
Today, we take for granted the idea of local variables allocated on a stack, allowing for the possibility of recursion and subroutine calls. This was not always so. Programmers in 1960 assumed static memory allocation for just about everything, and Dijkstra's paper, the reading for today, demonstrates how hard it was to communicate these new ideas.
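Why the stack matters can be illustrated with a small sketch (mine, not Dijkstra's): with per-call locals in stack frames recursion works, while a FORTRAN-style single static slot per "local" makes the recursive computation destroy its own state.

```python
# Per-call locals live in a fresh stack frame, so recursion works.
def fact(n: int) -> int:
    return 1 if n <= 1 else n * fact(n - 1)

# Simulated static allocation: one shared slot for the "local" n,
# as in a statically allocated FORTRAN subroutine.
static_store = {"n": 0}

def fact_static(n: int) -> int:
    static_store["n"] = n                 # clobbers every caller's n
    if static_store["n"] <= 1:
        return 1
    r = fact_static(static_store["n"] - 1)
    return static_store["n"] * r          # n has been overwritten to 1

print(fact(5))         # 120
print(fact_static(5))  # 1 -- the recursion trampled its own variable
```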
The evolution of if (i): pre-Fortran to Algol 60 (The Craft of Coding, 2017-04-29)
https://craftofcoding.wordpress.com/2017/04/29/the-evolution-of-if-i-from-fortran-i-to-algol-60/
Arguably one of the most important control structures to evolve is “if“. Without it, programs couldn’t make any sort of decisions.
Few algorithmic languages, apart from Plankalkül (1948), contained conditional statements. Plankalkül formed conditional statements with the help of a symbol which was an arrow with a period above it, which was used in the following manner:
The left side of the statement, B, signifies the condition (Bedingung) and is an expression with a boolean value, and the right side, a, is an arbitrary statement. If B evaluates to 0 (nein), then the statement ends here; otherwise, if B is 1 (ja), then the statement continues with a. There is no “else” statement. Heinz Rutishauser’s Superplan (1949-1951) did not have a decision statement.
Decision statements in programming languages are intrinsically linked to branch instructions in assembler. The first language to use something akin to the modern form of the if statement was likely Fortran I which used an if statement as a form of three-way goto statement.
IF (E) L1, L2, L3
The expression, E is evaluated and one of the alternative paths of L1, L2, and L3 is chosen based on whether E is negative, zero or positive. This became known as the arithmetic if. This could be used to derive a three-way decision statement of the form:
IF (X-Y) 10, 10, 30 10 MAXNUM = Y GO TO 20 30 MAXNUM = X 20 ...
This says that if X-Y is less than or equal to zero, then the maximum is Y, otherwise the maximum is X. This made sense in the context of unstructured jumps using go to. This allowed for a very limited decision structure, where the expression always had to be expressed in terms of some numeric output.
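To make that control flow concrete, here is a small Python model (the names `arith_if` and `maxnum` are mine, for illustration) of the three-way branch and the MAXNUM example above:

```python
def arith_if(e, neg_label, zero_label, pos_label):
    """Model of the FORTRAN arithmetic IF: pick a label by the sign of e."""
    if e < 0:
        return neg_label
    if e == 0:
        return zero_label
    return pos_label

def maxnum(x, y):
    # IF (X-Y) 10, 10, 30 : labels 10 and 30 select Y or X respectively
    if arith_if(x - y, "L10", "L10", "L30") == "L10":
        return y
    return x

print(maxnum(3, 7))  # 7
print(maxnum(9, 2))  # 9
print(maxnum(4, 4))  # 4  (X-Y is zero, so label 10: the maximum is Y)
```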
In 1957-58 John McCarthy, developer of Lisp, was writing a series of routines for legal chess moves in Fortran which prompted him to invent conditional expressions. He found the arithmetic if construct from Fortran I and II “awkward to use” [McCarthy81], and found it more natural to invent a Fortran function XIF(M,N1,N2) whose value was N1 or N2 based on whether M was zero or not (it was written in machine language). The function was likely not that efficient, as it required all three arguments to be evaluated before XIF() was entered. In Lisp, the conditional took the form of the cond function:
(cond (condition1 result1) (condition2 result2) ... (T resultN))
Later a more “traditional” like conditional operator was included into the specifications for Lisp, and appeared as follows:
X = IF (N .EQ. 0, ICAR(Y), ICDR(Y))
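The inefficiency McCarthy noted is that a function's arguments are evaluated before the call, so both branches of XIF are always computed. A Python sketch (helper names are mine) makes this visible:

```python
def xif(m, n1, n2):
    """Sketch of McCarthy's XIF: n1 if m is zero, else n2.

    As with the original FORTRAN function, both n1 and n2 have already
    been evaluated by the time xif is entered.
    """
    return n1 if m == 0 else n2

evaluated = []

def trace(tag, value):
    # Record which argument expressions actually get evaluated.
    evaluated.append(tag)
    return value

result = xif(0, trace("then-branch", 1), trace("else-branch", 2))
print(result)     # 1
print(evaluated)  # ['then-branch', 'else-branch'] -- both were computed
```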
McCarthy suggested the use of this concept in Algol 58 when he was a member of the Algol committee. In the Algol 58 preliminary report the if statement took the form:
if (a>0); c:=a↑2↓×b↑2↓ if (a<0); c:=a↑2↓+b↑2↓ if (a=0); go to bed
Algol 58 did not really progress much, and was superseded by Algol 60. Algol 60 added the keyword then to separate the logical expression from the statement to be executed; many considered this if-then combination to make the statement more readable. The Algol statement was also extended to include an “else” part. Here is an example of an if-then-else in Algol 60.
if x > 0 then pos := pos + 1 else negzero := negzero + 1
This led to the ambiguity we know today as the “dangling-else”. Whereas a statement such as:
if x=0 then if y=0 then m:=m+1
is not ambiguous, the following statement could be:
if x=0 then if y=0 then m:=m+1 else n:=n-1
Is 1 to be subtracted from n when x is non-zero, whatever the value of y, OR when x is zero but y is not? A conundrum.
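Python's indentation forces a choice, which makes the two possible readings of the ambiguous Algol statement easy to compare side by side (function names are mine):

```python
def inner_binding(x, y, m, n):
    # Reading 1: the else belongs to the inner if (the usual resolution).
    if x == 0:
        if y == 0:
            m += 1
        else:
            n -= 1
    return m, n

def outer_binding(x, y, m, n):
    # Reading 2: the else belongs to the outer if.
    if x == 0:
        if y == 0:
            m += 1
    else:
        n -= 1
    return m, n

# x is zero but y is not: the readings disagree about n.
print(inner_binding(0, 5, 0, 0))  # (0, -1)
print(outer_binding(0, 5, 0, 0))  # (0, 0)

# x is non-zero: again they disagree.
print(inner_binding(3, 0, 0, 0))  # (0, 0)
print(outer_binding(3, 0, 0, 0))  # (0, -1)
```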
To further add to the structural space, these if statements were constrained to the control of a single statement, which limited their usefulness. Algol 60 dealt with this through the use of the compound statement it had introduced using the keywords begin and end. For example, a piece of code to swap two numbers if x < y:
if x<y then begin dummy:=x; x:=y; y:=dummy end
Or, written in a more readable manner (many early languages crammed as much as they could on one line – blame punch-cards):
if x<y then
begin
  dummy:=x;
  x:=y;
  y:=dummy
end
This structure could also be used to reduce the dangling-else problem:
if x=0 then begin if y=0 then m:=m+1 else n:=n-1 end
REF(S):
[McCarthy81] McCarthy, J., “LISP Session”, History of Programming Languages, pp.173-197, ACM (1981)
Computer Resurrection Issue 50
http://www.cs.man.ac.uk/CCS/res/res50.htm
Computer
RESURRECTION
The Bulletin of the Computer Conservation Society
ISSN 0958-7403
Number 50
Spring 2010
Contents
ALGOL 60 - A Binary Star : Dik Leatherdale
Society Activity
News Round-Up
Pioneer Profiles - Michael Woodger : David Yates
Reminiscences of Whetstone ALGOL : Brian Randell
On the Occasion of the 50th Anniversary of the ALGOL 60 Report : Tony Hoare
An ALGOL 60 Spiritual User : Luca Cardelli
The Atlas ALGOL System : Bob Hopgood
ALGOL 60 - Questions of Naming : Dik Leatherdale
Forthcoming Events
Committee of the Society
Aims and Objectives
ALGOL 60 - a Binary Star
Dik Leatherdale
ALGOL 60 was, perhaps, the most influential development in programming languages of all time. Other languages have been more successful in themselves. Certainly there are many whose use has been more widespread. But none have had the influence on thinking, none have demonstrated the great leap forward, none have received so much attention as ALGOL 60. The vast majority of programming languages in the last half century have owed a huge debt to ALGOL 60. The ALGOL 60 Report stands as a colossus of its kind. It was, and still is, a thing of beauty - a work of art.
So, in tribute to those clear-thinking pioneers of 50 years ago, we dedicate this 50th issue of Resurrection to ALGOL 60.
Not, of course, that ALGOL 60 was without fault. Nor, indeed, did it escape criticism. The lack of any defined input/output facilities was the point most frequently made. That led to the creation of many incompatible I/O systems which, in practice, severely restricted program portability. But for me, and with the benefit of 50 years of hindsight, ALGOL’s major shortcoming was that it was a paper tape-orientated language in what was soon to become a punched card world. And reserved word-based dialects such as Burroughs’ ALGOL were arguably not “strict” ALGOL.
But the fierceness of the arguments, the multiplicity of “improved” ALGOLs - ALGOL-W, Pascal, CPL etc. - only serve to demonstrate the extent to which the basic concept was accepted as essentially correct.
Today ALGOL is but rarely used. But it is far from forgotten. It lives on in its progeny - C, C++, Delphi, Ada, Java, C# and so many others.
I can surely do no better than to leave the last word to Tony Hoare.
Here is a language so far ahead of its time that it was not only an improvement on its predecessors but also on nearly all its successors.
Society Activity
Changes to the Committee - David Hartley
In December 2009 the CCS Committee reviewed its own membership in the context of the new constitution agreed in October. The membership is largely unchanged, and will be subject to the scrutiny of the next AGM which will be held in September or October later this year. We did, however, lose two long-serving members, namely David Anderson of Portsmouth University, who has been a member since 2003, being variously London meetings coordinator and web site editor, and David Glover, who has represented The National Archives since 2005. We thank both of them for their past services.
North West Group
After 17 years Ben Gunn has decided to retire as Secretary. Chairman David Hartley paid tribute to Ben’s long service to the Society - Ben’s record is exceptional. The Society is hugely grateful for his dedication and we all wish him well. Meanwhile Member Gordon Adshead has agreed to take over the running of the Group. We all welcome him and look forward to an equally long and useful association.
Exhibition to mark the Turing Centenary - Simon Lavington
The CCS is working with the Science Museum on an idea for a temporary exhibition in 2012 to mark the centenary of the birth of Alan Turing. The project is in its early stages but the plan is for a small public outreach exhibition, lasting from June 2012 to June 2013, in the Maths and Computing Gallery at the Science Museum. The exhibition has received initial approval from the Science Museum Executive Group with an indicative budget of between £50,000 and £60,000 and the possibility of additional external sponsorship. All this will require further approval and sign-off from the Executive Group, Director and Trustees as the project develops.
The exhibition could be themed round four chronological phases of Alan Turing’s involvement with the theory and practice of digital computation. Briefly, these could be:
Theory at Cambridge and Princeton: the Universal Turing Machine.
Bletchley Park and cryptanalysis: a vital interlude.
NPL and the Pilot ACE: hardware and architecture.
The Manchester years: software and applications.
At its meeting on 18th February the CCS Committee set up a small working party under Simon Lavington to progress the project. Technical advice is being sought from a number of researchers whose work has rendered them expert in various aspects of Alan Turing’s work towards the development of digital computers and their application. There is clearly a need to identify relevant artefacts. A meeting has been held with the Bletchley Park Trust and areas of collaboration have been identified.
The timescale for the project is likely to be as follows.
Phase 1 (to be completed by end-November 2010): the CCS and others will finalise the intellectual content of the exhibition, which may include an indication of main themes, list of desired artefacts, simulators, models, etc.
Phase 2 (to start December 2010 and be completed before end-June 2011): the Museum’s team will organise some form of audience evaluation, to mould the intellectual content into a presentation suitable for public outreach.
Phase 3 (to be completed in time for the opening in June 2012): the Museum’s exhibition team undertakes the implementation of the whole project.
Anyone who has ideas to contribute is invited to contact Simon Lavington at .
Elliott 803 Project - Peter Onion
Not much to report this time. The 803 itself is continuing to work well. The only problem has been that over Christmas the tape reader seemed to become less reliable and eventually became unusable. After much cleaning and adjustment it was still unreliable, so the decision was made to try and set up one of the several spare readers. The one chosen was in good condition mechanically and initial tests with a tape loop showed all five data channels were producing correct output signals. However the sprocket-hole channel was producing a very low amplitude output pulse. The fault was quickly diagnosed and the cause identified as the output transistor’s load resistor which had become disconnected from the PCB track. The “new” reader has been performing well for the last few weeks.
A Calcomp 565 plotter has been retrieved from the TNMOC store and work will begin soon to restore this to operation. The 565 was supplied by Elliotts as an option for the 803. Sadly our 803 doesn’t have the optional instruction decoding circuits installed in the processor. Nor do we have the original interface unit to connect the plotter to the processor. However, a successful test has been conducted with an alternative scheme that requires minimal connections to be made to the processor. This will allow the plotter to respond correctly when the 803 executes the appropriate output instructions.
Elliott 401 Project - Chris Burton
The moratorium on accessing the machine continues. Remediation of asbestos- containing materials will take place by Museum conservation staff after similar work to the Pegasus. An alternative to the “firebar” resistors in the power supply will be sought. Procedures, roles and methods of working will have to be worked out with the Museum management before we will be allowed to operate the machine again.
Pegasus Project - Len Hewitt
After the discovery of asbestos in the Pegasus fuses and an alternator connection box very little happened as far as the Pegasus Project is concerned. In December the official asbestos reports were released and arrangements were made by the Science Museum for meetings to discuss the return of Pegasus to active service.
There were a number of items concerning Health and Safety, asbestos, and Project practices to be discussed. The good news from the CCS point of view is that it is being discussed and the Museum is leading the way forward. Chris Burton and Peter Holland attended the two meetings and I attended one for some time via a telephone conference call.
The asbestos problem with the alternator gasket has been dealt with by sealing. The asbestos-bearing fuses will be replaced with modern asbestos-free versions of otherwise identical type.
Various procedures have been put forward to the Museum on how the repairs should be completed. Before they can take place, the Museum has a number of Health and Safety issues that have been considered and will be implemented. Until that is done, repairs to the damaged wiring cannot take place. From our photographs and circuit investigation and the discovery of three blown fuses, we are hoping that replacement of burnt wires and a damaged tag block should be all that is necessary. There may be other components damaged in the PSU, but we are hopeful that the blown fuses should have protected the rest of the circuitry.
Harwell Dekatron Computer Project - Tony Frazer
There was considerable damage to the wiring on racks 4 and 5 (used for the Dekatron stores). The send/receive guide pulse and power connections run the length of the racks to all stores and across to the extension on rack 5. On both racks, these wires span exposed areas between the bays like the strings of a harp. Many had thus been broken off and others had suffered weakening near the solder joints. All such wiring on rack 5 has been replaced with single strand PVC insulated plate wire using the same insulation colour scheme. Fortunately, the laced loom running up the centre U-points appeared undamaged and has been left in place. Similar repairs will be necessary to rack 4. Spare U-points have been located on equipment in D-block which will be used to replace at least two broken units on rack 4.
In addition, 38 wires linking racks 4 and 5 which were hard-wired between the two racks had been cut at some point in order to separate racks 4 and 5. This wiring has been replaced on the rack 5 side and two Jones connectors have been attached using existing threaded holes on rack 4.
Three of the five store units have been cleaned and repopulated with tested Dekatrons - one store has been populated with GC10B dekatrons (much brighter than the original GC10As). This has almost exhausted our supply of spare GC10Bs, so we will be looking out for more of these. Three more stores remain to be done, after which the rack will be fully populated. We have sufficient GC10Bs to populate the accumulator and one option may be to populate half the stores in each of two store boxes with GC10Bs to facilitate demonstration of the machine to visitors (i.e. bright dekatrons in both send and receive stores). I have also modified a Russian OG-4 Dekatron (which appears to be readily available) as a possible substitute by changing the pinout using a spare octal base as a hard-wired adaptor.
Eddie continues to clean and adjust relays - of particular note are relays where spring sets have been deliberately kinked rather than adjusting the travel to make contact. Eddie has completed work on the arithmetic rack and is now working through the store units on rack 5.
Johan has made a start on straightening the casing and internal frame of one of the stores (with all dekatrons removed!). It had evidently been impacted causing crumpling of the interior metalwork and possible damage to components. Similar repairs were necessary on two of the other stores.
ICT 1300 Project - Rod Brown
Work continues to build the high speed interface. This is to allow us to capture the data from the magnetic tape data transfer unit, to complete the software recovery aims of this project.
The project was very pleased to be invited to present its story and the history of the 1300 series to the CCS on the 15th of October 2009. This allowed us to continue the communication about the 1301 resurrection project, which is why the website at www.ict1301.co.uk was created.
All of this activity is running against a backdrop of growing doubt about the longer term future of the project. There is pressure from Ashford Borough Council to impose and collect non-domestic rates from the building which houses the project instead of being included in the domestic rates of the farm house. This would place the running costs beyond our current budget!
We are fighting this issue as decisions which are being made at a national level by the Valuation Office Agency (an executive agency of H.M. Revenue and Customs) could affect not only the 1301 project, but set a precedent for any other restored or conserved computers held by any organisation or individual in UK within garages, sheds, barns etc.
The BCS Legal Department have provided us with a supportive statement which has been included in the letter to the Minister responsible for the Valuation Office Agency.
Three representatives from the Valuation Office Agency visited the project on 17th February. The decision they reach will determine the final outcome and be used as a precedent for any other non-domestic equipment on which other Councils may seek to levy Rates.
The BBC has visited Flossie the ICT 1301 in its home in deepest Kent and interviewed its custodians. The result was broadcast on 18th February.
Bombe Rebuild Project - John Harper
In Resurrection 49 I said that we had completed all planned construction. With hindsight this statement has turned out to be premature. Soon after Resurrection 49 went to press the team decided that they wanted the extra three German surface fleet wheels in the Typex. This, as many will recall, is the British Typex set up to operate as a German three wheel Enigma machine. This construction is now well under way and when complete will allow us to decrypt any three wheel message and demonstrate a complete Intercept to Ultra process for any combination of eight wheels.
Training of Bombe operators is now our major activity. Recruitment has got off to a good start but we still need more people interested in operating our Bombe at weekends. One thing that had not been anticipated is that many volunteers are only able to be at Bletchley about once a month. Therefore we need more volunteers and need more time from existing team members for training. Having said that, the recruits we have so far are of high calibre and they are learning faster than we had expected. This will reduce the training time for all involved. Training documentation has been produced and is being refined as we go along. This too will all help when it comes to training later volunteers.
One thing that is satisfying is that the machine is standing up well to new potential operators. We thought that mistakes might be made by trainees and that the Bombe might suffer but this concern is so far unfounded. The machine continues to do all that is asked of it and virtually all the problems encountered can be put down to human error.
We now have a new activity. Letchworth - First Garden City Heritage Museum is staging a year-long exhibition entitled Letchworth Garden City and the Second World War from 20th March 2010 - 19th March 2011. The British Tabulating Machine Company and the Bombe design and manufacture will figure prominently. As part of this the Bletchley Park Trust is loaning display material and our team is involved in lending one of the Props Bombes produced specifically for filming of Enigma. Many will recall that this was a full length film featuring Kate Winslet and other well known actors. The Props Bombe we have chosen is slightly worse for wear and needs a good tidying up before it moves to Letchworth. This is already underway. When it arrives we have to refit parts left off during transit.
Our website is still at www.jharper.demon.co.uk/bombe1.htm
News Round-Up
-101010101-
It is our sad duty to report the death at the age of 91 of Andrew Booth, one of the UK’s computer pioneers. Starting in 1945 at Birkbeck College London, he built a series of small, low cost machines. His (then) revolutionary ambition was to build a machine cheap enough that every university could afford one! This, as it turned out, satisfied a need which nobody else working in the field had identified, prefiguring the economic rationale of minicomputers by more than a decade. The APEC was adopted by the British Tabulating Machine company as the basis of its earliest computers, the HEC series, later enhanced and renamed as the ICT 1200 which became the most popular British computer in the late 1950s.
Andrew Colin, writing in Resurrection 5, recalled Booth’s development of computers at Birkbeck. Former CCS Chairman Roger Johnson’s Computer Science at Birkbeck College - A Short History also contains a great deal of interesting material. Tilly Blyth and Roger Johnson took part in a tribute to Booth on BBC Radio 4’s Last Word programme on 22nd January.
-101010101-
The recession having hit newspaper advertising revenues, Technology Guardian ceased publication in December after some 27 years. One of the last articles to be published was Jack Schofield’s excellent coverage of the visit of Sir Maurice Wilkes to The National Museum of Computing at Bletchley Park, organised by our ever-energetic chairman, David Hartley. See www.guardian.co.uk/technology/2009/nov/18/maurice-wilkes-computing-witch
Happily Technology Guardian lives on in an online edition which is hidden away at www.guardian.co.uk/technology. Liberated from its weekly publication cycle, articles are now published as they become available. Schofield has promised to try to add an entry to his popular Ask Jack column every day. Recommended.
-101010101-
The National Museum of Computing and the National Physical Laboratory have combined to create a new gallery at Bletchley Park celebrating the creation of the Internet and recalling the invention of packet switching by Donald Davies. See Martin Campbell-Kelly’s profile of Davies in Resurrection 44.
-101010101-
The UK’s first ever Vintage Computer Festival will be held in The National Museum of Computing at Bletchley Park on the 19th and 20th June. Enthusiasts and collectors from across Europe are expected to enjoy the many exhibitions, demonstrations, flea market and auction and a number of special talks and presentations. CCS secretary Kevin Murrell describes the event as “The ultimate geek version of the Antiques Roadshow!”
-101010101-
Under Tilly Blyth’s direction the Oral History of British Science programme for National Life Stories at the British Library has been conducting interviews with many of Britain’s surviving computer pioneers. These life story interviews differ from others as they tend to focus on early experiences and education, as well as the significant advances during a career. Interviews have started with Geoff Tootill and Raymond Bird, and have been agreed with many more significant pioneers. Unless interviewees request they remain confidential, the interviews will be made available to the public through the Sound Archive at the British Library. More information about the programme is available at www.bl.uk/historyofscience.
-101010101-
The BBC, in conjunction with the British Museum, is currently broadcasting a series of 100 programmes on Radio 4 which tell the stories of 100 objects. As you might expect, this ambitious series is more about antiquities than anything in living memory although Colossus is mentioned on the series website. But a spinoff is, we hear, being developed which will deal with 20th century technology. We have no news of broadcast dates as yet.
-101010101-
The CCS has renewed for another year its bulk subscription to the I.E.E.E. Annals of the History of Computing. This allows CCS members to obtain a year's copies at the discounted price of £30 - a figure which has not changed despite the financial upheavals of the past year. Members who are not already subscribers can join the scheme by contacting Hamish Carmichael.
Pioneer Profile: Michael Woodger
David Yates
Michael Woodger
Anyone working on the development of digital computers as early as 1946, as Mike Woodger was, clearly has impeccable credentials as one of the true pioneers of computing. At first an assistant to Alan Turing, no little distinction in itself, he went on to play important parts in the development and application of the Pilot ACE and ACE computers at the National Physical Laboratory, and later in the methodology, design and documentation of programming languages, especially ALGOL 60 and Ada.
Mike was born on 28th March 1923 in Epsom, Surrey, England, the eldest of four children. His father Joseph Henry Woodger was emeritus professor of biology in the University of London, and one of his particular ambitions was the construction of a foundation of mathematical logic for biology, based on Whitehead and Russell′s work on the logical basis of mathematics. He taught the young Mike symbolic logic as soon as he was able to follow it. This unusual early grounding in rigorous scientific thinking, and particularly in formal methods, evidently planted a seed which was to grow and bear fruit a hundredfold.
After graduating in mathematics from University College London in 1943 and working on military applications for the remainder of World War II, Mike joined the newly-established Mathematics Division of the National Physical Laboratory at Teddington in south-west London in May 1946. After a short period of training in numerical analysis, the main subject of the Division’s work, he was introduced to Turing’s plans for the ACE computer and von Neumann’s plans for EDVAC. He then assisted Turing with detailed logical design and experimental hardware, the latter including an abortive attempt to make an acoustic delay line store using the air column in a drainpipe. He remembers Turing as being kind, shy, and encouraging, and how they shared a common interest in logic. Eventually, after Turing’s departure in 1947 and NPL’s belated recruitment of staff with experience of pulse electronics, the development of a computer to be known as the Pilot ACE was undertaken in earnest. Mike’s main task was writing standard procedures for mathematical operations, which were successfully tested and put to use when Pilot ACE became operational in 1950. He also played a vital part in the application of Pilot ACE to practical problems. In an (unpublished) interview recorded in 1980 he says “I ended up developing an automatic system for handling matrices of reasonable size in the store . . . . . and this was very, very successful”. Indeed it was; his system allowed a whole program of matrix manipulations to be specified for the machine by someone not familiar with the machine language. It included space-saving and error checking facilities, and it was rewritten for the DEUCE and in this form widely used well into the 1960s. 
This work was important in two ways: it made the machines available to more users more efficiently and so greatly extended the range of applications to which they were put; and also, from a historical perspective, it proved to be one of the earliest practical steps on the road to modern high-level computer languages.
In similar vein, the development of the ALGOL 60 language is mentioned in the same interview, but Mike, modest chap that he is, does not stress the importance of his own contribution, which lay particularly in organising and capturing, in a precise written form, the ideas of Peter Naur and his other fellow members on the International ALGOL Committee. Mike is one of the joint authors of the ALGOL 60 report, a milestone in the history of programming languages.
At the time the interview was held in July 1980, Mike had started his second period as chairman of IFIP Working Group 2.3, the international forum for discussion of programming methodology. In this work he had stressed the idea that, since a large program is best constructed as a set of modules in such a way that modules at one level use the services of modules at the next lower level and provide a service to modules at the next higher level, the design of programming languages should reflect these same structural concepts. Each level should have its own terminology and not be cluttered with either text or concepts appropriate to other levels. His ideas, which have stood the test of time very well, are explained in more detail in ref. 1.
His next contribution, acting as consultant to a consortium led by CII - Honeywell Bull, was to the design of the language at first called Green, one of the four contenders short-listed in a competition to establish a standard language for use in the US Department of Defense. The language was renamed Ada after its success in the competition in 1979. Mike continued his contribution via Alsys S.A., a French company developing Ada documentation, both before and after his retirement from NPL in 1983. Jean Ichbiah, the head of this company, described him fittingly as “an artist of technical writing”. His part-time work for Alsys ended in 1991.
Fortunately for those interested in the history of computing, Mike has throughout his career kept documents and records systematically. When he retired from NPL the historical part of this considerable collection was moved to the library of the Science Museum in South Kensington, London, so that it could be made available to those with appropriate specialist interests.
The computing world has been fortunate that Mike saw the vacancy notice for mathematicians at NPL in 1946. It led him to a career which was to prove ideally matched to his talents: clarity of thought, ability to handle complex abstract concepts and meticulous attention to detail, qualities he has continued to show to the present day.
As published in Communications of the ACM
1 M Woodger. On semantic levels in programming. In: C V Freiman (ed.), Information Processing 71: Proc. IFIP Congress 71, held in Ljubljana, August 1971, North-Holland, 1972, pp.402-407.
Reminiscences of Whetstone Algol
Brian Randell
The co-author of ALGOL 60 Implementation, always known as Randell and Russell, muses on the origins of Whetstone ALGOL for the KDF9 and its influence. Whetstone ALGOL was inspired by a week-long school at Brighton Technical College in 1961 at which Edsger Dijkstra and Peter Naur seem to have cast a spell over everybody present (see also Tony Hoare in Resurrection 48).
I was hired from college in 1957 by English Electric to program nuclear reactor codes on DEUCE. This was at their Atomic Power Division in Whetstone, just outside Leicester. However, I first spent the summer with IBM at their main London offices, which were in Wigmore Street, behind Selfridges Department Store.
At IBM I programmed and operated a 650 computer, at the time IBM’s only computer in the UK. The 650 was proudly installed behind large plate glass windows, where it attracted great attention from the passers-by. After the sleek lines of the 650, with its simple and straightforward programming facilities, and its glossy printed documentation, the DEUCE’s workmanlike cabinetry, its binary-level programming, and its duplicated and stapled typewritten manuals, came as rather a shock to me when I arrived at Whetstone in the autumn of 1957.
The DEUCE, English Electric’s first computer, was largely based on Alan Turing’s 1945, i.e. post-Bletchley Park, plans at NPL for the ACE computer. It therefore differed greatly from just about all its contemporaries, both British and American. The ancestry of these other machines could all be traced back more or less directly to the work of von Neumann and his colleagues on the design for EDVAC at the Moore School of the University of Pennsylvania. In particular, DEUCE’s design emphasised computation speed and hardware economy. The assumption was that programmers would be willing and able to take full advantage of the machine’s very low-level form of programming. This involved working in binary, and directly controlling the timing of bringing operands to function units and of initiating the operation of these units - in effect a form of what later came to be known as microprogramming. The effect was that, in the hands of an expert programmer, DEUCE could considerably outperform other computers of basically similar hardware speed.
I and a fellow maths graduate from Imperial College, Mike Kelly, who joined Whetstone at the same time as me, soon came to regard DEUCE with affection, even as a giant toy. So we did not confine our programming to nuclear reactor codes, but instead, on our own initiative, wrote numerous demonstration programs, and started investigating how we could get DEUCE to help with some of the more detailed clerical aspects of its own programming.
DEUCE was in fact not only difficult to program, it was also very difficult to compile for. Most of the then-existing aids to programming DEUCE were interpreters rather than compilers. The effectiveness of these interpreters depended on the cost of the interpretation loop relative to the value of the individual interpreted operations. Thus NPL’s interpreter controlling execution of a set of (efficiently hand-coded) linear algebra routines was very cost effective, and being also convenient was heavily used. At the other end of the spectrum was Alphacode, an interpreter that had been developed for a simple language for floating point scalar arithmetic. This was normally used only for small one-off calculations since programs written in it were typically orders of magnitude slower than the equivalent hand-optimised machine code.
The fact that DEUCE, like all the earliest machines, did not have any built-in floating-point arithmetic was very significant. There were subroutines for the various basic floating-point arithmetic operations. But in the interests of efficiency many calculations were laboriously programmed using fixed-point arithmetic. Indeed we programmers regarded the use of floating point almost as cheating, and the coming of hardware floating point as more of a challenge than an opportunity. Against this background Mike and I came up with a compromise solution that took advantage of some peculiar DEUCE hardware features in order to produce a very fast interpreter, albeit just for fixed-point computations. (As I recall it, the main loop of our interpreter had only five instructions, whereas more conventional interpreters typically employed loops involving 50 to 100 instructions.) Despite management opposition - and indeed threats of dismissal - we pressed on with our ideas, and mainly in our own time developed a system which we called EASICODE.
EASICODE was so successful, particularly among the scientists and engineers who were the main computer users at Whetstone, that by the time the KDF9 came onto the horizon, I was in charge of a small “Automatic Programming Department”. Lawford Russell had joined me, replacing Mike Kelly who had left for pastures new at IBM (where he put his great coding skills to developing the microcode for what became the System 360/40). Moreover, there was full management support for us to exercise our skills and enthusiasms on this exciting new computer.
One important aspect of our environment was that we were working in very close proximity to the users of our programs, and had extensive experience of operating our own and other people’s programs on the DEUCE. We therefore had developed strong (self-defensive) views on the need for programs to be robust in the face of input, operator and hardware errors, and for them to provide meaningful feedback to users whenever possible. (Little did I know that such experiences and views would strongly influence most of my subsequent career, not just our plans for KDF9.)
KDF9 was a floating point machine, with a set of 16 high speed registers, organised as a small arithmetic stack or pushdown store, together with another set acting as a pushdown store for subroutine return addresses. So it was very different from DEUCE, and indeed all other computers. (Superficially it had similarities with the contemporary Burroughs B5000, but in fact it presented a very different and harder challenge to compiler writers than the B5000.) Indeed, as with DEUCE, carefully hand-coded KDF9 programs could be very efficient, but the problem of compiling comparably good code from a high-level language was far from straightforward.
Faced with this challenge, and a market where IBM’s scientific computers already had an effective optimising compiler for FORTRAN, English Electric’s Computer Division at Kidsgrove was planning a very ambitious full optimising compiler for the recently defined ALGOL 60. It was evident to us that this compiler would not be ready until some considerable time after the first KDF9s were due to become operational. And our view was that, though this compiler might produce fast code, the compiler itself was likely to be far from fast. We therefore decided to concentrate on the early provision of a programming facility that would be suitable for repeated use during program development.
Brighton Technical College
Then Lawford Russell and I, and members of the Kidsgrove team, attended a workshop at the Brighton College of Technology in 1961, organised by their Automatic Programming Information Centre. At this meeting Edsger Dijkstra described the then very new ALGOL 60 compiler for the Electrologica X1 computer that he and his colleague Jaap A. Zonneveld had implemented. At this stage Lawford and I had not decided what programming language we should support, and were showing dangerous signs of wanting to create our own. Luckily, one of the Kidsgrove group suggested that we should approach Dijkstra to find out if he would support our basing our work on his compiler. He readily agreed, and - with some difficulty - we got English Electric to send us to Amsterdam for a week to confer with him.
Our week of discussions with Dijkstra was spent not just on learning how the X1 compiler worked, but also on the design of a high speed translator/interpreter of full ALGOL 60 for the KDF9. These discussions we documented in a lengthy report. For the next few years Dijkstra used our report to defend himself from the numerous further requests he was getting from people who wanted to visit him and find out about the X1 compiler. (Having lost my own copy of this report many years ago, it was only recently that I found another one after a long search through numerous libraries and archives, and made it available on the Internet. I have also only just been kindly alerted - by Doaitse Swierstra - to the existence of a Report by F.E.J. Kruseman Aretz published in 2003 by the Centrum voor Wiskunde en Informatica, successor to the Mathematical Centre. This report contains the full assembly code text of the X1 Compiler together with that of an equivalent Pascal version.)
The X1 compiler was I believe the world’s first ALGOL compiler, certainly the first to cope with all the major new challenges of ALGOL 60, such as block structuring, call by name parameters, recursion, etc. (Indeed I recall that some of the German members of the ALGOL Committee, who were struggling to implement the new language that they had voted for, campaigned for the removal of some of these facilities from the language.)
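Of the new ALGOL 60 facilities mentioned above, call by name gave implementers particular trouble, since the actual parameter must be re-evaluated at every use of the formal one. Purely as a modern illustration (the Python rendering and all names below are mine, and bear no relation to the X1 compiler's actual machinery), the classic "Jensen's device" can be mimicked by passing zero-argument functions, or thunks:

```python
# Jensen's device, sketched with thunks. In ALGOL 60, a name parameter is
# re-evaluated on every reference; here the caller supplies a setter for
# the controlled name and a thunk for the summed expression.

def sum_by_name(set_i, lo, hi, term):
    """Sum the `term` thunk as the name i steps from lo to hi."""
    total = 0
    for v in range(lo, hi + 1):
        set_i(v)          # assignment through the name parameter...
        total += term()   # ...changes what the term expression denotes
    return total

# Sum of i*i for i = 1..5, driven entirely through the thunks.
state = {"i": 0}
result = sum_by_name(lambda v: state.update(i=v), 1, 5,
                     lambda: state["i"] ** 2)
print(result)  # 55
```

The point of the sketch is that the compiled program, like the lambdas here, must carry a little piece of executable code for each name parameter rather than a simple value or address.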
There was at this time very little published literature on the subject of ALGOL compilation. I recall there being just one book on compilers, that by Halstead on compiling NELIAC, a dialect of ALGOL 58 language. (ALGOL 58 introduced a number of the features that were extended and generalised in ALGOL 60 - for example it allowed nesting of procedures, with corresponding scoping of identifiers, but it did not have block structuring.) There had, in addition, been a recent growth in publications on formal syntax and syntax analysis (in large part spurred by the ALGOL 60 Report’s use of what became known as BNF), and on arithmetic expression translation. But we relied almost entirely on information and advice that we received from Dijkstra.
We set to, and started the detailed design of what became known as the Whetstone ALGOL Translator. We never used the term “WALGOL” ourselves, and avoided the word “compiler” because we were converting ALGOL 60 into an intermediate language, of the form that later became known as P-code, rather than into machine instructions. In effect our intermediate language defined the order code of the machine that we would have liked to have had as our target - an order code that was, I learned some years later, one of the sources of inspiration for the Burroughs 6500 computer, the successor to their B5000.
There was considerable, in general friendly, rivalry between Whetstone and Kidsgrove, but also - thanks largely to Fraser Duncan, to whom the Kidsgrove team reported - good cooperation in ensuring the compatibility of our respective systems. In any arguments over such issues we used the ALGOL 60 Report as a neutral referee, so to speak, and thus both projects stuck pretty closely to the letter of the Report.
We at Whetstone placed considerable emphasis on easing the program development task. One consequence was that we were dissatisfied by Dijkstra’s strategy of having his compiler signal the first error that it found and then refuse to carry on any further - rather we worked hard on developing a strategy whereby our translator would process the entirety of even a very error-ridden program. It did this in such a way as to have a very good chance of producing an accurate list of many of the errors contained in the program, without getting confused and reporting lots of alleged errors in perfectly good sections of program text. I still recall that when Dijkstra wrote to acknowledge the copy of the report that we had sent him describing this scheme he for the first time addressed me as “Brian”. In all our extensive previous correspondence he had always addressed me as “Mr Randell”. I felt as though I’d had just been awarded a Masters degree!
We chose to invent a flow diagram notation to use for the detailed design and documentation of our translator and interpreter. (The idea of documenting our design in ALGOL and using bootstrapping as a means of implementing it, an approach used for NELIAC, was too strange and unfamiliar an idea for us to contemplate at all seriously.) We then hand-coded the flow diagrams in the KDF9’s assembly language (“usercode”) and used a painfully slow emulator of the KDF9 that had been written for DEUCE to start the process of checking them.
But then the delivery date for the first KDF9 started slipping - just a few weeks at a time, but over and over again. These delays were very disruptive, but in fact we ended up putting them to very good use. Someone suggested that we publish the details of Whetstone ALGOL in book form. We sought Dijkstra’s views on this. He was very supportive, and gave me some very good advice, that I’ve since passed on to many graduate students. This was to the effect that a straightforward description of our system would be of interest to only a very limited readership. However, an account that documented all the possibilities we had considered for each design decision that we had taken, explained why we had made our particular design choices (admitting where necessary that a decision had been arbitrary), and reviewed the merits of these decisions in the light of our subsequent experience, would be of much greater value.
So we set out to write such a book, while we awaited the arrival of the KDF9. We worked on the book very intensively, since we had no idea how long the delays would continue, and we thought (in fact wrongly) that others elsewhere, in particular Peter Naur in Copenhagen and Gerrit van der Mey at the Dr. Neher Laboratorium in the Netherlands, were already busy preparing books on their ALGOL efforts. We completed the book in nine months, mainly in our spare time.
Our inexperience as authors resulted in our providing the publishers, Academic Press, with a much more polished manuscript than they were used to receiving. We were mindful of all the horror stories that Dijkstra had told us of errors that had been introduced into some of his publications by the manual typesetting process - indeed he urged us to insist on direct photographic reproduction of our typescript. We did not follow this advice, but instead had several typewriters modified so as to have the full ALGOL 60 character set, even including the small subscript “10” character. And we also had all numeric digits use an italic font - so that there could be no confusion between the letter “O” and the digit “0”, or between the letter “l” and the digit “1”. These precautions paid off handsomely.
Somewhere during this process, reacting to an announcement of yet another two-week delay, for fun we started to see what we could produce during this period in the way of a little ALGOL-like system for DEUCE, with one-character identifiers (associated with specific words in a 32-word mercury delay line memory). A small succession of further delays, and consequent extensions of our little toy system, ended up with our having - to our own surprise - produced a useful system. This implemented block structure, simple arithmetic expressions, assignment and for statements, procedures, simple parameters, and input/output. Quite a few people started using it in anger for small one-off calculations. But once the first KDF9 became operational at Kidsgrove we abandoned this diversion.
Meanwhile various colleagues, at Whetstone and elsewhere, started to implement our flow diagrams in order to provide full ALGOL 60 on various other machines, even including DEUCE. In so doing they provided valuable feedback on both the diagrams, and the early drafts of our book. (I recall that the chapter on blocks and display, at the time very novel and difficult to understand concepts, was rewritten at least five times. It ended up as perhaps the clearest part of the book!) As a result, when we at last got access to a more-or-less working KDF9, our implementation of Whetstone ALGOL went very smoothly. (For a while, if we ever had unexpected results when running our system on the first KDF9, it was our habit to try to reproduce them on Kidsgrove’s DEUCE emulator for KDF9. Any time the results differed we could be sure that the machine and/or the emulator were at fault, and we would gleefully hand both back to their developers while they found out which needed to be mended.)
Our system was designed to work on a minimum configuration KDF9, with 8k word memory, and to read its input directly from paper tape. Given its intended use as a program development system, we put considerable effort into ensuring that the translator operated in a single pass. And Lawford did such a good job of overlapping translation with input that the translator kept the paper tape reader operating at full speed, and there was no perceptible delay after the end of tape was reached before the translated program started running. Thus users could regard our system as essentially cost-free. (In contrast, to our great scorn, KDF9’s assembler seemed to do most of its work only after reading had finished.)
Another ALGOL compiler activity of which we were directly aware was that by Tony Hoare at Elliott Brothers. His compiler, for the Elliott 803, was in fact I believe operational before any of the compilers based on our design. However, it was for a somewhat limited version of ALGOL. Thus when Tony published his famous QuickSort algorithm in the Communications of the ACM, he included a version that avoided the use of recursion, since this was not supported by his compiler. I’m afraid that, as soon as our system was operational, we rather mischievously sent in a certification of the original recursive version of QuickSort, whose publication in effect signalled the completion of Whetstone ALGOL.
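For readers who have only ever seen the iterative workarounds, the recursive form is the natural one. A Python paraphrase follows (this is a modern sketch of the general partition-exchange idea, not the 1961 ALGOL text):

```python
# Recursive QuickSort with a Hoare-style partition exchange: scan inwards
# from both ends, swap out-of-place elements, then recurse on each side.
# In-place; returns the list for convenience.

def quicksort(a, lo=0, hi=None):
    if hi is None:
        hi = len(a) - 1
    if lo >= hi:
        return a
    pivot = a[(lo + hi) // 2]
    i, j = lo, hi
    while i <= j:
        while a[i] < pivot:
            i += 1
        while a[j] > pivot:
            j -= 1
        if i <= j:
            a[i], a[j] = a[j], a[i]
            i, j = i + 1, j - 1
    quicksort(a, lo, j)   # the recursion unavailable on the Elliott 803
    quicksort(a, i, hi)
    return a

print(quicksort([5, 3, 8, 1, 9, 2]))  # [1, 2, 3, 5, 8, 9]
```

Without recursion (and without a subroutine return stack such as the KDF9 provided in hardware), the two pending subproblems must instead be managed on an explicit stack of index pairs, which is exactly what the published non-recursive version did.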
I have already alluded to the ALGOL compiler designed by Gerrit van der Mey - probably the most impressive of the early ALGOL compiler efforts. This was not just because it was the most complete implementation of the language of which I was aware - it even handled dynamic own arrays and various other unfortunate little complications of the language that just about everybody else avoided tackling. Rather it was because van der Mey, whom I met just once when I was taken to his house by Fraser Duncan, was both totally blind and deaf. Yet despite these handicaps, he had almost singlehandedly designed his compiler, though colleagues at the Dr. Neher Laboratorium helped with such debugging as was needed. The one other ALGOL compiler I can recall knowing about in those days was an amazing effort by Brian Higman, for an incredibly small machine, an experimental process control computer being developed by G.E.C. whose total memory capacity was 512 20-bit words. Elsewhere in the UK, however, the emphasis was on Autocode rather than ALGOL, but this was a subject with which we had very little contact.
By 1964 the Whetstone System was operational, our book had been published, more compiler projects had been undertaken based on it, and I had been head-hunted by IBM to join the T.J. Watson Research Center in Yorktown Heights. I had agreed to join IBM on condition that I was not asked to continue working on ALGOL compilers - I wanted a change. They agreed, and moreover agreed to my accepting the invitation that I’d had a long time earlier to join the IFIP Working Group 2.1 on ALGOL, an invitation that English Electric had not let me accept because of the costs of the foreign travel involved. (The Working Group was formed in 1962 as a successor to the original ALGOL Committee.) So I attended the 1964 meeting of the Committee, and the associated IFIP Working Conference on “Formal Language Description Languages”, held in Baden, near Vienna. (This conference attempted, not altogether successfully, to bring together people interested in language theory and ones interested in actual programming languages and their compilation.) And thus, over the next few years, I found myself in the midst of the growing controversy over plans for a successor to ALGOL 60, culminating in being one of the small group of committee members who in 1968 wrote a Minority Report on ALGOL 68, and resigned en masse from the Committee. But that is another story.
One final anecdote: when I returned to the UK in 1969, to the Chair of Computing Science at Newcastle University, I found that Whetstone ALGOL was still the mainstay of their first-year programming courses, and so was being used very extensively on their soon-to-be-replaced KDF9. I took care to make it clear that I wanted no part in dealing with any problems that the University might have with the system, only to be told: “We wouldn′t let you touch it even if you wanted to!” Similarly, I also can claim no credit at all for the recent splendid effort by David Holdsworth and colleagues that has succeeded in getting the Whetstone ALGOL system working again, starting from a lineprinter listing of its code, on an emulator of the KDF9.
Editor’s note: This is a transcript of a talk given by the author at the Science Museum on 14 January 2010. Brian Randell is Emeritus Professor of Computer Science at the University of Newcastle and was recently made a fellow of the ACM for his outstanding contribution to computer history. He can be contacted at .
CCS Web Site Information
The Society has its own Web site, which is now located at www.computerconservationsociety.org. It contains news items, details of forthcoming events, and also electronic copies of all past issues of Resurrection, in both HTML and PDF formats, which can be downloaded for printing. We also have an FTP site at ftp.cs.man.ac.uk/pub/CCS-Archive, where there is other material for downloading including simulators for historic machines. Please note that the latter URL is case-sensitive.
On the Occasion of the 50th Anniversary of the ALGOL 60 Report
Tony Hoare
“Here is a language [ALGOL 60] so far ahead of its time that it was not only an improvement on its predecessors but also on nearly all its successors. Of particular interest are its introduction of all the main program-structuring concepts, and the simplicity and clarity of its description, rarely equalled and never surpassed.”
The quotation is from the appendix to the published version of a keynote talk that I gave at the ACM SIGPLAN Conference in Boston, October 1973. The talk put forward the view that “the primary purpose of a programming language is to help the programmer in the practice of his art... it should give him the greatest assistance in the most difficult aspects of his art, namely program design, documentation, and debugging”. The relevant criteria I gave for judging quality of a language included “simplicity, security², and readability”. Most of my talk described how several of ALGOL 60’s rivals and successor languages pursued alternative criteria.
“...I do not wish to deny that there are many other desirable properties of a programming language: for example, machine independence, stability of specification, use of familiar notations, a large and useful library, existing popularity, or sponsorship by a rich and powerful organisation.” Subsequent history has shown that these desirable properties have been dominant in the choice of programming language by the vast majority of programmers; it was the failure of the ALGOL family of languages to meet these criteria that contributed to its eclipse.
Even so, the ideas and the ideals of the ALGOL family of programming languages live on. They are most apparent in newer language designs such as Java and C#, and in the continuing stream of features that are being added to those languages. Even beyond just language design the ALGOL influence stretches to the design of Application Program Interfaces for general-purpose and special-purpose library packages. It extends also to design patterns, which are widely copied (with adaptation) into programs written in existing languages. A design pattern is an embodiment of a software architecture, or an implementation of an abstract programming concept. Effectively, the user of a design pattern is coding by hand a new structure or feature of a more abstract programming language.
There remains the objection that extensions to existing languages only build further complex superstructure on foundations that are already too complicated, and inherently insecure. So let us not abandon research to discover a language which combines the simplicity and security of ALGOL 60 with the full range of modern program-structuring principles used in libraries and design patterns: these include resource management, control of responsiveness, object orientation, distribution and concurrency. Such a language would play the same fundamental and unifying role in Computer Science as the framework of natural law plays in other branches of science. Even if the language never achieves wide usage, everyone who understands and appreciates it would thereafter be a better programmer, no matter what language they use for coding their designs. Because that, in essence, has been the really enduring achievement of ALGOL 60 itself.
² Security was defined as the property that the effect of running a program (even an incorrect one) is entirely predictable from the specification of the high level language, and does not require knowledge of the underlying machine or operating system or translator.
Editor’s note: This is a paper submitted by the author to the seminar on ALGOL 60 at the Science Museum on 14 January 2010. Professor Sir Tony Hoare is Emeritus Professor of Computer Science at Oxford University and is now a principal researcher at Microsoft Research in Cambridge. He can be contacted at .
An ALGOL 60 Spiritual User
Luca Cardelli
Today most respectable programming languages inherit from ALGOL 60. Here we consider a small selection of them and show what ideas have been copied to best effect.
ALGOL 60 was one of the most talked-about programming languages when I was an undergraduate in Pisa in the early 70’s, and it clearly had a great influence on my studies. I should make it clear right away, though, that a language with a grand total of three types (real, integer, Boolean) had no conceivable practical use for me! The avant-garde languages of the time were Simula 67 and ALGOL 68. I carefully read the grid-arranged chapters of the ALGOL 68 book in both orders, and the object-ridden Simula 67 book a few times. Although more attracted to ALGOL 68, I ended up using Simula 67 as my main programming language because it was available on my IBM mainframe. Considering the alternatives (Fortran, COBOL, PL/1, and Lisp 1.5) there was really little choice. Simula 67 had ALGOL 60 as a subset, but I was attracted to it mostly by its rich type system, garbage collection, and string processing, none of which were features of ALGOL 60. ALGOL block structures were put to new uses, as objects, inventing object-oriented programming over 10 years before it became widely popular. There is no doubt that, through Simula, I was raised in the algorithmic programming style of ALGOL 60, and on the Structured Programming approach of Dijkstra and Hoare. To the point that later in life I could never bring myself to read assembly language algorithms (i.e. Knuth’s): that must surely count as a negative influence of ALGOL 60!
The other main influence on my education was λ-calculus, and through it Lisp 1.5. I used Lisp extensively too, but I never enjoyed it because, compared to Simula it had no types, and its supposedly advanced interactive environment was actually quite bizarre. The main connection between λ-calculus and ALGOL 60 goes through Landin: I was familiar with that work too as an undergraduate. ALGOL 60 also deeply influenced my studies of denotational semantics, because some of its more challenging applications were in the semantics of imperative languages (typically ALGOL - like) through Wadsworth’s continuations.
After my graduation from Pisa, the influence of ALGOL 60 on me stopped for a while: I moved to Edinburgh and started studying various schools of formal semantics, and functional programming. I found the compilation of functional languages to be an interesting topic, and decided to implement an ML compiler. Ah, but implement it in what language? Surely not in error-prone Lisp, and ML itself was not up to the task yet. I had no longer access to Simula 67, but we had Pascal on the department’s VAX. Hence Pascal became the most ALGOL- like language I ever actually used: it had no objects or garbage collection, but was perfect for the systems programming tasks of the times, including implementing a compiler. A bit later, my first paper on ML type-checking was published with an appendix consisting of a Modula-2 version of my Pascal code for the Hindley-Milner algorithm: that was because ALGOL still had such a large influence that the reviewers rejected the “obscure” ML code that I had first written for that appendix.
The purpose of the discussion so far is to set the context of my involvement in Modula-3, because that is the only (possibly) ALGOL - like language that I helped design. Functional programming, the main subject of my research, was not up to the task of writing real systems programs. During my first job at Bell Labs, in the original Unix group, I used C extensively. By the time I moved to Digital Equipment Corporation I was glad to return to a more ALGOL - like language. Modula-2+ was a Xerox-PARC inspired extension of Modula-2 that was widely used at DEC Systems Research Centre: it had a strong list of innovations, but was in need of a redesign. The general problem to be solved then (and now) was to carry out systems programming in a well-structured and type-safe language. My main contributions to Modula-3 were in reformulating the type system of Modula-2+, using techniques coming largely from functional programming. An explicit goal of the Modula-3 committee as a whole was to be inspired by both Simula 67 and ALGOL 68 (and Modula-2 of course). For about 10 years, between Simula 67 and Java, Modula-3 became my favourite language, and my favourite working language of all times. I still regard its module system as unsurpassed, and I say this as a user rather than a designer: it made it possible to cleanly structure the large programs I was writing in ways that I have not found in other languages, including modern or functional languages.
Which brings us back to ALGOL 60. The original spirit of ALGOL 60 was the effort to structure algorithms so that they can be better understood, in contrast, for example, to structuring functions, or equations, or formulas, or machine architectures. That is what has always driven me, as a language user and designer. ALGOL 60 did not have much of a type system, but its direct successors ALGOL 68 and Simula 67 made radical and divergent contributions to structuring algorithms by types. It did not have Simula’s classes or Modula’s modules, but both evolved from program blocks as independent disciplines of programming. ALGOL 60 lacked all the most important features, and yet its spirit inspired them all. ALGOL 60 is my favourite language that I have never used, except in spirit.
Editor’s note: This is a paper submitted by the author to the seminar on ALGOL 60 at the Science Museum on 14 January 2010. Luca Cardelli is a Principal Researcher at Microsoft Research Cambridge. He heads the Programming Principles and Tools Group and his main interests are in type theory and operational semantics, mostly for applications to language design and distributed computing. He can be contacted at .
Cliff Jones
In the absence of their respective authors, both the paper above and that of Tony Hoare were read at the Science Museum by Cliff Jones of Newcastle University. The meeting was held jointly with the Advanced Programming Group of the BCS with speakers organised by the APG’s John Florentin. The CCS is grateful to everybody who contributed to a fascinating afternoon.
The Atlas ALGOL System
Bob Hopgood
To overcome the variation between dialects of ALGOL 60, the Chilton Atlas team implemented an ingenious and versatile pre-processor for the language.
I joined the programming group at Harwell in 1959 and spent much of the next three years writing quantum chemistry programs in Fortran, initially for the IBM 704 at Aldermaston and later the 7090. Source code consisted of about five 2,000-card trays and the relocatable binary about 500 cards. A typical development run was the relocatable binary card set plus the source of any revised subroutines. Initially this was sent by car to Aldermaston or plane to Risley but later an IBM 1401 link was established. In 1962 the UKAEA had 10 times the computer power of the UK academic community as a whole. In 1963 Aldermaston replaced the 7090 with an IBM 7030 (Stretch) and the 7090 moved to Risley. Unfortunately, the Fortran compiler delivered by IBM was very poor and Alick Glennie at Aldermaston assembled a team to produce three Fortran compilers in succession (S1, S2 and S3) for the 7030. S1 was a simple fast-compilation system, S2 had basic-block optimisation and S3 included global optimisation. I worked on the lexical/syntax analysis parts of S2 for the first half of 1963 before joining the new Atlas Computer Laboratory (ACL) at Chilton in September 1963.
In 1961 work started on a Fortran compilation system for the Ferranti Atlas called Hartran. This was written in Fortran and ran initially on the 7090 before being moved to Atlas.
The role of ACL was to provide additional computer facilities for the universities, Harwell and the Rutherford High Energy Laboratory. It was clear early on that the interest in ALGOL in the universities was sufficiently high that an ALGOL Compiler was needed on the Chilton Atlas that could handle programs originating from the Elliott and English Electric families as well as from Ferranti machines. Alex Bell had joined ACL from Manchester. Alex and I were given the responsibility of producing an ALGOL system on Atlas that would be of use to the UK academic community. Initially, the idea was to write yet another ALGOL compiler but after discussions with Robin Kerr (Ferranti, later ICT) and John Clegg (Manchester University) we decided to use the ALGOL compiler that they were writing for Atlas using the Brooker-Morris Compiler-Compiler system and replace the first stage by a flexible pre-processor that would allow the different ALGOL dialects in use to be input and run with their own input/output system.
The two computers for which ALGOL programs were most frequently written were the English Electric KDF9 and the Elliott 803. In each case both the input/output facilities provided and the punched form of the program were completely different from the Atlas ALGOL conventions. The 803 used five-hole paper tape, the KDF9 eight-hole, while Atlas used seven-hole! Luckily the tape reader on Atlas could read all three forms of tape. Reading University had an Elliott 803 and we were much indebted to Leonard Dresel who allowed us to come over and run tests on a Friday afternoon. In return, when the pre-processor was up and running in late 1965, we were able to run an eight-hour shift of Elliott 803 ALGOL programs through Atlas in about five minutes. To get round the lack of separate compilation facilities in the ALGOL language, the Atlas system wrapped each ALGOL program inside an outer block which had its own variable declarations and a set of procedures. This allowed the individual I/O libraries, the ALGOL library and the GROATS graphics system to be selectively added to any ALGOL program.
The pre-processor eventually had over 15 different dialects of ALGOL that it could handle, including both French and Danish dialects. KDF9 ALGOL programs could run with Elliott I/O and vice versa if needed. We introduced a card-based dialect and attempted to overcome the sheer size of the source programs by having a pseudo-binary format that encoded each ALGOL basic symbol, identifier and number as a single column on a card together with a symbol table. This made the size manageable but made editing difficult. So we added an editor to allow the pseudo-binary cards to be changed. Even so, running large ALGOL programs was always more difficult than running Fortran programs. Despite all our efforts, the ALGOL usage on Atlas was never more than about 10% of the workload compared with 65% for Fortran and the remainder spread across machine code, Mercury and Atlas Autocodes etc. Each week, Atlas read a million cards and 30 miles of paper tape!
The elegance of the ALGOL language never overcame its basic weaknesses: no defined markup, no standard I/O, no defined libraries, no sub-compilation and no relocatable binary loader. Despite all our efforts, the bulk of the scientific community stayed with Fortran despite its language deficiencies. Fortran COMMON and EQUIVALENCE statements together with sub-compilation and relocatable binary made it the more attractive option.
Editor’s note: Bob Hopgood is Emeritus Professor of Computer Science at Oxford Brookes University. He can be contacted at .
ALGOL 60 - Questions of Naming
Dik Leatherdale
Take pity upon your poor editor!
Contributions to this, the special ALGOL 60 edition of Resurrection, arrive from hither and thither, and yet there is no sign of any agreement about how to write “ALGOL”. Some write “Algol”, others “ALGOL”. It’s a short form of “Algorithmic Language” of course, but shouldn’t that be “AlgoL”? But nobody writes “AlgoL”. So we refer, of course, to Wikipedia which tells us that The name of the family is sometimes given in mixed case (Algol 60) and sometimes in all uppercase (ALGOL 68). For simplicity, this article uses ALGOL. - a ruling Wikipedia then proceeds to ignore with what we used to call “gay abandon”.
The report was originally published in the Communications of the ACM and in the Computer Journal. As you will see on page 13, Comm. ACM uses “ALGOL” which looks rather elegant. Other sources, however, do not. In the end I decided to do whatever the Computer Journal did. Our esteemed chairman looked it up for me. So “ALGOL” it is then.
But the original name (of ALGOL 58) was the International Algebraic Language - IAL for short. In the hands of Jules Schwartz, this grew into the once-popular JOVIAL - Jules’ Own Version of the International Algebraic Language, reminding us of Donald Knuth’s wise words -
The most important thing in the programming language is the name. A language will not succeed without a good name. I have recently invented a very good name and now I am looking for a suitable language.
And then again, there is another Algol - the first example to be discovered of a “binary star” - two or more stars orbiting around one another which appear at first sight to be a single celestial object. Once again, we are indebted to Wikipedia to discover that: The name Algol derives from Arabic ra′s al-ghul: head (ra’s) of the ogre (al-ghul) (“the ghoul”) which was given from its position in the constellation Perseus.
So there you have it - a programming language, a bon mot and a binary star.
So take pity upon your poor editor. This is what he does all day!
Forthcoming Events
London Seminar Programme
15 Apr 2010: Software Development at Hursley: The First 35 Years and Beyond (Geoffrey Sharman & Andy Krasun)
20 May 2010: Pegasus @ 50 (Chris Burton, Len Hewitt & others)
16 Sep 2010: Research Machines (John Leighfield)
28 Oct 2010: Computers & Communications (John Naughton & Bill Thomson)
18 Nov 2010: Konrad Zuse and the Origins of the Computer (Horst Zuse)
London meetings take place in the Director’s Suite of the Science Museum, starting at 14:30. The Director’s Suite entrance is in Exhibition Road, next to the exit from the tunnel from South Kensington Station, on the left as you come up the steps. Queries about London meetings should be addressed to Roger Johnson at , or by post to Roger at Birkbeck College, Malet Street, London WC1E 7HX.
Manchester Seminar Programme
Meetings are currently suspended while refurbishment proceeds at the Museum. It is expected that resumption will be in the autumn.
North West Group meetings take place in the Conference Room at the Manchester Museum of Science and Industry, usually starting at 17:30; tea is served from 17:00. Queries about Manchester meetings should go to Gordon Adshead at .
Details are subject to change. Members wishing to attend any meeting are advised to check the events page on the Society website at www.computerconservationsociety.org for final details which will be published in advance of each event. Details will also be published on the BCS website (in the BCS events calendar) and in the Events Diary columns of Computing and Computer Weekly.
Museums
MOSI : Demonstrations of the replica Small-Scale Experimental Machine at the Museum of Science and Industry in Manchester have been suspended due to development work in the museum. Resumption is likely to be late next summer.
Bletchley Park: daily. Guided tours and exhibitions, price £10.00, or £8.00 for concessions (children under 12, free). Exhibition of wartime code-breaking equipment and procedures, including the replica Bombe and replica Colossus, plus tours of the wartime buildings. Go to www.bletchleypark.org.uk to check details of times and special events.
The National Museum of Computing: Thursdays and Saturdays from 13:00. Entry to the Museum is included in the admission price for Bletchley Park. The Museum covers the development of computing from the wartime Colossus computer to the present day and from ICL mainframes to hand-held computers. See www.tnmoc.org for more details.
Science Museum: Pegasus “in steam” days have been suspended for the time being. Please refer to the society website for updates.
North West Group contact details
Chairman Tom Hinchliffe: Tel: 01663 765040.
Email:
Secretary Gordon Adshead Tel: 01625 549770.
Email:
Committee of the Society
[The printed version carries contact details of committee members]
Chairman Dr David Hartley FBCS CEng
Secretary & Leader, DEC Project Kevin Murrell
Treasurer Dan Hayton
Chairman, North West Group Tom Hinchliffe
Editor, Resurrection Dik Leatherdale MBCS
Web Site Editor Alan Thomson
Archivist Hamish Carmichael FBCS
Meetings Secretary Dr Roger Johnson FBCS
Digital Archivist & Leader, Our Computer Heritage Project Professor Simon Lavington FBCS FIEE CEng
Science Museum representative Dr Tilly Blyth
MOSI representative Catherine Rushmore
Codes and Ciphers Heritage Trust representative Pete Chilvers
Leader, Colossus Project Tony Sale Hon FBCS
Leader, Elliott 401 Project Chris Burton CEng FIEE FBCS
Leader, Bombe Rebuild Project John Harper Hon FBCS CEng MIEE
Leader, Elliott 803 Project John Sinclair
Leader, Pegasus Project Len Hewitt MBCS
Leader, Software Conservation Project Dr Dave Holdsworth CEng Hon FBCS
Leader, 1301 Project Rod Brown
Leader, Harwell Dekatron Computer Project Tony Frazer
Professor Martin Campbell-Kelly
Peter Holland
Dr Doron Swade CEng FBCS MBE
Point of Contact
Readers who have general queries to put to the Society should address them to the Secretary: contact details are given elsewhere. Members who move house should notify Kevin Murrell of their new address to ensure that they continue to receive copies of Resurrection. Those who are also members of the BCS should note that the CCS membership is different from the BCS list and is therefore maintained separately.
Aims and objectives
The Computer Conservation Society (CCS) is a co-operative venture between the British Computer Society, the Science Museum of London and the Museum of Science and Industry (MOSI) in Manchester.
The CCS was constituted in September 1989 as a Specialist Group of the British Computer Society (BCS). It thus is covered by the Royal Charter and charitable status of the BCS.
The aims of the CCS are:
To promote the conservation of historic computers and to identify existing computers which may need to be archived in the future,
To develop awareness of the importance of historic computers,
To develop expertise in the conservation and restoration of historic computers,
To represent the interests of Computer Conservation Society members with other bodies,
To promote the study of historic computers, their use and the history of the computer industry,
To publish information of relevance to these objectives for the information of Computer Conservation Society members and the wider public.
Membership is open to anyone interested in computer conservation and the history of computing.
The CCS is funded and supported by voluntary subscriptions from members, a grant from the BCS, fees from corporate membership, donations, and by the free use of the facilities of both museums. Some charges may be made for publications and attendance at seminars and conferences.
There are a number of active Projects on specific computer restorations and early computer technologies and software. Younger people are especially encouraged to take part in order to achieve skills transfer.
A perspective of Algol 68
Marcel van der Veer, 2024-07-03
https://jmvdveer.home.xs4all.nl/en.post.perspective-of-algol-68.html
The development of Algol played an important role in establishing computer science as an academic discipline. The former Mathematisch Centrum Amsterdam, now CWI, Centrum voor Wiskunde & Informatica, was a leading institute in the design of Algol 68, as it was in later years in the development of Python. To better understand the position of Algol 68 among today's plethora of programming languages, we should consider their development. First I will give a chronological overview discussing the period 1950-1960 in which a number of programming languages evolved, then the period 1980-1990 that saw consolidation of imperative languages and later the recombination and maturation of existing ideas in the 1990's, and finally current trends. After that, the particular history of Algol 68 is discussed, and its current position. You will of course be aware that this post is an Algol 68 implementer's perspective.
In the period 1950-1960 a number of programming languages evolved, the descendants of which are still widely used. The most notable are Fortran by Backus et al., Lisp by McCarthy et al., Cobol by Hopper et al. and Algol 60 by a committee of European and American academics including Backus. Algol 60 was particularly influential in the design of later languages since it introduced nested block structure, lexical scope, and a syntax in Backus-Naur form (BNF). Nearly all subsequent programming languages have used a variant of BNF to describe context-free syntax.
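To make the BNF point concrete, this is how the Revised Report on Algol 60 defines block structure (paraphrased from the syntax in section 4.1; the original sets begin and end in boldface):

```bnf
<block head>       ::= begin <declaration> | <block head> ; <declaration>
<compound tail>    ::= <statement> end | <statement> ; <compound tail>
<unlabelled block> ::= <block head> ; <compound tail>
<block>            ::= <unlabelled block> | <label> : <block>
```

The recursion in <block head> and <compound tail> is what lets blocks nest to any depth, and the declarations collected in a block head are exactly the identifiers whose lexical scope is that block.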
At the time of the development of Algol 68, programming languages were required to serve two purposes. They should provide concepts and statements allowing a precise formal description of computing processes and facilitate communication between programmers, and they should provide a tool to solve small to medium-sized problems without specialist help. The context of Algol 68's development is perhaps adequately illustrated by a quote [1] from Edsger Dijkstra: The intrinsic difficulty of the programming task has never been refuted … I vividly remember from the late 60's the tendency to blame the programming languages in use and to believe in all naivety that, once the proper way of communicating with the machines had been found, all programming ills would have been cured.
The early procedural programming languages served the two purposes described above. However, the evolving need to build complex interactive systems asked for decomposition of a problem into "natural" components, resulting in object oriented programming languages starting as early as the 1960's. The object oriented and procedural paradigms each have strengths and weaknesses and it is not always clear which paradigm is best suited to certain problems. In numerical and scientific computing for instance, the benefit of object oriented languages over procedural languages is controversial since in heavy number crunching, efficiency is a top priority. The latter is a likely reason that software packages coded in Fortran are operational today, despite the rise of for instance Python in this field.
The period 1960 to 1980 produced most of the major language paradigms now in use. Algol 68 was conceived as a successor to Algol 60. Its syntax and semantics became even more orthogonal and were defined by a Van Wijngaarden grammar, a formalism designed specifically for this purpose. Simula by Nygaard and Dahl was a superset of Algol 60 supporting object oriented programming, while Smalltalk by Kay, Ingalls and Kaehler, was a newly designed object oriented language. C, the Unix system programming language, was developed by Ritchie and Thompson at Bell Laboratories between 1969 and 1973. Prolog by Colmerauer, Roussel, and Kowalski was the first logic programming language. ML by Milner built a polymorphic type system on top of Lisp, pioneering statically typed functional programming languages. Each of these languages spawned a family of descendants, and most modern languages count at least one of them in their ancestry. Other important languages that were developed in this period include Pascal, Forth, Scheme and SQL.
During the 1980's imperative languages consolidated. Although major new paradigms for imperative languages did not appear, many researchers elaborated on existing ideas from the 1970's, for example object oriented programming, and adapted them to new contexts, for example to distributed systems. C++ combined object oriented programming and system programming. The United States government standardised Ada as a system programming language for defense contractors. Mainly in Japan major efforts were spent investigating so-called fifth-generation programming languages that incorporated logic programming constructs. The functional languages community standardised ML and Lisp. Research in Miranda, a functional language with lazy evaluation, began in this decade. An important trend in 1980's language design was increased focus on programming large-scale systems through the use of modules, reflected in the development of Modula, Ada and ML. Some other notable languages from the 1980's are Objective C and Perl.
In the 1990's recombination and maturation of existing ideas continued. An important motivation in this period was productivity. Many rapid application development (RAD) languages emerged, which usually were descendants of older, typically object oriented, languages that were equipped with an IDE and garbage collection. These languages included Object Pascal, Visual Basic, and Java. Java in particular received much attention. More radical and innovative were new scripting languages. These did not directly descend from other languages and featured new syntax and liberal incorporation of features. Many consider these scripting languages as more productive than RAD languages, though others will argue that scripting languages may make small programs simpler but large programs are more difficult to write and maintain. Nevertheless, scripting languages came to be the most prominent ones used in connection with the internet. Some important languages that were developed in the 1990's are Haskell, Python and PHP.
Some current trends in programming languages are mechanisms for security and reliability verification, alternative mechanisms for modularity, component-oriented software development, constructs to support concurrent and distributed programming, metaprogramming, and integration with databases. This century has so far seen the introduction of, for example, C#, Visual Basic.NET and Go.
Algol, ALGOrithmic Language, is a family of imperative computer programming languages which greatly influenced many other languages and became the de facto way algorithms were described in textbooks and academic works for almost three decades. The two specifications relevant to this publication are Algol 60, revised in 1963, and Algol 68, revised in 1976. Algol 58, originally known as IAL (International Algebraic Language), was an early member of the Algol family soon superseded by Algol 60. Algol 58 introduced a compound statement which was restricted to flow of control only and did not relate to lexical scope as do Algol 60's blocks.
Ideally, a programming language supports systematic expression of algorithms by offering appropriate control structures and data structures, and a precise, consistent formal definition to avoid surprises and portability issues resulting from details that are implementation dependent. Members of the Algol family (Algol 60 and Algol 68, Simula, Pascal and also Ada, …) are considered reasonable approximations of such "ideal" languages, although all of them have strong points as well as disadvantages. Algol 68 offers appropriate means of abstraction and exemplary control structures that lead to a good understanding of structured programming. Its orthogonality results in economic use of language constructs making it a beautiful tool for programming.
The design of Algol was firmly rooted in the computing community, at the time a term for the small but growing group of computer professionals and scientists. It formed an international platform for discussing programming languages, compiler construction, program construction, et cetera, and thus Algol had an important part in establishing computer science as an academic discipline in its own right. Algol 60 was designed by and for numerical mathematicians; in its day it was the Lingua Franca of computer science. The language introduced block structure with lexical scope and a concise BNF definition that were appreciated by people with a background in mathematics, but it lacked compilers and industrial support, which gave the advantage to languages such as Fortran and Cobol. To promote Algol, its application range had to be extended. IFIP [2] Working Group 2.1 Algorithmic Languages and Calculi (WG 2.1), which to this day has continuing responsibility for Algol 60 and Algol 68, assumed the task of developing a successor to Algol 60.
In the early 1960's WG 2.1 discussed this successor and in 1965 descriptions of a language Algol X based on these discussions were invited. This resulted in various language proposals by Wirth, Seegmüller and Van Wijngaarden [3] and other significant contributions by Hoare and Naur. Van Wijngaarden's paper Orthogonal design and description of a formal language [4] featured a new technique for language design and definition and formed the basis for what would develop into Algol 68. Many features found in Algol 68 were first proposed in Algol Bulletin by the original authors of Algol 60 like Peter Naur, by new members of WG 2.1 like Tony Hoare and Niklaus Wirth, and by many others from the world-wide computing community.
Reference [5] gives a first hand account of the events leading to Algol 68. Algol 68 has had a large influence on the development of programming languages since it addressed many issues; for example orthogonality, a strong type system, procedures as types, memory management, treatment of arrays, a rigorous description of syntax, and parallel processing, but also ideas that caused debate over the years such as context-sensitive coercions and quite complicated input-output formatting. After various, apparently lively, meetings WG 2.1 had not reached unanimous consent. Eventually, Algol 68 was produced by those members wanting a new milestone in language design. Dijkstra, Hoare, Seegmüller and others wrote a brief "Minority Report", more a letter than a report actually, outlining their view on what Algol 60's successor should have been; many years later it was commented that no programming language developed since, would have satisfied that vision. Yet others, for example Wirth and Hoare, opted to shorten the development cycle by improving Algol 60, which eventually produced Algol W and later Pascal.
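One of the ideas listed above, procedures as types, meant that in Algol 68 a procedure is an ordinary value with its own mode (type), which can be stored in a variable, passed as a parameter, or returned as a result. Since Algol 68 compilers are no longer widely available, here is a rough modern analogue sketched in Python; the names are illustrative and not taken from the Revised Report:

```python
# In Algol 68 a PROC (REAL) REAL is a first-class value: it can be
# assigned, passed, and returned like any REAL. The same idea in Python:

def compose(f, g):
    """Return the procedure x -> f(g(x)); procedures are ordinary values."""
    return lambda x: f(g(x))

def twice(x):
    return 2 * x

def incr(x):
    return x + 1

# A procedure value held in a variable, built from other procedure values:
double_then_incr = compose(incr, twice)
print(double_then_incr(10))  # 21
```

In Algol 68 terms, `compose` would both accept and yield values of mode PROC (REAL) REAL.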
Algol 60 syntax is in BNF form whereas Algol 68 syntax is described by a two-level W-grammar (W for Van Wijngaarden) that can define a context-sensitive grammar. Simply put, in a W-grammar a high-level grammar produces a low-level grammar for all correct particular programs. The concept of context-sensitive grammar was introduced by Chomsky in the 1950's to describe the syntax of natural language where a word may or may not be appropriate in a certain position, depending on context. Analogously, Algol 68 syntax defines syntactic restrictions; for example demanding that applied-identifiers or operators be declared (but not necessarily before they are applied), or demanding that modes result in finite objects that require finite coercion, et cetera. To reject incorrect programs, a context-free syntax must be complemented with extra rules formulated in natural language. The rigorous Algol 68 specification had a few loose ends, while the C specification had a few hundred.
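To make the context-sensitivity concrete: a context-free grammar can describe the shape of declarations and applications, but not the cross-cutting rule that every applied identifier has a matching declaration somewhere in its range. Algol 68's W-grammar builds that rule into the syntax itself; conventional compilers instead enforce it in a separate pass over a context-free parse. The following toy checker (purely illustrative, not Algol 68 syntax) sketches such a pass in Python:

```python
# A separate "semantic" pass over an already-tokenised program: collect
# all declared identifiers, then flag any applied identifier without a
# declaration. Because declarations may follow their first application,
# the whole range is collected before checking.

def undeclared(declared_ids, applied_ids):
    """Return applied identifiers that lack a declaration, sorted."""
    return sorted(set(applied_ids) - set(declared_ids))

decls = ["x", "y"]
uses = ["y", "x", "z"]
print(undeclared(decls, uses))  # ['z']
```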
Probably because of the formal character of the Revised Report, which takes some study to comprehend, the misconception took hold that Algol 68 is a complex language. In fact it is rather lean, and the alleged obscurity of description is toned down by most who studied it [5]. Since it is argued that the specification of many contemporary languages, including that of modern C, is more complex than that of Algol 68 [6], one can ask whether the latter just looked complex at the time, fifty years ago.
Algol 68 was defined in a formal document, first published in January 1969, and later published in Acta Informatica and also printed in Sigplan Notices. A Revised Report was issued in 1976 [7]. Algol 68 was the first major language for which a full formal definition was made before it was implemented. Though known to be terse, the Revised Report does contain humour solis sacerdotibus. To quote Koster from [5]: The strict and sober syntax permits itself small puns, as well as a liberal use of portmanteau words. Transput is input or output. 'Stowed' is the word for structured or rowed. Hipping is the coercion for the hop, skip and jump. MOID is MODE or void. All metanotions ending on ETY have an empty production. Just reading aloud certain lines of the syntax, slightly raising the voice for capitalized words, conveys a feeling of heroic and pagan fun (…) Such lines cannot be read or written with a straight face.
Algol 68 was designed for programmers, not for compiler writers, in a time when the field of compiler construction was not as advanced as it is today. Implementation efforts based on formal methods generally failed; Algol 68's context-sensitive grammar required some invention to parse [8]. At the time compilers usually were made available on mainframes by computing centres, which may explain why Algol 68 was popular in locations rather than areas, for instance Amsterdam, Berlin or Cambridge. It was relatively popular in the United Kingdom, where the ALGOL68R , ALGOL68RS and ALGOL68C compilers were developed. Commercial initiatives were relatively unsuccessful; for instance the FLACC compiler sold just twenty-two copies [9].
Industry did not adopt Algol 68, with a few exceptions. There may be as many opinions on this as there are experts you might ask. IBM would not support it (though CDC did, on demand of academia), it was not as fast as Fortran (but might have been), there was no community outside academia, implementations were lagging … All too often, the timing must be right, the environment must be open to acceptance and adequate marketing should be in place, for an innovation to have a chance. But this is my two cents.
Once university mainframes were phased out, the compilers on them were decommissioned. For scientific work, many migrated in the 1990's from mainframes to workstations that did not have Algol 68. In fact, my leaving academia in 1991 led me to write my own Algol 68 implementation so I could continue programming in it.
Algol 68 apparently influenced many who were exposed to it, and it may still do so. A few individuals wonder, even today, whether it would have a future. In my opinion, the world has moved on. Although the language was innovative in its days, development stopped so it reflects the era it was developed in. Probably there is a newer language like Python that better suits your modern application on modern infrastructure, and has a large community. Algol 68 will not catch up anymore and further development would probably convert it into an anachronism. On the other hand, Algol 68 appears very familiar to young people, which is a testament to the influence it had on how we think about programming languages.
Therefore the Algol 68 Genie project preserves Algol 68 out of educational as well as scientific-historical interest. Those interested in the language may study it and actually use it to understand the language and the influence it had, and experience what computer science and computing was like in its pioneering years, half a century ago. Considering the reactions to the project, I think it is fair to say that it succeeds in this purpose. Of course, people also use it for programming, for instance in numerical mathematics as I like to do. Despite its age, even today Algol 68 is a beautiful means for writing algorithms.
Notes
[1] Transcript from keynote delivered at the ACM 1984 South Central Regional Conference. Source: E. W. Dijkstra Archive, the manuscripts of Edsger W. Dijkstra.
[2] IFIP, the International Federation for Information Processing is an umbrella organisation for national information processing organisations. It was established in 1960 under the auspices of UNESCO.
[3] Adriaan van Wijngaarden (1916-1987) is considered by many to be the founding father of computer science in the Netherlands. He was co-founder of IFIP and one of the designers of Algol 60 and later Algol 68. As leader of the Algol 68 committee, he made a profound contribution to the field of programming language design, definition and description.
[4] A. van Wijngaarden, Orthogonal design and description of a formal language. Available online from CWI.
[5] Proceedings, Conference on the history of Algol 68 [1993]. Available online from CWI.
[6] K. Henney. Procedural Programming: It's Back? It Never Went Away. Available for viewing on YouTube.
[7] A. van Wijngaarden et al., Revised report on the algorithmic language Algol 68 [1976]. Available online from CWI, or in HTML as a previous blog post, or as part of Learning Algol 68 Genie .
[8] B. Mailloux, On the implementation of Algol 68 [1968]. Available online from CWI. Algol 68 Genie employs a multi-pass scheme to parse Algol 68. Extracting declarations before parsing as proposed by Mailloux, renders the W-grammar LALR.
[9] Source: Chris Thomson, formerly with Chion Corporation, on comp.lang.misc [1988].
The ALGOL Programming Language
Significant Language Features
ALGOL was the first second-generation programming language and its characteristics are typical of the entire generation. First consider the data structures, which are very close to first-generation structures. In ALGOL 60 the block structure was introduced: the ability to create blocks of statements for the scope of variables and the extent of influence of control statements. Along with that came two different means of passing parameters to subprograms: call by value and call by name. Structured control statements (if-then-else and the use of a general condition for iteration control) were also features, as was the concept of recursion: the ability of a procedure to call itself.
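The difference between the two parameter mechanisms is that call by name re-evaluates the actual-parameter expression at every use inside the procedure, which enables the classic trick known as Jensen's device. Python passes by value (of references), but zero-argument closures can mimic the behaviour; this is a hedged sketch of the idea, not how an ALGOL compiler actually implements thunks:

```python
# Jensen's device: pass the index variable and an expression over it by
# name; the expression is re-evaluated for each value of the index.
# Here the "name" i is a one-element list, and the expression is a
# zero-argument closure (thunk).

def jensen_sum(i_ref, lo, hi, term):
    """SUM(i, lo, hi, term): evaluate `term` afresh for i = lo..hi."""
    total = 0
    for v in range(lo, hi + 1):
        i_ref[0] = v       # assign through the "name" i
        total += term()    # re-evaluate the expression each time
    return total

i = [0]  # a mutable cell standing in for the by-name variable i
squares = jensen_sum(i, 1, 5, lambda: i[0] * i[0])
print(squares)  # 1 + 4 + 9 + 16 + 25 = 55
```

The same `jensen_sum` works for any expression over `i`, which is exactly what made call by name so expressive (and so hard to compile efficiently).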
One of the greatest impacts ALGOL 60 had was a result of its description as found in Naur (1963). A major contribution of this report was the introduction of BNF notation for defining the syntax of the language. Overall, ALGOL is considered to be perhaps the most orthogonal programming language, meaning it has a relatively small number of basic constructs and a set of rules for combining those constructs. Every construct has a type associated with it and there are no restrictions on those types. In addition, most constructs produce values, and several of ALGOL's other characteristics were notable as well.
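To illustrate the BNF notation itself, here is a hypothetical miniature grammar in that style (not taken from the 1963 report) together with a recursive-descent recogniser that mirrors its rules, sketched in Python:

```python
# A BNF definition in the style Naur introduced, for a toy slice of
# expressions (hypothetical, for illustration only):
#
#   <expr>  ::= <term> | <expr> + <term>
#   <term>  ::= <digit> | ( <expr> )
#   <digit> ::= 0 | 1 | ... | 9
#
# Each rule maps onto a parsing procedure; the left-recursive <expr>
# rule is rewritten as iteration.

def parse_expr(s, pos=0):
    pos = parse_term(s, pos)
    while pos < len(s) and s[pos] == "+":
        pos = parse_term(s, pos + 1)
    return pos

def parse_term(s, pos):
    if pos < len(s) and s[pos] == "(":
        pos = parse_expr(s, pos + 1)
        if pos >= len(s) or s[pos] != ")":
            raise SyntaxError("expected ')'")
        return pos + 1
    if pos < len(s) and s[pos].isdigit():
        return pos + 1
    raise SyntaxError(f"unexpected input at position {pos}")

def accepts(s):
    """True if the whole string derives from <expr>."""
    try:
        return parse_expr(s) == len(s)
    except SyntaxError:
        return False

print(accepts("1+(2+3)"))  # True
print(accepts("1+"))       # False
```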
The History of Computing: ALGOL
Today we’re going to cover a computer programming language many might not have heard of, ALGOL.
ALGOL was written in 1958. It wasn’t like many of the other languages in that it was built by committee. The Association for Computing Machinery and the German Society of Applied Mathematics and Mechanics were floating around ideas for a universal computer programming language.
Members from the ACM were a who’s who of people influential in the transition from custom computers that were the size of small homes to mainframes. John Backus of IBM had written a programming language called Speedcoding and then Fortran. Joseph Wegstein had been involved in the development of COBOL. Alan Perlis had been involved in Whirlwind and was with the Carnegie Institute of Technology. Charles Katz had worked with Grace Hopper on UNIVAC and FLOW-MATIC.
The Germans were equally as influential. Friedrich Bauer had brought us the stack method while at the Technical University of Munich. Hermann Bottenbruch from the Institute for Applied Mathematics had written a paper on constructing languages. Klaus Samelson had worked on a computer called PERM that was similar to the MIT Whirlwind project. He'd come into computing while studying Eigenvalues.
Heinz Rutishauser had written a number of papers on programming techniques and had codeveloped the language Superplan while at the Swiss Federal Institute of Technology. This is where the meeting would be hosted.
They went from May 27th to June 2nd in 1958 and initially called the language they would develop IAL, or the International Algebraic Language, but later expanded the name to ALGOL, short for Algorithmic Language. They brought us code blocks, the concept that you have a pair of words or symbols that would begin and end a stanza of code, like begin and end. They introduced nested scoped functions. They wrote the whole language right there. You would name a variable by simply saying integer or setting the variable as a := 1. You would substantiate a for and define the steps to perform until - the root of what we would now call a for loop. You could read a variable in from a punch card. It had built-in SIN and COSIN. It was line based and fairly simple functional programming by today's standards. They defined how to handle special characters, built boolean operators, floating point notation. It even had portable types.
And by the end had a compiler that would run on the Z22 computer from Konrad Zuse. While some of Backus’ best work it effectively competed with FORTRAN and never really gained traction at IBM. But it influenced almost everything that happened afterwards.
Languages were popping up all over the place and in order to bring in more programmers, they wanted a formalized way to allow languages to flourish, but with a standardized notation system so algorithms could be published and shared and developers could follow along with logic. One outcome of the ALGOL project was the Backus–Naur form, which was the first such standardization. That would be expanded by Danish Peter Naur for ALGOL 60, thus the name.
In ALGOL 60 they would meet in Paris, also adding Father John McCarthy, Julien Green, Bernard Vauquois, Adriaan van Wijngaarden, and Michael Woodger. It got refined, yet a bit more complicated. FORTRAN and COBOL use continued to rage on, but academics loved ALGOL.
And the original implementation now referred to as the ZMMD implementation, gave way to X1 ALGOL, Case ALGOL, ZAM in Poland, GOGOL, VALGOL, RegneCentralen ALGOL, Whetstone ALGOL for physics, Chinese ALGOL, ALGAMS, NU ALGOL out of Norway, ALGEK out of Russia, Dartmouth ALGOL, DG/L, USS 90 Algol, Elliot ALGOL, the ALGOL Translator, Kidsgrove Algol, JOVIAL, Burroughs ALGOL, Niklaus Wirth's ALGOL W, which led to Pascal, MALGOL, and the last would be S-algol in 1979.
But it got overly complicated and overly formal. Individual developers wanted more flexibility here and there. Some wanted simpler languages. Some needed more complicated languages. ALGOL didn’t disappear as much as it evolved into other languages. Those were coming out fast and with a committee to approve changes to ALGOL, they were much slower to iterate.
You see, ALGOL profoundly shaped how we think of programming languages. That formalization was critical to paving the way for generations of developers who brought us future languages. ALGOL would end up being the parent of CPL and through CPL, BCPL, C, C++, and through that Objective-C. From ALGOL also sprang Simula and through Simula, Smalltalk. And Pascal and from there, Modula and Delphi. It was only used for a few years but it spawned so much of what developers use to build software today.
In fact, other languages evolved as anti-ALGOL derivatives, looking at how you did something and deciding to do it totally differently.
And so we owe this crew our thanks. They helped to legitimize a new doctrine, a new career, computer programmer. They inspired. They coded. And in so doing, they helped bring us into the world of functional programming and set structures that allowed the next generation of great thinkers to go even further, directly influencing people like Adele Goldberg and Alan Kay.
And it’s okay that the name of this massive contribution is mostly lost to the annals of history. Because ultimately, the impact is not. So think about this - what can we do to help shape the world we live in? Whether it be through raw creation, iteration, standardization, or formalization - we all have a role to play in this world. I look forward to hearing more about yours as it evolves!
E.W.Dijkstra Archive: The Humble Programmer (EWD 340)
The Humble Programmer
by
Edsger W. Dijkstra
As a result of a long sequence of coincidences I entered the programming profession officially on the first spring morning of 1952 and as far as I have been able to trace, I was the first Dutchman to do so in my country. In retrospect the most amazing thing was the slowness with which, at least in my part of the world, the programming profession emerged, a slowness which is now hard to believe. But I am grateful for two vivid recollections from that period that establish that slowness beyond any doubt.
After having programmed for some three years, I had a discussion with A. van Wijngaarden, who was then my boss at the Mathematical Centre in Amsterdam, a discussion for which I shall remain grateful to him as long as I live. The point was that I was supposed to study theoretical physics at the University of Leiden simultaneously, and as I found the two activities harder and harder to combine, I had to make up my mind, either to stop programming and become a real, respectable theoretical physicist, or to carry my study of physics to a formal completion only, with a minimum of effort, and to become....., yes what? A programmer? But was that a respectable profession? For after all, what was programming? Where was the sound body of knowledge that could support it as an intellectually respectable discipline? I remember quite vividly how I envied my hardware colleagues, who, when asked about their professional competence, could at least point out that they knew everything about vacuum tubes, amplifiers and the rest, whereas I felt that, when faced with that question, I would stand empty-handed. Full of misgivings I knocked on van Wijngaarden’s office door, asking him whether I could “speak to him for a moment”; when I left his office a number of hours later, I was another person. For after having listened to my problems patiently, he agreed that up till that moment there was not much of a programming discipline, but then he went on to explain quietly that automatic computers were here to stay, that we were just at the beginning and could not I be one of the persons called to make programming a respectable discipline in the years to come? This was a turning point in my life and I completed my study of physics formally as quickly as I could. One moral of the above story is, of course, that we must be very careful when we give advice to younger people; sometimes they follow it!
Another two years later, in 1957, I married and Dutch marriage rites require you to state your profession and I stated that I was a programmer. But the municipal authorities of the town of Amsterdam did not accept it on the grounds that there was no such profession. And, believe it or not, but under the heading “profession” my marriage act shows the ridiculous entry “theoretical physicist”!
So much for the slowness with which I saw the programming profession emerge in my own country. Since then I have seen more of the world, and it is my general impression that in other countries, apart from a possible shift of dates, the growth pattern has been very much the same.
Let me try to capture the situation in those old days in a little bit more detail, in the hope of getting a better understanding of the situation today. While we pursue our analysis, we shall see how many common misunderstandings about the true nature of the programming task can be traced back to that now distant past.
The first automatic electronic computers were all unique, single-copy machines and they were all to be found in an environment with the exciting flavour of an experimental laboratory. Once the vision of the automatic computer was there, its realisation was a tremendous challenge to the electronic technology then available, and one thing is certain: we cannot deny the courage of the groups that decided to try and build such a fantastic piece of equipment. For fantastic pieces of equipment they were: in retrospect one can only wonder that those first machines worked at all, at least sometimes. The overwhelming problem was to get and keep the machine in working order. The preoccupation with the physical aspects of automatic computing is still reflected in the names of the older scientific societies in the field, such as the Association for Computing Machinery or the British Computer Society, names in which explicit reference is made to the physical equipment.
What about the poor programmer? Well, to tell the honest truth: he was hardly noticed. For one thing, the first machines were so bulky that you could hardly move them and besides that, they required such extensive maintenance that it was quite natural that the place where people tried to use the machine was the same laboratory where the machine had been developed. Secondly, his somewhat invisible work was without any glamour: you could show the machine to visitors and that was several orders of magnitude more spectacular than some sheets of coding. But most important of all, the programmer himself had a very modest view of his own work: his work derived all its significance from the existence of that wonderful machine. Because that was a unique machine, he knew only too well that his programs had only local significance and also, because it was patently obvious that this machine would have a limited lifetime, he knew that very little of his work would have a lasting value. Finally, there is yet another circumstance that had a profound influence on the programmer’s attitude to his work: on the one hand, besides being unreliable, his machine was usually too slow and its memory was usually too small, i.e. he was faced with a pinching shoe, while on the other hand its usually somewhat queer order code would cater for the most unexpected constructions. And in those days many a clever programmer derived an immense intellectual satisfaction from the cunning tricks by means of which he contrived to squeeze the impossible into the constraints of his equipment.
Two opinions about programming date from those days. I mention them now, I shall return to them later. The one opinion was that a really competent programmer should be puzzle-minded and very fond of clever tricks; the other opinion was that programming was nothing more than optimizing the efficiency of the computational process, in one direction or the other.
The latter opinion was the result of the frequent circumstance that, indeed, the available equipment was a painfully pinching shoe, and in those days one often encountered the naive expectation that, once more powerful machines were available, programming would no longer be a problem, for then the struggle to push the machine to its limits would no longer be necessary and that was all what programming was about, wasn’t it? But in the next decades something completely different happened: more powerful machines became available, not just an order of magnitude more powerful, even several orders of magnitude more powerful. But instead of finding ourselves in the state of eternal bliss of all programming problems solved, we found ourselves up to our necks in the software crisis! How come?
There is a minor cause: in one or two respects modern machinery is basically more difficult to handle than the old machinery. Firstly, we have got the I/O interrupts, occurring at unpredictable and irreproducible moments; compared with the old sequential machine that pretended to be a fully deterministic automaton, this has been a dramatic change and many a systems programmer’s grey hair bears witness to the fact that we should not talk lightly about the logical problems created by that feature. Secondly, we have got machines equipped with multi-level stores, presenting us problems of management strategy that, in spite of the extensive literature on the subject, still remain rather elusive. So much for the added complication due to structural changes of the actual machines.
But I called this a minor cause; the major cause is... that the machines have become several orders of magnitude more powerful! To put it quite bluntly: as long as there were no machines, programming was no problem at all; when we had a few weak computers, programming became a mild problem, and now we have gigantic computers, programming has become an equally gigantic problem. In this sense the electronic industry has not solved a single problem, it has only created them, it has created the problem of using its products. To put it in another way: as the power of available machines grew by a factor of more than a thousand, society’s ambition to apply these machines grew in proportion, and it was the poor programmer who found his job in this exploded field of tension between ends and means. The increased power of the hardware, together with the perhaps even more dramatic increase in its reliability, made solutions feasible that the programmer had not dared to dream about a few years before. And now, a few years later, he had to dream about them and, even worse, he had to transform such dreams into reality! Is it a wonder that we found ourselves in a software crisis? No, certainly not, and as you may guess, it was even predicted well in advance; but the trouble with minor prophets, of course, is that it is only five years later that you really know that they had been right.
Then, in the mid-sixties, something terrible happened: the computers of the so-called third generation made their appearance. The official literature tells us that their price/performance ratio has been one of the major design objectives. But if you take as “performance” the duty cycle of the machine’s various components, little will prevent you from ending up with a design in which the major part of your performance goal is reached by internal housekeeping activities of doubtful necessity. And if your definition of price is the price to be paid for the hardware, little will prevent you from ending up with a design that is terribly hard to program for: for instance the order code might be such as to enforce, either upon the programmer or upon the system, early binding decisions presenting conflicts that really cannot be resolved. And to a large extent these unpleasant possibilities seem to have become reality.
When these machines were announced and their functional specifications became known, quite a few among us must have become quite miserable; at least I was. It was only reasonable to expect that such machines would flood the computing community, and it was therefore all the more important that their design should be as sound as possible. But the design embodied such serious flaws that I felt that with a single stroke the progress of computing science had been retarded by at least ten years: it was then that I had the blackest week in the whole of my professional life. Perhaps the most saddening thing now is that, even after all those years of frustrating experience, still so many people honestly believe that some law of nature tells us that machines have to be that way. They silence their doubts by observing how many of these machines have been sold, and derive from that observation the false sense of security that, after all, the design cannot have been that bad. But upon closer inspection, that line of defense has the same convincing strength as the argument that cigarette smoking must be healthy because so many people do it.
It is in this connection that I regret that it is not customary for scientific journals in the computing area to publish reviews of newly announced computers in much the same way as we review scientific publications: to review machines would be at least as important. And here I have a confession to make: in the early sixties I wrote such a review with the intention of submitting it to the CACM, but in spite of the fact that the few colleagues to whom the text was sent for their advice, urged me all to do so, I did not dare to do it, fearing that the difficulties either for myself or for the editorial board would prove to be too great. This suppression was an act of cowardice on my side for which I blame myself more and more. The difficulties I foresaw were a consequence of the absence of generally accepted criteria, and although I was convinced of the validity of the criteria I had chosen to apply, I feared that my review would be refused or discarded as “a matter of personal taste”. I still think that such reviews would be extremely useful and I am longing to see them appear, for their accepted appearance would be a sure sign of maturity of the computing community.
The reason that I have paid the above attention to the hardware scene is because I have the feeling that one of the most important aspects of any computing tool is its influence on the thinking habits of those that try to use it, and because I have reasons to believe that that influence is many times stronger than is commonly assumed. Let us now switch our attention to the software scene.
Here the diversity has been so large that I must confine myself to a few stepping stones. I am painfully aware of the arbitrariness of my choice and I beg you not to draw any conclusions with regard to my appreciation of the many efforts that will remain unmentioned.
In the beginning there was the EDSAC in Cambridge, England, and I think it quite impressive that right from the start the notion of a subroutine library played a central role in the design of that machine and of the way in which it should be used. It is now nearly 25 years later and the computing scene has changed dramatically, but the notion of basic software is still with us, and the notion of the closed subroutine is still one of the key concepts in programming. We should recognise the closed subroutines as one of the greatest software inventions; it has survived three generations of computers and it will survive a few more, because it caters for the implementation of one of our basic patterns of abstraction. Regrettably enough, its importance has been underestimated in the design of the third generation computers, in which the great number of explicitly named registers of the arithmetic unit implies a large overhead on the subroutine mechanism. But even that did not kill the concept of the subroutine, and we can only pray that the mutation won’t prove to be hereditary.
The second major development on the software scene that I would like to mention is the birth of FORTRAN. At that time this was a project of great temerity and the people responsible for it deserve our great admiration. It would be absolutely unfair to blame them for shortcomings that only became apparent after a decade or so of extensive usage: groups with a successful look-ahead of ten years are quite rare! In retrospect we must rate FORTRAN as a successful coding technique, but with very few effective aids to conception, aids which are now so urgently needed that time has come to consider it out of date. The sooner we can forget that FORTRAN has ever existed, the better, for as a vehicle of thought it is no longer adequate: it wastes our brainpower, is too risky and therefore too expensive to use. FORTRAN’s tragic fate has been its wide acceptance, mentally chaining thousands and thousands of programmers to our past mistakes. I pray daily that more of my fellow-programmers may find the means of freeing themselves from the curse of compatibility.
The third project I would not like to leave unmentioned is LISP, a fascinating enterprise of a completely different nature. With a few very basic principles at its foundation, it has shown a remarkable stability. Besides that, LISP has been the carrier for a considerable number of in a sense our most sophisticated computer applications. LISP has jokingly been described as “the most intelligent way to misuse a computer”. I think that description a great compliment because it transmits the full flavour of liberation: it has assisted a number of our most gifted fellow humans in thinking previously impossible thoughts.
The fourth project to be mentioned is ALGOL 60. While up to the present day FORTRAN programmers still tend to understand their programming language in terms of the specific implementation they are working with —hence the prevalence of octal and hexadecimal dumps—, while the definition of LISP is still a curious mixture of what the language means and how the mechanism works, the famous Report on the Algorithmic Language ALGOL 60 is the fruit of a genuine effort to carry abstraction a vital step further and to define a programming language in an implementation-independent way. One could argue that in this respect its authors have been so successful that they have created serious doubts as to whether it could be implemented at all! The report gloriously demonstrated the power of the formal method BNF, now fairly known as Backus-Naur-Form, and the power of carefully phrased English, at least when used by someone as brilliant as Peter Naur. I think that it is fair to say that only very few documents as short as this have had an equally profound influence on the computing community. The ease with which in later years the names ALGOL and ALGOL-like have been used, as an unprotected trade mark, to lend some of its glory to a number of sometimes hardly related younger projects, is a somewhat shocking compliment to its standing. The strength of BNF as a defining device is responsible for what I regard as one of the weaknesses of the language: an over-elaborate and not too systematic syntax could now be crammed into the confines of very few pages. With a device as powerful as BNF, the Report on the Algorithmic Language ALGOL 60 should have been much shorter. Besides that I am getting very doubtful about ALGOL 60’s parameter mechanism: it allows the programmer so much combinatorial freedom, that its confident use requires a strong discipline from the programmer. Besides expensive to implement it seems dangerous to use.
Finally, although the subject is not a pleasant one, I must mention PL/1, a programming language for which the defining documentation is of a frightening size and complexity. Using PL/1 must be like flying a plane with 7000 buttons, switches and handles to manipulate in the cockpit. I absolutely fail to see how we can keep our growing programs firmly within our intellectual grip when by its sheer baroqueness the programming language —our basic tool, mind you!— already escapes our intellectual control. And if I have to describe the influence PL/1 can have on its users, the closest metaphor that comes to my mind is that of a drug. I remember from a symposium on higher level programming languages a lecture given in defense of PL/1 by a man who described himself as one of its devoted users. But within a one-hour lecture in praise of PL/1, he managed to ask for the addition of about fifty new “features”, little supposing that the main source of his problems could very well be that it contained already far too many “features”. The speaker displayed all the depressing symptoms of addiction, reduced as he was to the state of mental stagnation in which he could only ask for more, more, more... When FORTRAN has been called an infantile disorder, full PL/1, with its growth characteristics of a dangerous tumor, could turn out to be a fatal disease.
So much for the past. But there is no point in making mistakes unless thereafter we are able to learn from them. As a matter of fact, I think that we have learned so much, that within a few years programming can be an activity vastly different from what it has been up till now, so different that we had better prepare ourselves for the shock. Let me sketch for you one of the possible futures. At first sight, this vision of programming in perhaps already the near future may strike you as utterly fantastic. Let me therefore also add the considerations that might lead one to the conclusion that this vision could be a very real possibility.
The vision is that, well before the seventies have run to completion, we shall be able to design and implement the kind of systems that are now straining our programming ability, at the expense of only a few percent in man-years of what they cost us now, and that besides that, these systems will be virtually free of bugs. These two improvements go hand in hand. In the latter respect software seems to be different from many other products, where as a rule a higher quality implies a higher price. Those who want really reliable software will discover that they must find means of avoiding the majority of bugs to start with, and as a result the programming process will become cheaper. If you want more effective programmers, you will discover that they should not waste their time debugging, they should not introduce the bugs to start with. In other words: both goals point to the same change.
Such a drastic change in such a short period of time would be a revolution, and to all persons that base their expectations for the future on smooth extrapolation of the recent past —appealing to some unwritten laws of social and cultural inertia— the chance that this drastic change will take place must seem negligible. But we all know that sometimes revolutions do take place! And what are the chances for this one?
There seem to be three major conditions that must be fulfilled. The world at large must recognize the need for the change; secondly the economic need for it must be sufficiently strong; and, thirdly, the change must be technically feasible. Let me discuss these three conditions in the above order.
With respect to the recognition of the need for greater reliability of software, I expect no disagreement anymore. Only a few years ago this was different: to talk about a software crisis was blasphemy. The turning point was the Conference on Software Engineering in Garmisch, October 1968, a conference that created a sensation as there occurred the first open admission of the software crisis. And by now it is generally recognized that the design of any large sophisticated system is going to be a very difficult job, and whenever one meets people responsible for such undertakings, one finds them very much concerned about the reliability issue, and rightly so. In short, our first condition seems to be satisfied.
Now for the economic need. Nowadays one often encounters the opinion that in the sixties programming has been an overpaid profession, and that in the coming years programmer salaries may be expected to go down. Usually this opinion is expressed in connection with the recession, but it could be a symptom of something different and quite healthy, viz. that perhaps the programmers of the past decade have not done so good a job as they should have done. Society is getting dissatisfied with the performance of programmers and of their products. But there is another factor of much greater weight. In the present situation it is quite usual that for a specific system, the price to be paid for the development of the software is of the same order of magnitude as the price of the hardware needed, and society more or less accepts that. But hardware manufacturers tell us that in the next decade hardware prices can be expected to drop with a factor of ten. If software development were to continue to be the same clumsy and expensive process as it is now, things would get completely out of balance. You cannot expect society to accept this, and therefore we must learn to program an order of magnitude more effectively. To put it in another way: as long as machines were the largest item on the budget, the programming profession could get away with its clumsy techniques, but that umbrella will fold rapidly. In short, also our second condition seems to be satisfied.
And now the third condition: is it technically feasible? I think it might and I shall give you six arguments in support of that opinion.
A study of program structure had revealed that programs —even alternative programs for the same task and with the same mathematical content— can differ tremendously in their intellectual manageability. A number of rules have been discovered, violation of which will either seriously impair or totally destroy the intellectual manageability of the program. These rules are of two kinds. Those of the first kind are easily imposed mechanically, viz. by a suitably chosen programming language. Examples are the exclusion of goto-statements and of procedures with more than one output parameter. For those of the second kind I at least —but that may be due to lack of competence on my side— see no way of imposing them mechanically, as it seems to need some sort of automatic theorem prover for which I have no existence proof. Therefore, for the time being and perhaps forever, the rules of the second kind present themselves as elements of discipline required from the programmer. Some of the rules I have in mind are so clear that they can be taught and that there never needs to be an argument as to whether a given program violates them or not. Examples are the requirements that no loop should be written down without providing a proof for termination nor without stating the relation whose invariance will not be destroyed by the execution of the repeatable statement.
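The rules "of the second kind" named above — never write down a loop without a termination proof and a stated invariant — can be made concrete with a small sketch. The example and its annotations are mine, not from the lecture:

```python
def integer_sqrt(n: int) -> int:
    """Largest r with r * r <= n, written with the discipline described above."""
    assert n >= 0
    r = 0
    # Invariant: r * r <= n (holds initially, since 0 * 0 <= n,
    # and is preserved because the guard checks (r + 1) before stepping).
    # Termination: n - r strictly decreases each iteration and is bounded
    # below, because (r + 1) * (r + 1) <= n implies r + 1 <= n.
    while (r + 1) * (r + 1) <= n:
        r += 1
    # On exit: r * r <= n < (r + 1) * (r + 1), i.e. r = floor(sqrt(n)).
    return r
```

The point is not this particular program but the obligations attached to it: the invariant can be checked at every cut point, and the variant n - r supplies exactly the termination proof the rule demands.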
I now suggest that we confine ourselves to the design and implementation of intellectually manageable programs. If someone fears that this restriction is so severe that we cannot live with it, I can reassure him: the class of intellectually manageable programs is still sufficiently rich to contain many very realistic programs for any problem capable of algorithmic solution. We must not forget that it is not our business to make programs, it is our business to design classes of computations that will display a desired behaviour. The suggestion of confining ourselves to intellectually manageable programs is the basis for the first two of my announced six arguments.
Argument one is that, as the programmer only needs to consider intellectually manageable programs, the alternatives he is choosing between are much, much easier to cope with.
Argument two is that, as soon as we have decided to restrict ourselves to the subset of the intellectually manageable programs, we have achieved, once and for all, a drastic reduction of the solution space to be considered. And this argument is distinct from argument one.
Argument three is based on the constructive approach to the problem of program correctness. Today a usual technique is to make a program and then to test it. But: program testing can be a very effective way to show the presence of bugs, but is hopelessly inadequate for showing their absence. The only effective way to raise the confidence level of a program significantly is to give a convincing proof of its correctness. But one should not first make the program and then prove its correctness, because then the requirement of providing the proof would only increase the poor programmer’s burden. On the contrary: the programmer should let correctness proof and program grow hand in hand. Argument three is essentially based on the following observation. If one first asks oneself what the structure of a convincing proof would be and, having found this, then constructs a program satisfying this proof’s requirements, then these correctness concerns turn out to be a very effective heuristic guidance. By definition this approach is only applicable when we restrict ourselves to intellectually manageable programs, but it provides us with effective means for finding a satisfactory one among these.
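Letting "correctness proof and program grow hand in hand" can be illustrated by deriving a tiny program from its desired postcondition; the illustration below is mine, not Dijkstra's:

```python
def maximum(a: list) -> int:
    """Postcondition: result == max(a), for non-empty a.
    The loop body is chosen to preserve the invariant, not the other way round."""
    assert len(a) > 0
    m, i = a[0], 1
    # Invariant: m == max(a[:i]) and 1 <= i <= len(a).
    while i < len(a):
        # To keep the invariant true when i grows by one, m must absorb a[i]:
        if a[i] > m:
            m = a[i]
        i += 1
    # On exit: i == len(a), so the invariant yields m == max(a).
    return m
```

Here the proof structure (invariant plus exit condition) came first, and each statement exists only to discharge an obligation of that proof — the heuristic guidance the argument describes.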
Argument four has to do with the way in which the amount of intellectual effort needed to design a program depends on the program length. It has been suggested that there is some kind of law of nature telling us that the amount of intellectual effort needed grows with the square of program length. But, thank goodness, no one has been able to prove this law. And this is because it need not be true. We all know that the only mental tool by means of which a very finite piece of reasoning can cover a myriad cases is called “abstraction”; as a result the effective exploitation of his powers of abstraction must be regarded as one of the most vital activities of a competent programmer. In this connection it might be worth-while to point out that the purpose of abstracting is not to be vague, but to create a new semantic level in which one can be absolutely precise. Of course I have tried to find a fundamental cause that would prevent our abstraction mechanisms from being sufficiently effective. But no matter how hard I tried, I did not find such a cause. As a result I tend to the assumption —up till now not disproved by experience— that by suitable application of our powers of abstraction, the intellectual effort needed to conceive or to understand a program need not grow more than proportional to program length. But a by-product of these investigations may be of much greater practical significance, and is, in fact, the basis of my fourth argument. The by-product was the identification of a number of patterns of abstraction that play a vital role in the whole process of composing programs. Enough is now known about these patterns of abstraction that you could devote a lecture to each of them.
What the familiarity and conscious knowledge of these patterns of abstraction imply dawned upon me when I realized that, had they been common knowledge fifteen years ago, the step from BNF to syntax-directed compilers, for instance, could have taken a few minutes instead of a few years. Therefore I present our recent knowledge of vital abstraction patterns as the fourth argument.
Now for the fifth argument. It has to do with the influence of the tool we are trying to use upon our own thinking habits. I observe a cultural tradition, which in all probability has its roots in the Renaissance, to ignore this influence, to regard the human mind as the supreme and autonomous master of its artefacts. But if I start to analyse the thinking habits of myself and of my fellow human beings, I come, whether I like it or not, to a completely different conclusion, viz. that the tools we are trying to use and the language or notation we are using to express or record our thoughts, are the major factors determining what we can think or express at all! The analysis of the influence that programming languages have on the thinking habits of its users, and the recognition that, by now, brainpower is by far our scarcest resource, they together give us a new collection of yardsticks for comparing the relative merits of various programming languages. The competent programmer is fully aware of the strictly limited size of his own skull; therefore he approaches the programming task in full humility, and among other things he avoids clever tricks like the plague. In the case of a well-known conversational programming language I have been told from various sides that as soon as a programming community is equipped with a terminal for it, a specific phenomenon occurs that even has a well-established name: it is called “the one-liners”. It takes one of two different forms: one programmer places a one-line program on the desk of another and either he proudly tells what it does and adds the question “Can you code this in less symbols?” —as if this were of any conceptual relevance!— or he just asks “Guess what it does!”. From this observation we must conclude that this language as a tool is an open invitation for clever tricks; and while exactly this may be the explanation for some of its appeal, viz. to those who like to show how clever they are, I am sorry, but I must regard this as one of the most damning things that can be said about a programming language. Another lesson we should have learned from the recent past is that the development of “richer” or “more powerful” programming languages was a mistake in the sense that these baroque monstrosities, these conglomerations of idiosyncrasies, are really unmanageable, both mechanically and mentally. I see a great future for very systematic and very modest programming languages. When I say “modest”, I mean that, for instance, not only ALGOL 60’s “for clause”, but even FORTRAN’s “DO loop” may find themselves thrown out as being too baroque. I have run a little programming experiment with really experienced volunteers, but something quite unintended and quite unexpected turned up. None of my volunteers found the obvious and most elegant solution. Upon closer analysis this turned out to have a common source: their notion of repetition was so tightly connected to the idea of an associated controlled variable to be stepped up, that they were mentally blocked from seeing the obvious. Their solutions were less efficient, needlessly hard to understand, and it took them a very long time to find them. It was a revealing, but also shocking experience for me. Finally, in one respect one hopes that tomorrow’s programming languages will differ greatly from what we are used to now: to a much greater extent than hitherto they should invite us to reflect in the structure of what we write down all abstractions needed to cope conceptually with the complexity of what we are designing. So much for the greater adequacy of our future tools, which was the basis of the fifth argument.
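Dijkstra does not name the problem he posed in that experiment, but the mental block he describes — repetition welded to a stepped control variable — is easy to exhibit. Euclid's algorithm by repeated subtraction (my example, not his) is a repetition with no control variable to step at all:

```python
def gcd(a: int, b: int) -> int:
    """Greatest common divisor by repeated subtraction."""
    assert a > 0 and b > 0
    # Invariant: gcd(a, b) equals the gcd of the original pair.
    # Termination: a + b strictly decreases and stays positive.
    while a != b:
        if a > b:
            a -= b
        else:
            b -= a
    return a
```

A programmer whose notion of repetition is inseparable from an index being stepped up will reach for counters here and produce something longer and harder to prove than the plain guarded repetition.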
As an aside I would like to insert a warning to those who identify the difficulty of the programming task with the struggle against the inadequacies of our current tools, because they might conclude that, once our tools will be much more adequate, programming will no longer be a problem. Programming will remain very difficult, because once we have freed ourselves from the circumstantial cumbersomeness, we will find ourselves free to tackle the problems that are now well beyond our programming capacity.
You can quarrel with my sixth argument, for it is not so easy to collect experimental evidence for its support, a fact that will not prevent me from believing in its validity. Up till now I have not mentioned the word “hierarchy”, but I think that it is fair to say that this is a key concept for all systems embodying a nicely factored solution. I could even go one step further and make an article of faith out of it, viz. that the only problems we can really solve in a satisfactory manner are those that finally admit a nicely factored solution. At first sight this view of human limitations may strike you as a rather depressing view of our predicament, but I don’t feel it that way, on the contrary! The best way to learn to live with our limitations is to know them. By the time that we are sufficiently modest to try factored solutions only, because the other efforts escape our intellectual grip, we shall do our utmost best to avoid all those interfaces impairing our ability to factor the system in a helpful way. And I cannot but expect that this will repeatedly lead to the discovery that an initially intractable problem can be factored after all. Anyone who has seen how the majority of the troubles of the compiling phase called “code generation” can be tracked down to funny properties of the order code, will know a simple example of the kind of things I have in mind. The wider applicability of nicely factored solutions is my sixth and last argument for the technical feasibility of the revolution that might take place in the current decade.
In principle I leave it to you to decide for yourself how much weight you are going to give to my considerations, knowing only too well that I can force no one else to share my beliefs. As with each serious revolution, it will provoke violent opposition and one can ask oneself where to expect the conservative forces trying to counteract such a development. I don’t expect them primarily in big business, not even in the computer business; I expect them rather in the educational institutions that provide today’s training and in those conservative groups of computer users that think their old programs so important that they don’t think it worth-while to rewrite and improve them. In this connection it is sad to observe that on many a university campus the choice of the central computing facility has too often been determined by the demands of a few established but expensive applications with a disregard of the question how many thousands of “small users” that are willing to write their own programs were going to suffer from this choice. Too often, for instance, high-energy physics seems to have blackmailed the scientific community with the price of its remaining experimental equipment. The easiest answer, of course, is a flat denial of the technical feasibility, but I am afraid that you need pretty strong arguments for that. No reassurance, alas, can be obtained from the remark that the intellectual ceiling of today’s average programmer will prevent the revolution from taking place: with others programming so much more effectively, he is liable to be edged out of the picture anyway.
There may also be political impediments. Even if we know how to educate tomorrow’s professional programmer, it is not certain that the society we are living in will allow us to do so. The first effect of teaching a methodology —rather than disseminating knowledge— is that of enhancing the capacities of the already capable, thus magnifying the difference in intelligence. In a society in which the educational system is used as an instrument for the establishment of a homogenized culture, in which the cream is prevented from rising to the top, the education of competent programmers could be politically unpalatable.
Let me conclude. Automatic computers have now been with us for a quarter of a century. They have had a great impact on our society in their capacity of tools, but in that capacity their influence will be but a ripple on the surface of our culture, compared with the much more profound influence they will have in their capacity of intellectual challenge without precedent in the cultural history of mankind. Hierarchical systems seem to have the property that something considered as an undivided entity on one level, is considered as a composite object on the next lower level of greater detail; as a result the natural grain of space or time that is applicable at each level decreases by an order of magnitude when we shift our attention from one level to the next lower one. We understand walls in terms of bricks, bricks in terms of crystals, crystals in terms of molecules etc. As a result the number of levels that can be distinguished meaningfully in a hierarchical system is kind of proportional to the logarithm of the ratio between the largest and the smallest grain, and therefore, unless this ratio is very large, we cannot expect many levels. In computer programming our basic building block has an associated time grain of less than a microsecond, but our program may take hours of computation time. I do not know of any other technology covering a ratio of 10^10 or more: the computer, by virtue of its fantastic speed, seems to be the first to provide us with an environment where highly hierarchical artefacts are both possible and necessary. This challenge, viz. the confrontation with the programming task, is so unique that this novel experience can teach us a lot about ourselves. It should deepen our understanding of the processes of design and creation, it should give us better control over the task of organizing our thoughts. If it did not do so, to my taste we should not deserve the computer at all!
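The 10^10 ratio is easy to verify: a basic operation of under a microsecond against a run of a few hours spans roughly ten orders of magnitude, and the number of meaningful hierarchical levels grows only with the logarithm of that ratio. A quick check (my sketch, assuming one level per order of magnitude):

```python
import math

grain = 1e-6         # time grain of the basic building block: ~1 microsecond
run = 3 * 3600       # a computation of a few hours, in seconds
ratio = run / grain  # about 1.1e10, i.e. the 10^10 cited in the text
# Levels are proportional to log(ratio); at one level per order of
# magnitude, only about ten levels fit between the two grains.
levels = int(math.log10(ratio))
```

So even in this extreme technology the hierarchy is only some ten levels deep, which is why the ratio must be enormous before many levels can be distinguished at all.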
It has already taught us a few lessons, and the one I have chosen to stress in this talk is the following. We shall do a much better programming job, provided that we approach the task with a full appreciation of its tremendous difficulty, provided that we stick to modest and elegant programming languages, provided that we respect the intrinsic limitations of the human mind and approach the task as Very Humble Programmers.
[Source: "Concepts of Programming Languages – Chapter 2 – Evolution of The Major Programming Languages", Stevanus Sugianto, 2013-03-12, https://stevanussugianto.wordpress.com/2013/03/12/concepts-of-programming-languages-chapter-2-evolution-of-the-major-programming-languages/]
Review Questions:
1. In what year was Plankalkul designed? In what years was that design published?
Plankalkul was designed in 1943 by Konrad Zuse as his proposal for his Ph.D. dissertation and described in a lengthy manuscript dated 1945. The design was published in 1972.
2. Mention an interesting feature of Zuse’s programs.
An interesting feature of Zuse's programs was the inclusion of mathematical expressions showing the current relationships between program variables. These expressions stated relationships that would be true during execution at the points in the code where they appeared. They are very similar to the assertions of Java and to those in axiomatic semantics.
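The idea of stating invariants alongside code can be shown with a modern assertion. This is a hedged Python sketch using illustrative names, not Zuse's notation:

```python
def integer_sqrt(n):
    """Return the largest r with r * r <= n."""
    r = 0
    while (r + 1) * (r + 1) <= n:
        r += 1
        # A relationship that holds at this point in the code,
        # in the spirit of Plankalkul's invariant expressions:
        assert r * r <= n
    assert r * r <= n < (r + 1) * (r + 1)
    return r

print(integer_sqrt(10))  # -> 3
```

The final assertion states exactly the relationship the loop was built to establish, just as Zuse's expressions documented what was true at each point.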
3. What does Plankalkul mean?
Plankalkul means program calculus. Zuse defined the language and wrote algorithms in it to solve a wide variety of problems.
4. Speedcoding was invented to overcome two significant shortcomings of the computer hardware of the early 1950s. What were they?
They were the lack of floating-point operations and the lack of automatic incrementing of address registers, which did not appear in hardware until the UNIVAC 1107 computers of 1962. The Speedcoding interpreter effectively converted the IBM 701 into a virtual three-address floating-point calculator.
5. What is the number of bits in a single word of the UNIVAC I’s memory? How are the bits grouped?
The number of bits in a single word of the UNIVAC I’s memory is 72 bits and the bits are grouped as 12 six-bit bytes.
6. What hardware capability that first appeared in the IBM 704 computer strongly affected the evolution of programming language? Explain why.
Indexing and floating-point instructions in hardware, because one of the primary reasons the slowness of interpretive systems was tolerated from the late 1940s to the mid-1950s was the lack of floating-point hardware in the available computers. All floating-point operations had to be simulated in software, a very time-consuming process. Because so much processor time was spent in software floating-point processing, the overhead of interpretation and the simulation of indexing were relatively insignificant. As long as floating-point had to be done by software, interpretation was an acceptable expense.
7. Who developed the Speedcoding system for the IBM 701?
The Speedcoding system was developed by John Backus for the IBM 701 and is an example of such a system (Backus, 1954).
8. Who developed Short Code? Why is Short Code called automatic programming?
Short Code was developed by John Mauchly. It is called automatic programming because it was not translated to machine code; rather, it was implemented with a pure interpreter. It clearly simplified the programming process, but at the expense of execution time.
9. Under what environmental considerations was Fortran developed? Which was the first version of Fortran?
Fortran was developed under the following environmental considerations:
– Computers had small memories and were slow and relatively unreliable
– The primary use of computers was for scientific computations
– There were no existing efficient and effective ways to program computers
– Because of the high cost of computers compared with the cost of programmers, speed of the generated object code was the primary goal of the first Fortran compilers
The first version of Fortran was Fortran 0.
10. What was the most significant feature added to Fortran I to get Fortran II?
The most significant feature added to Fortran I to get Fortran II was the independent compilation of subroutines. Without it, any change in a program required that the entire program be recompiled.
11. What control flow statements were added to Fortran IV to get Fortran 77?
Control flow statements added to Fortran IV to get Fortran 77 were logical loop control statements and an If with an optional Else clause.
12. Which versions of Fortran was the first to have any sort of dynamic variables?
Fortran 90 (ANSI, 1992) was the first version of Fortran to have any sort of dynamic variables.
13. Which version of Fortran was the first to have character string handling?
Fortran 77 was the first version of Fortran to have character string handling. It became standard in 1978 and retained most of the features of Fortran IV.
14. Why were linguists interested in artificial intelligence in the late 1950s?
Linguists were interested in artificial intelligence in the late 1950s because they were concerned with natural language processing.
15. What are the different data types and structures in Common LISP?
Common LISP has a large number of data types and structures, including records, arrays, complex numbers, and character strings. It also has a form of packages for modularizing collections of functions and data, providing access control.
23. In what year did the COBOL design process begin?
The COBOL design process began in 1959 at a meeting on the subject of a common language for business applications, sponsored by the Department of Defense at the Pentagon.
27. Why was BASIC an important language in the early 1980s?
In the early 1980s BASIC was an important language because it was easy for beginners to learn, especially those who were not science oriented, and its smaller dialects could be implemented on computers with very small memories.
Problem Set:
1. What features of Fortran IV do you think would have had the greatest influence on Java if the Java designers had been familiar with Fortran?
I think they would be the capability of passing subprograms as parameters to other subprograms, explicit type declarations for variables, and the logical If construct.
3. Write a short history of the Fortran 0, Fortran I, Fortran II, Fortran IV systems.
Fortran 0 was the first version of Fortran. Its design stated that it would provide the efficiency of hand-coded programs and the ease of programming of the interpretive pseudocode systems. Fortran was developed under the following environmental considerations: computers had small memories and were slow and relatively unreliable; the primary use of computers was for scientific computations; there were no existing efficient and effective ways to program computers; and because of the high cost of computers compared with the cost of programmers, speed of the generated object code was the primary goal of the first Fortran compilers.
Fortran I included input/output formatting, variable names of up to six characters, user-defined subroutines, the If selection statement, and the Do loop statement. There was no data typing; variables whose names began with I, J, K, L, M, or N were implicitly integer type, and all others were implicitly floating-point.
Fortran II fixed many of the bugs in the Fortran I compilation system and added some significant features to the language, most important being the independent compilation of subroutines. Without independent compilation, any change in a program required that the entire program be recompiled. The capability of including precompiled machine language versions of subprograms shortened the compilation process considerably and made it practical to develop much larger programs.
Fortran IV became one of the most widely used programming languages of its time. It was an improvement over Fortran II in many ways. It added explicit type declaration for variable, a logical If construct, and the capability of passing subprograms as parameters to other subprograms.
9. Why, in your opinion, did Fortran treat names that began with I, J, K, L, M, and N as implicitly integer type?
In my opinion, Fortran allowed it because of programmers' habit of using those characters for integer variable names, and because it spared them from declaring such variables explicitly whenever one was needed.
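The convention is mechanical enough to state as code. Here is a hypothetical Python helper (not part of any Fortran toolchain) that applies the implicit-typing rule:

```python
def implicit_fortran_type(name):
    """Implicit Fortran I type for a variable name: names beginning
    with I..N are integer, all others are floating-point (real)."""
    return "INTEGER" if name[0].upper() in "IJKLMN" else "REAL"

print(implicit_fortran_type("index"))  # -> INTEGER
print(implicit_fortran_type("x"))      # -> REAL
```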
10. Outline the major developments in ALGOL 60. Major developments of ALGOL 60:
– The concept of block structure was introduced. This allowed the programmer to localize parts of programs by introducing new data environments, or scopes.
– Two different means of passing parameters to subprograms were allowed: pass by value and pass by name.
– Procedures were allowed to be recursive. The ALGOL 58 description was unclear on this issue. Note that although this recursion was new for the imperative languages, LISP had already provided recursive functions in 1959.
– Stack-dynamic arrays were allowed. A stack-dynamic array is one for which the subscript range or ranges are specified by variables, so that the size of the array is set at the time storage is allocated to the array, which happens when the declaration is reached during execution.
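Pass by name can be emulated in a modern language by passing thunks that are re-evaluated on every use. A hedged Python sketch (the helper and its names are illustrative, not ALGOL syntax):

```python
def sum_by_name(env, lo, hi, expr):
    """Approximates ALGOL 60 call by name: `expr` is a zero-argument
    function re-evaluated at each use, so it sees the current i."""
    total = 0
    for v in range(lo, hi + 1):
        env["i"] = v        # update the name the thunk refers to
        total += expr()     # re-evaluation, as in Jensen's device
    return total

env = {"i": 0}
# Sum of i*i for i = 1..4, computed through the by-name parameter:
print(sum_by_name(env, 1, 4, lambda: env["i"] * env["i"]))  # -> 30
```

Because the expression is re-evaluated at every use, the same call computes a different sum if a different expression is passed, which is exactly the behavior pass by value does not give.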
15. Are there any nonprocedural languages other than Prolog?
Yes, there are; SQL is a nonprocedural language.
16. What is your opinion of the argument that languages that are too complex are too dangerous to use, and we should therefore keep all languages small and simple?
My opinion is that such languages are not inherently dangerous. The danger arises when an amateur tries to create a complex program with little understanding of the concepts of a complex language. If a small and simple language can provide everything we need to create a program, why not keep it that way? After all, simplicity is one of the characteristics a programming language should have.
24. Why, in your opinion, do new scripting languages appear more frequently than new compiled languages?
In my opinion, it is because a scripting language is easier for a beginner to learn and implement than a compiled language, and because scripting languages' easy built-in interfaces give the fastest turnaround from script to execution.
25. Give a brief general description of the Java servlet.
Java servlet is Java platform technology of choice for extending and enhancing Web servers. Servlets provide a component-based, platform-independent method for building Web-based applications, without the performance limitations of CGI programs. And unlike proprietary server extension mechanisms (such as the Netscape Server API or Apache modules), servlets are server- and platform-independent.
[Source: "Programming Language Design Issues", https://joequery.me/notes/programming-language-design-issues/]
(This post is part of the programming languages 4th edition series.)
These notes are based on Programming Languages: Design and Implementation.
Section 1.1: Why study programming languages?
1. To improve your ability to develop effective algorithms
Many languages provide features that can be extremely useful when used properly but waste a large amount of time when used improperly. Algorithms behave similarly, so the habit of making correct decisions with a programming language can carry over to making correct decisions with an algorithm.
2. To improve your use of your existing programming language
By understanding exactly how the features in a programming language are implemented, you learn how to effectively use them. When you know how the language components such as lists/arrays/strings work, you can use them more efficiently.
3. To increase your vocab of useful programming constructs
When thinking about a set of data/program structures to solve a problem, it's typical to only think about the structures that are immediately available within the programming language you're working with. By knowing the constructs of other programming languages, you can find one that fits your scenario better and possibly implement it yourself.
4. To allow a better choice of programming language
Some languages are better suited than others for particular projects. You can reduce effort by picking the one that works best.
5. To make it easier to learn a new language
A thorough knowledge of programming language constructs and implementation techniques allows programmers to learn new languages more easily.
6. To make it easier to design a new language
Though you may not be making the next C or Java programming language, it's actually fairly common to create a form of programming language on a small scale within a project.
Studying features and their implementations
There's a lot more to studying programming languages than just comparing their features. Some languages may share the same features, but those features could be implemented in entirely different ways, with some implementations possibly more efficient than others
Section 1.2: A short history of programming languages
1950s: FORTRAN, LISP
1970s: Ada, C, Pascal, Prolog, Smalltalk
1980s: C++, ML, Perl, Postscript
1990s: Java
A survey conducted by the US DoD in the 1970s found that over 500 languages were being used on defense projects.
Section 1.2.1: Development of Early Languages
We'll summarize language development from the mid 1950s to the early 1970s. Later developments will be covered throughout the series.
Numerically based languages
In the 1940s, the main role for computers during WWII was solving differential equations to determine ballistics trajectories.
In the 1950s, symbolic notations started to appear:
Grace Hopper developed the A-0 language
John Backus developed Speedcoding
Both A-0 and Speedcoding were designed to compile simple arithmetic expressions into executable machine language.
FORTRAN
In 1957, Backus managed a team to develop FORTRAN, meaning FORmula TRANSlator. While FORTRAN was oriented around numerical calculations, its goal was a full-fledged programming language including:
control structures
conditionals
input/output statements
Many doubted FORTRAN would compete with hand-coded assembly, so much effort was put into making FORTRAN execution extremely efficient. Some of this efficiency came from making statements specifically for the IBM 704. FORTRAN was extremely successful and dominated in the 1970s.
In 1958, FORTRAN was revised as FORTRAN II, and FORTRAN IV came a few years later. Soon, many manufacturers had implemented a version of the language, as there was no standard. In 1966, FORTRAN IV became the standard, and went by the name FORTRAN 66. FORTRAN 66 has been updated to FORTRAN 77 and then FORTRAN 90.
ALGOL
The success of FORTRAN caused fear of IBM's industry dominance. GAMM (the German Society of Applied Mathematics) organized a committee to design a universal language. The US's ACM (Association for Computing Machinery) organized a similar committee, and the two committees eventually merged. Under the leadership of Peter Naur, the International Algorithmic Language (IAL) was developed. The name was eventually changed to ALGOL, and the language became known as ALGOL 58. The ALGOL 60 revision became the standard academic computing language from the 1960s to the early 1970s.
FORTRAN was designed for efficient execution on the IBM 704 machine, but ALGOL had different goals.
ALGOL notation should be close to standard math notation
ALGOL should be useful for the description of algorithms
Programs in ALGOL should be compilable into machine language
ALGOL should not be bound to a single computer architecture
These goals had large implications. They meant that, in the context of the time period, input/output was not included in the language, nor were special procedures. As a result, implementations of ALGOL were incompatible. Subprograms were viewed as macro substitution, a concept named call by name.
Backus was editor of the ALGOL report defining the language, and he used a notation similar to that developed by Chomsky for describing context free languages. This introduced formal grammar theory to programming languages. Peter Naur's and Backus' role in the development of ALGOL caused the notation used to represent the grammar of a language to be called Backus Naur Form (BNF).
Business Languages
After computers had established their dominance in numerical calculation, business data processing came soon after. Grace Hopper developed FLOWMATIC in 1955, with the goal of developing business applications with a form of English-like text. In 1959, the US DoD sponsored a meeting to develop the Common Business Language (CBL), a language whose goal was to use English as much as possible for its notation. The specifications for CBL designed in 1960 became the design for COBOL (COmmon Business Oriented Language). COBOL was standardized in 1968.
Artificial-intelligence languages
The Information Processing Language (IPL) was developed by the Rand Corporation. IPL-V was widely known but had limited use. John McCarthy of MIT designed LISt Processing (LISP) for the IBM 704. LISP 1.5 became the standard LISP implementation for many years.
Systems languages
In this early era, efficiency was required, so assembly language was mostly used in the systems area. CPL and BCPL were designed but never gained traction. C changed the systems area, but that did not come until the 1970s.
Section 1.2.2: Evolution of Software Architectures
The hardware that supports a language has a large impact language design.
define operating/target environment:
The external environment supporting the execution of a program
define host environment:
The environment in which a program is designed, coded, tested, and debugged
The Mainframe Era
The large mainframe dominated between the 1940s and 1970s. A single, expensive computer filled a room and required many technicians.
Batch environments
Only external files of data. A program would take a set of data files as input and produce a set of output files after processing. FORTRAN, COBOL and Pascal were initially designed for batch-processing.
Interactive environments
Interactive programming appeared in the early 1970s near the end of the mainframe era. CRT terminals were connected to the computer. Computers were able to divide computation time amongst all concurrent users, a process known as time sharing.
Effects on language design
Languages were designed for batch processing bearing in mind that files were the basis for I/O. Other properties of languages designed for this era:
A terminated program must be completely rerun, but termination is acceptable
No external help from the user is possible
No way to monitor or change the speed of execution
File I/O and interactive I/O end up being very different. Languages designed for batch-processing can have a hard time adapting to interactive environments. In interactive environments, termination is usually not acceptable.
Personal Computer Era
Time sharing in mainframes did not last very long before personal computers came about.
Personal Computers
In the 1970s, computers became progressively smaller, cheaper, and faster. In 1978, Apple released the Apple II, the first true commercial PC. It was a small desktop that ran BASIC, and had a major impact on the educational market.
In 1981, the PC was released by IBM. In January 1984, Apple announced the Macintosh computer during the US Super Bowl. It contained a windows-based GUI with a mouse. Although the technology had previously been developed by Xerox, the Macintosh was its first commercial application. Microsoft quickly mimicked the Macintosh with the Windows OS.
Embedded-system environments
define embedded computer:
A computer system that controls a larger system such as a plant, aircraft, machine, etc.
Embedded system failure can be catastrophic
Reliability and correctness are essential
Ada, C, C++ used
Effects on language design
Performance is less of a concern in many domains
UI interfaces and low prices led to single-user computers. No time sharing
Users are used to windows-based UI, which can be hard to program. OO makes it easy via libraries and packages.
In regards to embedded systems,
Programs must interact directly with device for I/O
Access to devices provided via registers, memory locations, subprograms
All errors must be handled. Termination is not acceptable
Embedded system must be able to work in real-time
Often a distributed system
Networking Era
Distributed computing
Faster, cheaper, smaller computers in 1980s caused more to be used in business
Uses included
payroll
report writing
order processing
Local Area Networks (LANs) developed to facilitate communication between computers.
LANs used a client-server model
Airline reservation systems are a prime example of a single server having multiple clients.
Internet
Mid 1990s, distributed LAN known as the Internet emerged
1970s, Defense Advanced Research Projects Agency (DARPA) wanted to link mainframes via network to aid military information transfer.
The network, ARPANET, evolved into the research-oriented Internet
Early internet use required certain protocols to retrieve information.
telnet made user appear as part of a server
FTP allowed client to send or receive files
In both cases, the user had to already know what machine had the desired info
Simple Mail Transfer Protocol (SMTP)
Basis for today's email
Machines have unique names, users have unique logins
Could send mail using SMTP client just knowing user and machine name.
A goal in the late 1980s was to make information retrieval as easy as sending email.
In 1989 at CERN, Berners-Lee developed the HTML hyperlink
Mosaic web browser in 1993 + HTTP led to the internet as we know it
Effects on language design
Users access servers via local client machines for local processing
The language must allow interaction between server and client
Initial web pages were static. E-commerce needed two-way information flow
Java and Perl provide such features
Security became very important
Performance in server-client communication became important
Servers could become overloaded with client requests
A way of coping is to make client process information
Required the sending of small executables
Server doesn't know what client computer is like, so the executables must work regardless of client computer type
Java aimed to solve these problems
Section 1.2.3: Application Domains
The best language to be used often depends on the application domain.
Applications of the 1960s
Business processing
Large data processing applications running on big iron mainframes
order-entry
inventory
payroll
Read in data on tape drives, write out new data
COBOL developed for these applications
COBOL devs needed months to compose a what-if application
Scientific
Characterized by solution of math equations
numerical analysis
differential equations
integral functions
generating statistics
Computer first developed in the Scientific domain for WWII ballistics tables
FORTRAN dominated
Syntax is close to math
Scientists find it easy to use
System
For building OSes and implementing compilers, no effective language existed
Assembly language was often the choice
JOVIAL, an ALGOL variant, was used on some U.S. DoD projects
Forth was developed to address power station / assembly line control
Assembly language was still the most often used in these domains
Artificial Intelligence
AI relatively new research area
LISP dominated
Programs search through large data spaces
Example: Playing chess, searching for the best move
Applications of the 21st century
The programming language situation today is probably more complex than it was during the 60s.
Business processing
COBOL dominated in the late 90s, though C(++) were sometimes used
Spreadsheet applications handle the what-if scenario COBOL struggled with
Fourth-generation languages (4GLs) have some market share
languages designed for specific business application domains
typically provide a window-based programmer interface
provide easy access to database records
Easy to generate input forms / output reports
E-commerce has large influences on business programming
The need for interaction between user and vendor gave rise to new languages
Java, Perl, JavaScript used
Scientific
FORTRAN, C++, Java
System
C and C++ dominate system programming
efficient execution
full access to the OS and hardware
Modula and modern BASIC variatons
Ada (though not a major shareholder)
For real-time applications, C, C++, and Ada are used.
Examples of real-time applications include
cars
microwave ovens
video games
digital watches
Artificial Intelligence
LISP is still used, but modern versions are used in place of LISP 1.5
Scheme
Common Lisp
Both are very good at searching applications
Publishing
Relatively new application
Own syntax for input commands and output files
The TEX translator produces Postscript programs
Process
Processes are programs that control another
back up files every midnight
sync time once an hour
send automatic email reply on vacation
automatically test a program when it compiles successfully
Unix Shell / Shell scripts
Other scripting languages such as TCL and Perl are used for similar purposes
New paradigms
ML used to investigate type theory
Smalltalk has a large influence on language design
Not used very much commercially, though
OO features in C++ and Ada had origins in Smalltalk
Section 1.3: Role of Programming Languages
By the mid-1960s, language priority became ease of use over efficiency
ALGOL is no longer used
Newer languages like Java and C++ have been greatly influenced by the languages before them. These influences include:
Computer capabilities
Speed of modern computers
OS software between programming language and hardware
Applications
Applications are no longer only for military, science, business
PCs, Internet, and gaming have new requirements languages must fill
Programming methods
Language designs have changed to reflect good programming practices
Implementation methods
Better implementation methods affected feature choices in languages
Theoretical studies
Formal mathematical methods have deepened our understanding of languages
Strengths and weaknesses of language features have been deeply evaluated
Standardization
The need for programs to move from computer to computer led to standards
These standards shape how the language is designed
Section 1.3.1: What makes a good language?
All languages have shortcomings, but some are still much better than others. External forces can influence language popularity, but programmers ultimately decide what languages live and die.
Attributes of a good language
There are a lot of reasons why one programming language could be viewed as preferable to another.
Clarity, simplicity, unity
The language must provide a framework for thinking about algorithms
The language must provide a way to express ideas of algorithm implementations
Its concepts should be simple so they can be used in developing algorithms
The language should have a minimum number of concepts (conceptual integrity)
Language syntax determines ease of writing, reading, testing, modifying code
Cryptic syntax may be easy to write but is usually hard to read
Constructs that mean different things should look different.
Orthogonality
define orthogonality:
the attribute of being able to combine various features with all combinations of features being meaningful.
Ex: Language can evaluate expression to get boolean T/F. Language also has conditionals based on T/F. If any expression can be used/evaluated within a conditional statement, these features are orthogonal.
Orthogonal features makes a language easier to learn
A negative aspect is that it may hide errors
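The expression/conditional example can be sketched in Python, where the two features are orthogonal: any boolean-valued expression may appear wherever a condition is expected (the values here are illustrative):

```python
values = [3, 7, 12]

if len(values) > 2:                    # comparison expression as condition
    print("long list")

while values and values[-1] > 10:      # compound boolean expression
    values.pop()

flags = [v % 2 == 0 for v in values]   # same expressions reused elsewhere
print(flags)  # -> [False, False]
```

No special "condition" form is needed; expressions and conditionals combine freely, which is exactly what orthogonality means.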
Naturalness for the application
Language syntax should allow the program structure to reflect the algorithm
Examples of algorithms differing in structure are:
sequential
concurrent
logic
The language should provide the following tools for solving a problem:
appropriate data structures
appropriate operations
appropriate control structures
a natural syntax for the problem
Support for abstraction
There is always a gap between abstract data structures and primitive data structures.
The language should allow data structures, data types, and operations to be defined and maintained as self-contained abstractions.
Ada developed for lack of abstraction in Pascal, C++ for lack of abstraction in C
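A minimal sketch of such a self-contained abstraction in Python (class and method names are illustrative): the primitive list is hidden behind the operations that define the abstract stack.

```python
class Stack:
    """An abstract stack: callers use push/pop/is_empty and never
    touch the primitive list representation underneath."""
    def __init__(self):
        self._items = []        # hidden primitive structure

    def push(self, x):
        self._items.append(x)

    def pop(self):
        return self._items.pop()

    def is_empty(self):
        return not self._items

s = Stack()
s.push(1)
s.push(2)
print(s.pop())  # -> 2
```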
Ease of program verification
Reliability of programs is a huge concern
Methods of testing program correctness:
formal verification
desk checking
testing via input data
A combination of the above methods are used for large projects
Hard program verification may outweigh language features
Simple syntax/semantics makes program verification simpler
Programming environment
A weak language with a good environment can be easier than a strong language
Factors that influence the programming environment include:
reliable, efficient, well-documented implementation
Special editors
Special testing packages
Version control
Smalltalk designed around a programming environment
Portability of programs
Transportability of programs from the computer they are developed on to other computers
A language should be widely available
A language's definition is independent of a particular machine
Ada, FORTRAN, and C have standardized definitions for portable application development
Cost of use
Cost of program execution
Main focus of early programming years
Speed not a high concern anymore
10% or 20% execution time tolerated for other benefits
Cost of program translation
Student programs are compiled frequently but executed few times
A fast compiler is important for programming education
Cost of program creation, testing, use
Smalltalk and Perl can help solve problems with minimum investment in programmer time and energy.
Programming time minimized, but execution time may be larger in comparison to other languages.
Cost of program maintenance
Studies show largest cost is over the life of the program, not initial
Maintenance includes repair of errors and enhancements
A language that makes it easy to adjust programs may be cheaper in the long run
Syntax and Semantics
define syntax:
What the program looks like. How statements, declarations, and other language constructs are written.
define semantics:
The meaning given to the various syntactic constructs.
Section 1.3.2: Language Paradigms
There are four basic computational models that describe most programming.
Imperative languages
Also known as procedural
Command-driven or statement-oriented
A program consists of a sequence of statements, execution causes machine to enter a new state.
Syntax of imperative languages is of the form
statement1; statement2; ...
Program development is building successive states to arrive at solution
Usually the first model one learns, and also the most widely used
C
C++
FORTRAN
ALGOL
PL/I
Pascal
Ada
Smalltalk
COBOL
Model follows from the hardware of a computer that executes sequentially
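A minimal Python fragment (Python standing in here for the imperative languages listed) makes the statement-by-statement state changes visible:

```python
# Each statement moves the machine into a new state; the final
# state holds the answer.
total = 0      # state: total defined
n = 5          # state: n defined
while n > 0:   # repeated state updates
    total += n
    n -= 1
print(total)   # -> 15
```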
Applicative languages
Also known as functional languages
Look at the function the program represents
Look at the desired result rather than available data
Program develops by creating functions from previous functions that manipulate the initial data set until the solution is achieved.
Once the functions are created, we apply the initial data
The syntax of function languages is of the form
function_n(...function_2(function_1(data))...)
LISP and ML are functional languages
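The function_n(...function_2(function_1(data))...) shape can be imitated even in C by nesting calls instead of updating state (illustrative function names, not from the text):

```c
/* Two small functions composed applicatively: the result comes
   from nested application, not from updating variables. */
int increment(int x) { return x + 1; }
int square(int x)    { return x * x; }

/* square(increment(data)) applies increment first, then square. */
int compose_example(int data) { return square(increment(data)); }
```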
Rule-based languages
Also known as logical programming languages
Check for condition, executes an appropriate action
Most common rule-based language is Prolog
Set of filters to apply to data storage
Similar to imperative but statements are not sequential
The syntax of rule-based languages is of the form
enabling condition_1 => action_1
enabling condition_2 => action_2
...
enabling condition_n => action_n
A common business application of rule-based languages is decision tables
Programming often consists of building a matrix/table of conditions and the appropriate actions
BNF parsing techniques and YACC are rule-based techniques
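A decision table of enabling conditions and actions can be sketched in C with parallel arrays of predicate and action functions (a hypothetical sketch; rules fire in order, unlike Prolog's search):

```c
#include <stddef.h>

typedef int (*condition_fn)(int);
typedef int (*action_fn)(int);

static int is_negative(int x) { return x < 0; }
static int is_zero(int x)     { return x == 0; }
static int always(int x)      { (void)x; return 1; }
static int negate(int x)      { return -x; }
static int one(int x)         { (void)x; return 1; }
static int identity(int x)    { return x; }

/* enabling condition_i => action_i, scanned top to bottom;
   the first rule whose condition holds fires. */
int apply_rules(int x) {
    condition_fn conds[] = { is_negative, is_zero, always };
    action_fn    acts[]  = { negate,      one,     identity };
    for (size_t i = 0; i < sizeof conds / sizeof conds[0]; i++)
        if (conds[i](x))
            return acts[i](x);
    return x;  /* unreachable: the last condition always holds */
}
```

Scanning the rules in order approximates the rule-based model; a real rule engine would also handle backtracking and multiple matches.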
Object-oriented programming
Complex data objects are built, and functions are designed to operate on the data
Complex objects are extensions of simpler objects, inheriting properties
A combination of the applicative and imperative models
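The object-oriented idea of building complex objects from simpler ones can be hinted at even in C by embedding one struct in another (a minimal sketch; the names are illustrative):

```c
/* Data plus the functions that operate on it; Rect "extends"
   Point by embedding it, reusing the simpler object's fields. */
typedef struct { double x, y; } Point;
typedef struct { Point origin; double width, height; } Rect;

double rect_area(const Rect *r)     { return r->width * r->height; }
double rect_origin_x(const Rect *r) { return r->origin.x; }
```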
Generality of the Computational Model
How one uses a language depends on the programmer
You can technically write "sequential LISP" or "functional C"
Applicative techniques provide effective ways to prove correctness
Section 1.3.3: Language Standardization
One's concept of a programming language is often tied to a particular implementation of it
Ex: Can you move a C program to another computer using a different compiler?
If you extend features of a language, is it still that language?
To solve the above issues, many languages have standard definitions. All implementations should adhere to the standard. There are two types of standards:
Proprietary standards: definitions by the company that developed and owns the language. This doesn't work for widely used languages.
Consensus standards: Documents produced by organizations on an agreement. Currently the major method of ensuring uniformity.
Almost every country has one or more organizations in charge of developing standards.
In US: American National Standards Institute (ANSI)
Programming language standards: Committee X3 of the Computer and Business Equipment Manufacturers Association
Institute of Electrical and Electronic Engineers (IEEE)
In UK, British Standards Institute (BSI)
Internationally, the International Organization for Standardization (ISO)
Standards making is a consensus process. Not everyone gets their way
Creating standards is partially technical, partially political
To use standards effectively we need to consider timeliness, conformance, and obsolescence
Timeliness
When to standardize is important
FORTRAN standard came after there were many incompatible versions
Ada standard in 1983 came before any implementation
No one knew if the language would even work
First compilers did not even appear until 1987
Goal should be early enough to avoid incompatible implementations, but not so early that no one has experience with the language.
Conformance
A program is conformant if it only uses features defined in the standard
A conformant compiler produces correct output given a conformant program
The standard says nothing about computation results of non-conformant programs
Many compilers have features not addressed by the standard
Obsolescence
Standards should be reviewed every so often
Most standards require backwards compatibility
Languages may get messy with so many obsolete constructs
The notions of deprecated and obsolescent features have developed recently
obsolescent features may be dropped in the next version
deprecated features may be obsolescent in the next version
This cycle provides a 10-20 year warning that new programs can't use the old features.
Section 1.3.4: Internationalization
Programming increasingly a global activity
Languages must be usable in multiple countries
Local conventions on the storage and processing of data leads to some issues
Collating sequences
In what collating sequence should the characters be ordered?
Sorting - The position of non-Roman characters is not uniformly defined
Case - Some languages don't have uppercase/lowercase
Japanese
Arabic
Hebrew
Thai
Scanning direction - Most languages read left to right, but some right to left
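C's strcmp illustrates why collating is tricky: it orders strings by raw character codes, so in ASCII every uppercase letter precedes every lowercase one (strcoll with a locale is the usual remedy; this sketch only shows the raw behavior):

```c
#include <string.h>

/* strcmp compares raw byte values: in ASCII 'Z' (90) precedes
   'a' (97), so "Zebra" sorts before "apple" even though a
   dictionary would order them the other way around. */
int precedes_in_raw_order(const char *a, const char *b) {
    return strcmp(a, b) < 0;
}
```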
Country specific date/time formats
11/26/02 in the US is 26/11/02 in England, 26.11.02 in France
5:40pm in the US is 17:40 in Japan, 17h40 in France
Time zones
Some time zones are not a whole number apart.
Time changes do not occur uniformly around the world.
Translating local time to worldwide standard time is not easy.
Ideographic systems
Some languages do not use characters forming an alphabet
Japanese
Chinese
Korean
16 bits might be needed to represent these languages
Currency
Representation of currency varies by country.
Section 1.4: Programming Environments
Programming environment is where programs are created, tested.
Usually consists of support tools and command language for invoking them
Typical tools include:
editors
debuggers
verifiers
pretty printers
test data generators
Section 1.4.1: Effects on Language Design
Programming environments have had two large effects on language design:
Features aiding separate compilation and assembly of programs from components
Features aiding program testing and debugging
Separate compilation
For large programs, different programmers will be working on separate parts
This requires a language that can compile the parts and merge together later
Separate compilation can be difficult because subprograms may reference one another
There are ways to provide information to subprograms during separate compilation:
Information may need to be redeclared (FORTRAN)
An order of compilation may be required (Ada)
A library containing relevant specifications may be required (Java/C++)
Option 1 above uses independent compilation. The subprogram is entirely self-contained. The disadvantage is the inability to check for inconsistency of data between the external declaration and the internal redeclaration. Errors will surface when the subprograms are merged, even though each one may have compiled with 0 errors.
Options 2 and 3 require the use of libraries. The body is usually omitted during the compilation of subprograms.
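Option 3 can be sketched in C; in practice the pieces live in separate files, marked here only with comments (the file names and functions are illustrative):

```c
/* --- counter.h: shared specification, #included by each module --- */
int next_id(void);            /* declaration only; body compiled separately */

/* --- counter.c: one module supplies the body --- */
static int current = 0;       /* state hidden inside the module */
int next_id(void) { return ++current; }

/* --- main.c: another module relies only on the header's declaration --- */
int take_two(void) { return next_id() + next_id(); }
```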
Separate compilation has the side effect of enabling name-collision issues
Several subprograms or portions of programs may have the same name
This may not be determined until attempting to merge all subprograms
There are three main ways languages avoid these name collisions
Use of naming conventions (obligation is the programmer's)
Use of scoping rules (Used by Pascal, C, Ada)
Add name definitions from an external library (inheritance in object-oriented languages)
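Scoping rules (the second approach) are easy to see in C, where an inner declaration shadows an outer one so reused names never collide (a generic illustration):

```c
/* The inner declaration of x shadows the outer one: scope rules
   bind each use of the name to the nearest enclosing declaration,
   so the two x's never collide. */
int shadow_demo(void) {
    int x = 1;          /* outer x */
    {
        int x = 10;     /* inner x hides the outer one */
        x = x + 5;      /* changes only the inner x    */
    }
    return x;           /* outer x, still 1            */
}
```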
Testing and debugging
A few typical examples of testing and debugging features:
Execution trace
Prolog, LISP, many others have execution tracing tools.
Allows for statements and variables to be tagged for tracing
When a tagged statement is called, program stops, debug trace is printed.
Breakpoints
Specified by programmer
When breakpoint is reached, execution is interrupted, control given to user
User can inspect and modify variables and restart the program
Assertions
A special conditional expression
If assertion fails, program is interrupted.
Exception handler may print or take other actions upon failed assertion
After debugging, assertions may be disabled.
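C's <assert.h> matches this description: a failed assertion interrupts the program, and compiling with -DNDEBUG disables every assertion once debugging is done (a generic example):

```c
#include <assert.h>

/* The assertion documents and enforces the precondition; if it
   fails the program is interrupted. Compiling with -DNDEBUG
   removes the check once debugging is finished. */
int checked_divide(int a, int b) {
    assert(b != 0);
    return a / b;
}
```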
Section 1.4.2: Environment Frameworks
Support environment: Uses infrastructure services called environment framework
Environment framework: supplies data repository, GUI, security, communication
Ex: An environment framework would contain the following
A window manager such as Motif
VB and Visual Studio provide libraries for building windows
Section 1.4.3 Job Control and Process Languages
Programmers used to wait for each compilation step to finish before fixing/moving on
In the 1960s, programs began using return codes
This allowed for OS to process sequence of preloaded steps.
UNIX extended concept of job control
Control language was more complex, not just job checking
Programs could link together operations from other programs
Programs could use results of previous operations and also file contents
This led to UNIX shells such as the Bash, C, and Korn shells
The shell concept led to process/scripting languages
Usually interpreted
View programs and files as primitive data
AWK, Perl, TCL
Scripting languages important for communicating with web server
Internet time: the software development process needs to match the speed of the Internet
Scripting languages allow for rapid prototyping of applications
Section 1.5: C Overview
History of C
Developed in 1972 by Dennis Ritchie, building on Ken Thompson's B
AT&T Bell Telephone Laboratories
Related to ALGOL/Pascal in style
General purpose language, but popular as systems programming language
In the late 1960s, Bell Labs dropped out of the MIT and GE project to develop the Multics OS. Ken Thompson began UNIX.
Multics programmed in PL/I. Thompson didn't want to use PL/I for UNIX.
Developed B, a minimal subset of BCPL.
Types, structures, and more were added to B, and B became C.
C is general-purpose, but closely associated with systems programming.
First used to write UNIX kernel
Tied to UNIX implementations ever since.
In 1970s, C mostly a university curiosity
In 1980s, commercial UNIX OS's began to appear
Commercial UNIX versions led to C popularity
1989 ANSI C standard
Many new languages have syntax and semantics partially based on C
Brief overview of C
The C Language:
Relatively small
Limited number of features
The C Preprocessor
Not part of the C language!
The C interface assumptions
A set of conventions has grown
interface definitions assumed to be defined in header file
The C library
Many important functions were written in C, and are not part of the language
ANSI C lists these functions as required library functions for compilers
A C module consists of global declarations and a sequence of functions.
Multiple modules loaded together to form executable
Each function can invoke a function and access local or global data
Makes data storage simple
Local data is dynamic, allows for recursion
Each function has access to global data
C has pointers
"Equivalence" between arrays and pointers
Equivalence permits programs to use appropriate access method
Strings are arrays of characters
Large set of arithmetic operators
Flexible type definition
Strongly AND weakly typed?!
Most data items are subtypes of integers
Some type errors may be missed for this reason.
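The array/pointer equivalence and strings-as-character-arrays points can be shown directly in C (generic examples, not from the text):

```c
/* a[i] and *(a + i) denote the same element: an array name in an
   expression decays to a pointer to its first element. */
int sum_both_ways(const int *a, int n) {
    int by_index = 0, by_pointer = 0;
    for (int i = 0; i < n; i++) {
        by_index   += a[i];        /* subscript form          */
        by_pointer += *(a + i);    /* pointer-arithmetic form */
    }
    return by_index == by_pointer ? by_index : -1;
}

/* A string is just an array of char terminated by '\0'. */
int string_length(const char *s) {
    const char *p = s;
    while (*p) p++;
    return (int)(p - s);
}
```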
Compiler executes
Invokes preprocessor first
#define and #include evaluated
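A small macro shows what the preprocessor pass does: #define is expanded textually before the compiler proper runs (a classic generic example):

```c
/* SQUARE is expanded textually before compilation; the extra
   parentheses keep SQUARE(n + 1) from expanding into
   n + 1 * n + 1, which would evaluate incorrectly. */
#define SQUARE(x) ((x) * (x))

int square_via_macro(int n) {
    return SQUARE(n + 1);   /* expands to ((n + 1) * (n + 1)) */
}
```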
(This post is part of the programming languages 4th edition series.)
Saint-John Perse, Literature, 1960
Albert Camus, born in French Algeria, Literature, 1957
André Frédéric Cournand, Physiology or Medicine, 1956
François Mauriac, Literature, 1952
Albert Schweitzer, born in Alsace, then in Germany, Peace, 1952
Léon Jouhaux, Peace, 1951
André Gide, Literature, 1947
Roger Martin du Gard, Literature, 1937
Frédéric Joliot-Curie, Chemistry, 1935
Irène Joliot-Curie, Chemistry, 1935
Ivan Bunin, born in Russia, Literature, 1933
Louis de Broglie, Physics, 1929
Charles Nicolle, Physiology or Medicine, 1928
Henri Bergson, Literature, 1927
Ferdinand Buisson, Peace, 1927
Aristide Briand, Peace, 1926
Jean Baptiste Perrin, Physics, 1926
Anatole France, Literature, 1921
Léon Bourgeois, Peace, 1920
Romain Rolland, Literature, 1915
Alfred Werner*, Chemistry, 1913
Charles Richet, Physiology or Medicine, 1913
Alexis Carrel, Physiology or Medicine, 1912
Paul Sabatier, Chemistry, 1912
Victor Grignard, Chemistry, 1912
Marie Curie, born in Congress Poland (Russian Empire), Chemistry, 1911
Paul-Henri-Benjamin d'Estournelles de Constant, Peace, 1909
Gabriel Lippmann, born in Luxembourg, Physics, 1908
Charles Louis Alphonse Laveran, Physiology or Medicine, 1907
Louis Renault, Peace, 1907
Henri Moissan, Chemistry, 1906
Frédéric Mistral, Literature, 1904
Antoine Henri Becquerel, Physics, 1903
Pierre Curie, Physics, 1903
Marie Curie, born in Congress Poland, (Russian Empire), Physics, 1903
Henry Dunant, born in the Swiss Confederation, Peace, 1901
Frédéric Passy, Peace, 1901
Sully Prudhomme, Literature, 1901
Svante Pääbo, born in Sweden, Physiology or Medicine, 2022
Benjamin List, Chemistry, 2021
Klaus Hasselmann, Physics, 2021
Emmanuelle Charpentier*, Chemistry, 2020
Reinhard Genzel, Physics, 2020
John B. Goodenough*, Chemistry, 2019
Joachim Frank*, Chemistry, 2017
Rainer Weiss*, Physics, 2017
Stefan Hell, born in Romania, Chemistry, 2014
Thomas C. Südhof, Physiology or Medicine, 2013
Herta Müller, born in Romania, Literature, 2009
Harald zur Hausen, Physiology or Medicine, 2008
Gerhard Ertl, Chemistry, 2007
Peter Grünberg, born in the Protectorate of Bohemia and Moravia, now the Czech Republic, Physics, 2007
Theodor W. Hänsch, Physics, 2005
Wolfgang Ketterle, Physics, 2001
Herbert Kroemer, Physics, 2000
Günter Blobel*, Physiology or Medicine, 1999
Günter Grass, born in Free City of Danzig, now Poland, Literature, 1999
Horst L. Störmer, Physics, 1998
Paul J. Crutzen, Chemistry, 1995
Christiane Nüsslein-Volhard, Physiology or Medicine, 1995
Reinhard Selten, Economics, 1994
Bert Sakmann, Physiology or Medicine, 1991
Erwin Neher, Physiology or Medicine, 1991
Hans G. Dehmelt*, Physics, 1989
Wolfgang Paul, Physics, 1989
Johann Deisenhofer, Chemistry, 1988
Robert Huber, Chemistry, 1988
Hartmut Michel, Chemistry, 1988
Jack Steinberger*, Physics, 1988
J. Georg Bednorz, Physics, 1987
John Polanyi*, Chemistry, 1986
Ernst Ruska, Physics, 1986
Gerd Binnig, Physics, 1986
Klaus von Klitzing, Physics, 1985
Georges J.F. Köhler*, Physiology or Medicine, 1984
Georg Wittig, Chemistry, 1979
Arno Penzias*, Physics, 1978
Henry Kissinger*, Peace, 1978
Ernst Otto Fischer, Chemistry, 1973
Karl von Frisch, born in Austria-Hungary, now Austria, Physiology or Medicine, 1973
Heinrich Böll, Literature, 1972
Gerhard Herzberg*, Chemistry, 1971
Willy Brandt, Peace, 1971
Bernard Katz*, Physiology or Medicine, 1970
Max Delbrück*, Physiology or Medicine, 1969
Manfred Eigen, Chemistry, 1967
Hans Bethe*, Physics, 1967
Nelly Sachs*, Literature, 1966
Feodor Felix Konrad Lynen, Physiology or Medicine, 1964
Konrad Bloch*, Physiology or Medicine, 1964
Karl Ziegler, Chemistry, 1963
Maria Goeppert-Mayer*, Physics, 1963
J. Hans D. Jensen, Physics, 1963
Rudolf Mössbauer, Physics, 1961
Werner Forssmann, Physiology or Medicine, 1956
Polykarp Kusch*, Physics, 1955
Max Born*, Physics, 1954
Walther Bothe, Physics, 1954
Hermann Staudinger, Chemistry, 1953
Fritz Albert Lipmann*, Physiology or Medicine, 1953
Hans Adolf Krebs*, Physiology or Medicine, 1953
Albert Schweitzer*, Peace, 1952
Otto Diels, Chemistry, 1950
Kurt Alder, Chemistry, 1950
Hermann Hesse*, Literature, 1946
Ernst Boris Chain*, Physiology or Medicine, 1945
Otto Hahn, Chemistry, 1944
Otto Stern*, Physics, 1943
Adolf Butenandt, Chemistry, 1939
Gerhard Domagk, Physiology or Medicine, 1939
Richard Kuhn, born in Austria-Hungary, now Austria, Chemistry, 1938
Otto Loewi*, Physiology or Medicine, 1936
Carl von Ossietzky, Peace, 1935
Hans Spemann, Physiology or Medicine, 1935
Werner Karl Heisenberg, Physics, 1932
Otto Heinrich Warburg, Physiology or Medicine, 1931
Carl Bosch, Chemistry, 1931
Friedrich Bergius, Chemistry, 1931
Hans Fischer, Chemistry, 1930
Thomas Mann, Literature, 1929
Hans von Euler-Chelpin*, Chemistry, 1929
Adolf Otto Reinhold Windaus, Chemistry, 1928
Ludwig Quidde, Peace, 1927
Heinrich Otto Wieland, Chemistry, 1927
Gustav Stresemann, Peace, 1926
James Franck, Physics, 1925
Gustav Ludwig Hertz, Physics, 1925
Otto Fritz Meyerhof, Physiology or Medicine, 1922
Albert Einstein, Physics, 1921
Walther Nernst, Chemistry, 1920
Johannes Stark, Physics, 1919
Fritz Haber, Chemistry, 1918
Max Karl Ernst Ludwig Planck, Physics, 1918
Richard Willstätter, Chemistry, 1915
Max von Laue, Physics, 1914
Gerhart Hauptmann, born in Prussia, now Poland, Literature, 1912
Wilhelm Wien, Physics, 1911
Otto Wallach, Chemistry, 1910
Albrecht Kossel, Physiology or Medicine, 1910
Paul Johann Ludwig Heyse, Literature, 1910
Karl Ferdinand Braun, Physics, 1909
Wilhelm Ostwald, born in Russia, now Latvia, Chemistry, 1909
Rudolf Christoph Eucken, Literature, 1908
Paul Ehrlich, Physiology or Medicine, 1908
Eduard Buchner, Chemistry, 1907
Robert Koch, Physiology or Medicine, 1905
Philipp Lenard, born in Pressburg, Kingdom of Hungary, Austrian Empire, now Slovakia, Physics, 1905
Adolf von Baeyer, Chemistry, 1905
Hermann Emil Fischer, Chemistry, 1902
Theodor Mommsen, born in Duchy of Schleswig, Literature, 1902
Emil Adolf von Behring, Physiology or Medicine, 1901
Wilhelm Conrad Röntgen, Physics, 1901
Kofi Annan, Peace, 2001
Odysseas Elytis, Literature, 1979
Giorgos Seferis, born in then Ottoman Empire, now Turkey, Literature, 1963
Rigoberta Menchú, Peace, 1992
Miguel Ángel Asturias, Literature, 1967
Charles K. Kao, Physics, 2009
Ferenc Krausz, Physics, 2023
Katalin Karikó, Physiology or Medicine, 2023
Ferenc Herskó*, as an Israeli citizen, Chemistry, 2004
Imre Kertész, Literature, 2002
György András Oláh, Chemistry, 1994
János Harsányi, Economics, 1994
János Polányi, born in Germany, Chemistry, 1986
Dénes Gábor, Physics, 1971
Jenő Pál Wigner, Physics, 1963
György Békésy, Physiology or Medicine, 1961
György Hevesy, Chemistry, 1943
Albert Szent-Györgyi, Physiology or Medicine, 1937
Richárd Zsigmondy, Chemistry, 1925
Róbert Bárány, born in Austria, Physiology or Medicine, 1914
Fülöp Lénárd, Physics, 1905
Halldór Laxness, Literature, 1955
Abhijit Banerjee*, Economics, 2019
Kailash Satyarthi, Peace, 2014
Venkatraman Ramakrishnan*, Chemistry, 2009
V. S. Naipaul*, Literature, 2001
Amartya Sen, Economics, 1998
Subramanyan Chandrasekhar*, Physics, 1983
Mother Teresa, born in then Ottoman Empire, now North Macedonia, Peace, 1979
Har Gobind Khorana*, Physiology or Medicine, 1968
C. V. Raman, Physics, 1930
Rabindranath Tagore, Literature, 1913
Narges Mohammadi, Peace, 2023
Shirin Ebadi, Peace, 2003
Nadia Murad, Peace, 2018
William C. Campbell, Physiology or Medicine, 2015
John Hume, Peace, 1998
David Trimble, Peace, 1998
Séamus Heaney, Literature, 1995
Mairead Corrigan, Peace, 1976
Betty Williams, Peace, 1976
Seán MacBride, born in France, Peace, 1974
Samuel Beckett, Literature, 1969
Ernest Walton, Physics, 1951
George Bernard Shaw*, Literature, 1925
W. B. Yeats, Literature, 1923
Joshua Angrist, born in the United States, Economics, 2021
Arieh Warshel, Chemistry, 2013
Michael Levitt, born in South Africa, Chemistry, 2013
Dan Shechtman, Chemistry, 2011
Ada Yonath, Chemistry, 2009
Robert Aumann, born in Germany, moved to Israel from the United States, Economics, 2005
Aaron Ciechanover, Chemistry, 2004
Avram Hershko, born in Hungary, Chemistry, 2004
Daniel Kahneman*, Economics, 2002
Yitzhak Rabin, Peace, 1994
Shimon Peres, born in Poland, now Belarus, Peace, 1994
Menachem Begin, born in Russia, now Belarus, Peace, 1978
Shmuel Yosef Agnon, born in Austria-Hungary, now Ukraine, Literature, 1966
Giorgio Parisi, Physics, 2021
Mario R. Capecchi*, Physiology or Medicine, 2007
Riccardo Giacconi*, Physics, 2002
Dario Fo, Literature, 1997
Rita Levi-Montalcini, Physiology or Medicine, 1986
Franco Modigliani*, Economics, 1985
Carlo Rubbia, Physics, 1984
Eugenio Montale, Literature, 1975
Renato Dulbecco*, Physiology or Medicine, 1975
Salvador Luria*, Physiology or Medicine, 1969
Giulio Natta, Chemistry, 1963
Emilio Segrè*, Physics, 1959
Salvatore Quasimodo, Literature, 1959
Daniel Bovet, born in Switzerland, Physiology or Medicine, 1957
Enrico Fermi, Physics, 1938
Luigi Pirandello, Literature, 1934
Grazia Deledda, Literature, 1926
Guglielmo Marconi, Physics, 1909
Ernesto Teodoro Moneta, Peace, 1907
Camillo Golgi, Physiology or Medicine, 1906
Giosuè Carducci, Literature, 1906
Syukuro Manabe*, Physics, 2021
Akira Yoshino, Chemistry, 2019
Tasuku Honjo, Physiology or Medicine, 2018
Kazuo Ishiguro*, Literature, 2017
Yoshinori Ohsumi, Physiology or Medicine, 2016
Takaaki Kajita, Physics, 2015
Satoshi Ōmura, Physiology or Medicine, 2015
Hiroshi Amano, Physics, 2014
Isamu Akasaki, Physics, 2014
Shinya Yamanaka, Physiology or Medicine, 2012
Akira Suzuki, Chemistry, 2010
Ei-ichi Negishi, born in China, Chemistry, 2010
Osamu Shimomura, Chemistry, 2008
Toshihide Maskawa, Physics, 2008
Makoto Kobayashi, Physics, 2008
Yoichiro Nambu*, Physics, 2008
Koichi Tanaka, Chemistry, 2002
Masatoshi Koshiba, Physics, 2002
Ryōji Noyori, Chemistry, 2001
Hideki Shirakawa, Chemistry, 2000
Kenzaburō Ōe, Literature, 1994
Susumu Tonegawa, Physiology or Medicine, 1987
Kenichi Fukui, Chemistry, 1981
Eisaku Satō, Peace, 1974
Leo Esaki, Physics, 1973
Yasunari Kawabata, Literature, 1968
Sin-Itiro Tomonaga, Physics, 1965
Hideki Yukawa, Physics, 1949
Wangari Maathai, Peace, 2004
Ardem Patapoutian*, Physiology or Medicine, 2021
Wilhelm Ostwald*, Chemistry, 1909
Ellen Johnson Sirleaf, Peace, 2011
Leymah Gbowee, Peace, 2011
Aaron Klug*, Chemistry, 1982
Jules A. Hoffmann*, Physiology or Medicine, 2011
Gabriel Lippmann*, Physics, 1908
Mario José Molina Henríquez*, Chemistry, 1995
Octavio Paz Lozano, Literature, 1990
Alfonso García Robles, Peace, 1982
Aung San Suu Kyi, Peace, 1991
Serge Haroche*, as a French citizen, Physics, 2012
Guido Imbens, Economics, 2021
Ben Feringa, Chemistry, 2016
Organisation for the Prohibition of Chemical Weapons, Peace, 2013
Andre Geim, born in Soviet Union, now Russia, Physics, 2010
Martinus J. G. Veltman, Physics, 1999
Gerard 't Hooft, Physics, 1999
Paul J. Crutzen, Chemistry, 1995
Simon van der Meer, Physics, 1984
Nicolaas Bloembergen*, Physics, 1981
Tjalling C. Koopmans, Economics, 1975
Nikolaas Tinbergen*, Physiology or Medicine, 1973
Jan Tinbergen, Economics, 1969
Frits Zernike, Physics, 1953
Peter Debye, Chemistry, 1936
Christiaan Eijkman, Physiology or Medicine, 1929
Willem Einthoven, Physiology or Medicine, 1924
Heike Kamerlingh Onnes, Physics, 1913
Tobias Asser, Peace, 1911
Johannes Diderik van der Waals, Physics, 1910
Pieter Zeeman, Physics, 1902
Hendrik Lorentz, Physics, 1902
Jacobus Henricus van 't Hoff, Chemistry, 1901
Alan MacDiarmid, Chemistry, 2000
Maurice Wilkins, Physiology or Medicine, 1962
Ernest Rutherford*, Chemistry, 1908
Wole Soyinka, Literature, 1986
Mother Teresa*, born in then Ottoman Empire, now North Macedonia, Peace, 1979
Jon Fosse, Literature, 2023
May-Britt Moser, Physiology or Medicine, 2014
Edvard Moser, Physiology or Medicine, 2014
Finn E. Kydland, Economics, 2004
Trygve Haavelmo, Economics, 1989
Ivar Giaever, Physics, 1973
Ragnar Frisch, Economics, 1969
Odd Hassel, Chemistry, 1969
Lars Onsager, Chemistry, 1968
Sigrid Undset, Literature, 1928
Fridtjof Nansen, Peace, 1922
Christian Lous Lange, Peace, 1921
Knut Hamsun, Literature, 1920
Bjørnstjerne Bjørnson, Literature, 1903
Abdus Salam, Physics, 1979
Malala Yousafzai, Peace, 2014
Yasser Arafat, Born in Cairo, Egypt, Peace, 1994
Mario Vargas Llosa*, Literature, 2010
Maria Ressa, Peace, 2021
Olga Tokarczuk, Literature, 2018
Leonid Hurwicz*, born in then Russian Republic (now Russia), Economics, 2007
Wisława Szymborska, Literature, 1996
Joseph Rotblat*, born in Congress Poland (Russian Empire), Peace, 1995
Shimon Peres*, as an Israeli citizen, Peace, 1994
Georges Charpak*, born in Dąbrowica Poland (now in Ukraine), Physics, 1992
Lech Wałęsa, born in Popowo, Reichsgau Danzig-West Prussia, Germany (today Poland), Peace, 1983
Roald Hoffmann*, born in Złoczów Poland (now in Ukraine), Chemistry, 1981
Czesław Miłosz*, born in Russian Empire, now Lithuania, Literature, 1980
Isaac Bashevis Singer*, born in Congress Poland (Russian Empire), Literature, 1978
Menachem Begin*, as an Israeli citizen, he also had Polish citizenship, Peace, 1978
Andrew Schally*, born in Vilnius, Poland (now Vilnius, Lithuania), Physiology or Medicine, 1977
Tadeusz Reichstein*, born in Congress Poland (Russian Empire), Physiology or Medicine, 1950
Isidor Isaac Rabi*, born in Rymanów, Austria-Hungary (now in Poland), Physics, 1944
Władysław Reymont, born in Congress Poland (Russian Empire), Literature, 1924
Marie Skłodowska-Curie, born in Congress Poland (Russian Empire), Chemistry, 1911
Albert A. Michelson*, born in Strelno, North German Confederation (now in Poland), Physics, 1907
Henryk Sienkiewicz, born in Congress Poland (Russian Empire), Literature, 1905
Marie Skłodowska-Curie, born in Congress Poland (Russian Empire), Physics, 1903
José Saramago, Literature, 1998
António Egas Moniz, Physiology or Medicine, 1949
Stefan Hell*, Chemistry, 2014
Herta Müller*, Literature, 2009
Elie Wiesel*, Peace, 1986
George E. Palade*, Physiology or Medicine, 1974
Alexei Ekimov, Chemistry, 2023
Memorial, Peace, 2022
Dmitry Muratov, Peace, 2021
Andre Geim*, Physics, 2010
Konstantin Novoselov*, Physics, 2010
Leonid Hurwicz*, Economics, 2007
Alexei Alexeyevich Abrikosov*, Physics, 2003
Vitaly Ginzburg, Physics, 2003
Zhores Alferov, born in then Soviet Union, now Belarus, Physics, 2000
Mikhail Gorbachev, Peace, 1990
Joseph Brodsky, born in Russia, Literature, 1987
Pyotr Kapitsa, Physics, 1978
Menachem Begin*, as an Israeli citizen, Peace, 1978
Ilya Prigogine*, Chemistry, 1977
Andrei Sakharov, Peace, 1975
Leonid Kantorovich, Economics, 1975
Simon Kuznets, now Belarus, Economics, 1971
Aleksandr Solzhenitsyn, Literature, 1970
Mikhail Sholokhov, Literature, 1965
Nikolay Basov, Physics, 1964
Alexander Prokhorov, born in Australia, Physics, 1964
Lev Landau, born in then Russian Empire, now Azerbaijan, laureate when citizen of the Soviet Union, Physics, 1962
Boris Pasternak, Literature, 1958 (forced to decline)
Pavel Cherenkov, Physics, 1958
Igor Tamm, Physics, 1958
Ilya Mikhailovich Frank, Physics, 1958
Nikolay Semyonov, Chemistry, 1956
Ivan Bunin*, Literature, 1933
Élie Metchnikoff, born in now Ukraine, Physiology or Medicine, 1908
Ivan Pavlov, Physiology or Medicine, 1904
Derek Walcott, Literature, 1992
W. Arthur Lewis*, Economics, 1979
Friderik Pregl*, born in Austria-Hungary, now Slovenia, Chemistry, 1923
Michael Levitt*, Chemistry, 2013
J. M. Coetzee, Literature, 2003
Sydney Brenner*, Physiology or Medicine, 2002
F. W. de Klerk, Peace, 1993
Nelson Mandela, Peace, 1993
Nadine Gordimer, Literature, 1991
Desmond Tutu, Peace, 1984
Aaron Klug*, Chemistry, 1982
Allan M. Cormack*, Physiology or Medicine, 1979
Albert Lutuli, born in then Rhodesia, now Zimbabwe, Peace, 1960
Max Theiler, Physiology or Medicine, 1951
Kim Dae-jung, Peace, 2000
Mario Vargas Llosa, born in Peru, Literature, 2010
Camilo José Cela, Literature, 1989
Vicente Aleixandre, Literature, 1977
Severo Ochoa*, Physiology or Medicine, 1959
Juan Ramón Jiménez, Literature, 1956
Jacinto Benavente, Literature, 1922
Santiago Ramón y Cajal, Physiology or Medicine, 1906
José Echegaray, Literature, 1904
Anne L'Huillier, born in France, Physics, 2023
Svante Pääbo, Physiology or Medicine, 2022
Tomas Lindahl, Chemistry, 2015
Tomas Tranströmer, Literature, 2011
Arvid Carlsson, Physiology or Medicine, 2000
Alva Myrdal, Peace, 1982
Sune Bergström, Physiology or Medicine, 1982
Bengt I. Samuelsson, Physiology or Medicine, 1982
Kai Siegbahn, Physics, 1981
Torsten Wiesel, Physiology or Medicine, 1981
Bertil Ohlin, Economics, 1977
Eyvind Johnson, Literature, 1974
Harry Martinson, Literature, 1974
Gunnar Myrdal, Economics, 1974
Ulf von Euler, Physiology or Medicine, 1970
Hannes Alfvén, Physics, 1970
Ragnar Granit, born in the Grand Duchy of Finland, then a part of Russia, Physiology or Medicine, 1967
Nelly Sachs, born in Germany, Literature, 1966
Dag Hammarskjöld, Peace, 1961 (posthumously)
Hugo Theorell, Physiology or Medicine, 1955
Pär Lagerkvist, Literature, 1951
Arne Tiselius, Chemistry, 1948
Erik Axel Karlfeldt, Literature, 1931 (posthumously)
Nathan Söderblom, Peace, 1930
Hans von Euler-Chelpin, born in Germany, Chemistry, 1929
Theodor Svedberg, Chemistry, 1926
Karl Manne Siegbahn, Physics, 1924
Hjalmar Branting, Peace, 1921
Carl Gustaf Verner von Heidenstam, Literature, 1916
Gustaf Dalén, Physics, 1912
Allvar Gullstrand, Physiology or Medicine, 1911
Selma Lagerlöf, Literature, 1909
Klas Pontus Arnoldson, Peace, 1908
Svante Arrhenius, Chemistry, 1903
Michel Mayor, Physics, 2019
Didier Queloz, Physics, 2019
Jacques Dubochet, Chemistry, 2017
Kurt Wüthrich, Chemistry, 2002
Rolf M. Zinkernagel, Physiology or Medicine, 1996
Edmond H. Fischer, Physiology or Medicine, 1992
Richard R. Ernst, Chemistry, 1991
Karl Alexander Müller, Physics, 1987
Heinrich Rohrer, Physics, 1986
Werner Arber, Physiology or Medicine, 1978
Daniel Bovet, Physiology or Medicine, 1957
Felix Bloch, Physics, 1952
Tadeusz Reichstein, Physiology or Medicine, 1950
Walter Rudolf Hess, Physiology or Medicine, 1949
Paul Hermann Müller, Physiology or Medicine, 1948
Hermann Hesse, born in Germany, Literature, 1946
Paul Karrer, Chemistry, 1937
Albert Einstein, born in Germany, Physics, 1921
Charles Édouard Guillaume, Physics, 1920
Carl Spitteler, Literature, 1919
Alfred Werner, Chemistry, 1913
Theodor Kocher, Physiology or Medicine, 1909
Élie Ducommun, Peace, 1902
Charles Albert Gobat, Peace, 1902
Henry Dunant, Peace, 1901
Vladimir Prelog*, Resided in what is now Croatia, born in Condominium of Bosnia and Herzegovina, Austro-Hungarian Empire (now Bosnia and Herzegovina), Chemistry, 1975
Leopold Ružička*, Resided in what is now Croatia, born in the Kingdom of Croatia-Slavonia part of Kingdom of Hungary in Austro-Hungarian Empire (now Croatia), Chemistry, 1939
Yuan T. Lee, Chemistry, 1986
Samuel C. C. Ting, Physics, 1976
Chen-Ning Yang*, Physics, 1957
Tsung-Dao Lee*, Physics, 1957
Abdulrazak Gurnah*, Literature, 2021
14th Dalai Lama, Peace, 1989
V. S. Naipaul*, Literature, 2001
Moungi Bawendi, born in France, Chemistry, 2023
Tunisian National Dialogue Quartet, Peace, 2015
Aziz Sancar, Chemistry, 2015
Orhan Pamuk, Literature, 2006
Centre for Civil Liberties, Peace, 2022
Svetlana Alexievich*, born in Ukraine, Literature, 2015
Selman Waksman*, born in Ukraine, Physiology or Medicine, 1952
Ilya Ilyich Mechnikov, Physiology or Medicine, 1908
Abdulrazak Gurnah, born in Tanzania, Literature, 2021
David MacMillan, Chemistry, 2021
Roger Penrose, Physics, 2020
Michael Houghton, Physiology or Medicine, 2020
Peter J. Ratcliffe, Physiology or Medicine, 2019
M. Stanley Whittingham, Chemistry, 2019
Greg Winter, Chemistry, 2018
Kazuo Ishiguro, born in Japan, Literature, 2017
Richard Henderson, Chemistry, 2017
Oliver Hart, Economics, 2016
Fraser Stoddart, Chemistry, 2016
David J. Thouless, Physics, 2016
F. Duncan M. Haldane, Physics, 2016
John M. Kosterlitz, Physics, 2016
Angus Deaton, Economics, 2015
Tomas Lindahl, born in Sweden, Chemistry, 2015
John O'Keefe, born in the United States, Physiology or Medicine, 2014
Michael Levitt, born in South Africa, Chemistry, 2013
Peter Higgs, Physics, 2013
John B. Gurdon, Physiology or Medicine, 2012
Christopher A. Pissarides, born in Cyprus, Economics, 2010
Konstantin Novoselov, born in Russia, Physics, 2010
Robert G. Edwards, Physiology or Medicine, 2010
Charles K. Kao, Physics, 2009
Venkatraman Ramakrishnan, born in India, Chemistry, 2009
Jack W. Szostak, born in United Kingdom, Physiology or Medicine, 2009
Doris Lessing, born in Iran, Literature, 2007
Sir Martin J. Evans, Physiology or Medicine, 2007
Oliver Smithies*, Physiology or Medicine, 2007
Harold Pinter, Literature, 2005
Clive W. J. Granger, Economics, 2003
Anthony J. Leggett*, Physics, 2003
Peter Mansfield, Physiology or Medicine, 2003
Sydney Brenner, born in South Africa, Physiology or Medicine, 2002
John E. Sulston, Physiology or Medicine, 2002
Tim Hunt, Physiology or Medicine, 2001
Paul Nurse, Physiology or Medicine, 2001
V. S. Naipaul, born in Trinidad, Literature, 2001
David Trimble, Peace, 1998
John Pople, Chemistry, 1998
John E. Walker, Chemistry, 1997
Harold Kroto, Chemistry, 1996
James A. Mirrlees, Economics, 1996
Joseph Rotblat, born in then Russian Empire, now Poland, Peace, 1995
Richard J. Roberts, Physiology or Medicine, 1993
Michael Smith*, Chemistry, 1993
Ronald Coase, based in the United States, Economics, 1991
James W. Black, Physiology or Medicine, 1988
César Milstein, born in Argentina, Physiology or Medicine, 1984
Richard Stone, Economics, 1984
William Golding, Literature, 1983
Aaron Klug, born in Lithuania, Chemistry, 1982
John Robert Vane, Physiology or Medicine, 1982
Elias Canetti, born in Bulgaria, Literature, 1981
Frederick Sanger, Chemistry, 1980
W. Arthur Lewis, born on St. Lucia, Economics, 1979
Godfrey Hounsfield, Physiology or Medicine, 1979
Peter D. Mitchell, Chemistry, 1978
James Meade, Economics, 1977
Nevill Francis Mott, Physics, 1977
Amnesty International, Peace, 1977
Mairead Corrigan, Peace, 1976
Betty Williams, Peace, 1976
John Cornforth, born in Australia, Chemistry, 1975
Christian de Duve*, Physiology or Medicine, 1974
Friedrich Hayek, born in Austria, Economics, 1974
Martin Ryle, Physics, 1974
Antony Hewish, Physics, 1974
Patrick White*, Literature, 1973
Geoffrey Wilkinson, Chemistry, 1973
Brian David Josephson, Physics, 1973
Nikolaas Tinbergen, born in the Netherlands, Physiology or Medicine, 1973
Rodney Robert Porter, Physiology or Medicine, 1972
John Hicks, Economics, 1972
Dennis Gabor, born in Hungary, Physics, 1971
Bernard Katz, born in Germany, Physiology or Medicine, 1970
Derek Harold Richard Barton, Chemistry, 1969
Ronald George Wreyford Norrish, Chemistry, 1967
George Porter, Chemistry, 1967
Dorothy Crowfoot Hodgkin, Chemistry, 1964
Andrew Huxley, Physiology or Medicine, 1963
Alan Lloyd Hodgkin, Physiology or Medicine, 1963
John Kendrew, Chemistry, 1962
Max Perutz, born in Austria, Chemistry, 1962
Francis Crick, Physiology or Medicine, 1962
Maurice Wilkins, born in New Zealand, Physiology or Medicine, 1962
Peter Medawar, born in Brazil, Physiology or Medicine, 1960
Philip Noel-Baker, Peace, 1959
Frederick Sanger, Chemistry, 1958
Alexander R. Todd, Baron Todd, Chemistry, 1957
Cyril Norman Hinshelwood, Chemistry, 1956
Max Born, born in then Germany, now Poland, Physics, 1954
Winston Churchill, Literature, 1953
Hans Adolf Krebs, born in Germany, Physiology or Medicine, 1953
Archer John Porter Martin, Chemistry, 1952
Richard Laurence Millington Synge, Chemistry, 1952
John Cockcroft, Physics, 1951
Bertrand Russell, Literature, 1950
Cecil Frank Powell, Physics, 1950
John Boyd Orr, Peace, 1949
T. S. Eliot, born in the United States, Literature, 1948
Patrick Blackett, Baron Blackett, Physics, 1948
Edward Victor Appleton, Physics, 1947
Robert Robinson, Chemistry, 1947
Friends Service Council, Peace, 1947
Ernst Boris Chain, born in Germany, Physiology or Medicine, 1945
Alexander Fleming, Physiology or Medicine, 1945
George Paget Thomson, Physics, 1937
Robert Cecil, 1st Viscount Cecil of Chelwood, Peace, 1937
Norman Haworth, Chemistry, 1937
Henry Hallett Dale, Physiology or Medicine, 1936
James Chadwick, Physics, 1935
Arthur Henderson, Peace, 1934
Norman Angell, Peace, 1933
Paul Dirac, Physics, 1933
Charles Scott Sherrington, Physiology or Medicine, 1932
John Galsworthy, Literature, 1932
Edgar Adrian, 1st Baron Adrian, Physiology or Medicine, 1932
Arthur Harden, Chemistry, 1929
Frederick Hopkins, Physiology or Medicine, 1929
Owen Willans Richardson, Physics, 1928
Charles Thomson Rees Wilson, Physics, 1927
Austen Chamberlain, Peace, 1925
George Bernard Shaw, born in Ireland (Then part of the United Kingdom of Great Britain and Ireland), Literature, 1925
John Macleod*, Physiology or Medicine, 1923
Archibald Vivian Hill, Physiology or Medicine, 1922
Francis William Aston, Chemistry, 1922
Frederick Soddy, Chemistry, 1921
Charles Glover Barkla, Physics, 1917
William Henry Bragg, Physics, 1915
William Lawrence Bragg, born in Australia, Physics, 1915
Ernest Rutherford, born in New Zealand, Chemistry, 1908
Rudyard Kipling, born in India, Literature, 1907
J. J. Thomson, Physics, 1906
John Strutt, 3rd Baron Rayleigh, Physics, 1904
William Ramsay, Chemistry, 1904
William Randal Cremer, Peace, 1903
Ronald Ross, born in India, Physiology or Medicine, 1902
Claudia Goldin, Economics, 2023
Louis Brus, Chemistry, 2023
Moungi Bawendi, born in France, Chemistry, 2023
Alexei Ekimov, born in the former Soviet Union, Chemistry, 2023
Drew Weissman, Physiology or Medicine, 2023
Katalin Karikó, born in Hungary, Physiology or Medicine, 2023
Ben Bernanke, Economics, 2022
Douglas Diamond, Economics, 2022
Philip H. Dybvig, Economics, 2022
Carolyn R. Bertozzi, Chemistry, 2022
K. Barry Sharpless, Chemistry, 2022
John Clauser, Physics, 2022
David Card, born in Canada, Economics, 2021
Joshua Angrist, Economics, 2021
Guido Imbens, born in Netherlands, Economics, 2021
Maria Ressa, born in Philippines, Peace, 2021
Syukuro Manabe, born in Japan, Physics, 2021
David MacMillan, born in United Kingdom, Chemistry, 2021
David Julius, Physiology or Medicine, 2021
Ardem Patapoutian, born in Lebanon, Physiology or Medicine, 2021
Robert B. Wilson, Economics, 2020
Paul R. Milgrom, Economics, 2020
Louise Glück, Literature, 2020
Jennifer Doudna, Chemistry, 2020
Andrea Ghez, Physics, 2020
Harvey J. Alter, Physiology or Medicine, 2020
Charles M. Rice, Physiology or Medicine, 2020
Abhijit Banerjee, born in India, Economics, 2019
Esther Duflo, born in France, Economics, 2019
Michael Kremer, Economics, 2019
John B. Goodenough, born in Germany, Chemistry, 2019
M. Stanley Whittingham, born in United Kingdom, Chemistry, 2019
Jim Peebles, born in Canada, Physics, 2019
William Kaelin Jr., Physiology or Medicine, 2019
Gregg L. Semenza, Physiology or Medicine, 2019
Paul Romer, Economics, 2018
William Nordhaus, Economics, 2018
George P. Smith, Chemistry, 2018
Frances Arnold, Chemistry, 2018
Arthur Ashkin, Physics, 2018
James Allison, Physiology or Medicine, 2018
Richard H. Thaler, Economics, 2017
Joachim Frank, born in Germany, Chemistry, 2017
Rainer Weiss, born in Germany, Physics, 2017
Kip Thorne, Physics, 2017
Barry Barish, Physics, 2017
Michael W. Young, Physiology or Medicine, 2017
Michael Rosbash, Physiology or Medicine, 2017
Jeffrey C. Hall, Physiology or Medicine, 2017
Bob Dylan, Literature, 2016
Oliver Hart, born in United Kingdom, Economics, 2016
Fraser Stoddart, born in United Kingdom, Chemistry, 2016
F. Duncan M. Haldane, born in United Kingdom, Physics, 2016
John M. Kosterlitz, born in United Kingdom, Physics, 2016
Angus Deaton, born in United Kingdom, Economics, 2015
Paul L. Modrich, Chemistry, 2015
Aziz Sancar, born in Turkey, Chemistry, 2015
William C. Campbell, born in Ireland, Physiology or Medicine, 2015
William E. Moerner, Chemistry, 2014
Eric Betzig, Chemistry, 2014
Shuji Nakamura, born in Japan, Physics, 2014
John O'Keefe*, Physiology or Medicine, 2014
Robert J. Shiller, Economics, 2013
Lars Peter Hansen, Economics, 2013
Eugene F. Fama, Economics, 2013
Arieh Warshel, born in Israel, Chemistry, 2013
Michael Levitt, born in South Africa, Chemistry, 2013
Martin Karplus, born in Austria, Chemistry, 2013
Randy Schekman, Physiology or Medicine, 2013
Thomas C. Südhof, born in Germany, Physiology or Medicine, 2013
James Rothman, Physiology or Medicine, 2013
Alvin E. Roth, Economics, 2012
Lloyd S. Shapley, Economics, 2012
Brian K. Kobilka, Chemistry, 2012
Robert J. Lefkowitz, Chemistry, 2012
David J. Wineland, Physics, 2012
Christopher A. Sims, Economics, 2011
Thomas J. Sargent, Economics, 2011
Saul Perlmutter, Physics, 2011
Brian P. Schmidt, Physics, 2011
Adam G. Riess, Physics, 2011
Ralph M. Steinman, born in Canada, Physiology or Medicine, 2011
Bruce Beutler, Physiology or Medicine, 2011
Peter A. Diamond, Economics, 2010
Dale T. Mortensen, Economics, 2010
Ei-ichi Negishi, Japanese citizenship, Chemistry, 2010
Richard F. Heck, Chemistry, 2010
Elinor Ostrom, Economics, 2009
Oliver Eaton Williamson, Economics, 2009
Barack H. Obama, Peace, 2009
Venkatraman Ramakrishnan, born in India, Chemistry, 2009
Thomas A. Steitz, Chemistry, 2009
Willard S. Boyle, born in Canada, Physics, 2009
Charles K. Kao, born in China, Physics, 2009
George E. Smith, Physics, 2009
Elizabeth Blackburn, born in Australia, Physiology or Medicine, 2009
Carol W. Greider, Physiology or Medicine, 2009
Jack W. Szostak, born in United Kingdom, Physiology or Medicine, 2009
Paul Krugman, Economics, 2008
Roger Yonchien Tsien, Chemistry, 2008
Martin Chalfie, Chemistry, 2008
Osamu Shimomura, Japanese citizenship, Chemistry, 2008
Yoichiro Nambu, born in Japan, Physics, 2008
Leonid Hurwicz, born in Russia, Economics, 2007
Eric S. Maskin, Economics, 2007
Roger B. Myerson, Economics, 2007
Al Gore, Peace, 2007
Mario R. Capecchi, born in Italy, Physiology or Medicine, 2007
Oliver Smithies, born in United Kingdom, Physiology or Medicine, 2007
Roger D. Kornberg, Chemistry, 2006
John C. Mather, Physics, 2006
Edmund S. Phelps, Economics, 2006
George F. Smoot, Physics, 2006
Andrew Z. Fire, Physiology or Medicine, 2006
Craig C. Mello, Physiology or Medicine, 2006
Robert Aumann*, as an Israeli citizen, Economics, 2005
Robert H. Grubbs, Chemistry, 2005
Richard R. Schrock, Chemistry, 2005
Thomas Schelling, Economics, 2005
John L. Hall, Physics, 2005
Roy J. Glauber, Physics, 2005
Irwin Rose, Chemistry, 2004
Edward C. Prescott, Economics, 2004
David J. Gross, Physics, 2004
H. David Politzer, Physics, 2004
Frank Wilczek, Physics, 2004
Richard Axel, Physiology or Medicine, 2004
Linda B. Buck, Physiology or Medicine, 2004
Peter Agre, Chemistry, 2003
Roderick MacKinnon, Chemistry, 2003
Robert F. Engle, Economics, 2003
Anthony J. Leggett, born in United Kingdom, Physics, 2003
Paul C. Lauterbur, Physiology or Medicine, 2003
Alexei Alexeyevich Abrikosov, born in Russia, Physics, 2003
Daniel Kahneman, born in Israel, Economics, 2002
Vernon L. Smith, Economics, 2002
Jimmy Carter, Peace, 2002
John Bennett Fenn, Chemistry, 2002
Raymond Davis Jr., Physics, 2002
Riccardo Giacconi, born in Italy, Physics, 2002
Sydney Brenner, born in South Africa, Physiology or Medicine, 2002
H. Robert Horvitz, Physiology or Medicine, 2002
William S. Knowles, Chemistry, 2001
K. Barry Sharpless, Chemistry, 2001
Joseph E. Stiglitz, Economics, 2001
George A. Akerlof, Economics, 2001
A. Michael Spence, Economics, 2001
Eric A. Cornell, Physics, 2001
Carl E. Wieman, Physics, 2001
Leland H. Hartwell, Physiology or Medicine, 2001
Alan Heeger, Chemistry, 2000
Alan MacDiarmid, born in New Zealand, Chemistry, 2000
James J. Heckman, Economics, 2000
Daniel L. McFadden, Economics, 2000
Jack Kilby, Physics, 2000
Paul Greengard, Physiology or Medicine, 2000
Eric Kandel, born in Austria, Physiology or Medicine, 2000
Ahmed H. Zewail, born in Egypt, Chemistry, 1999
Günter Blobel, born in then Germany, now Poland, Physiology or Medicine, 1999
Walter Kohn, born in Austria, Chemistry, 1998
Horst Ludwig Störmer, born in Germany, Physics, 1998
Robert B. Laughlin, Physics, 1998
Daniel C. Tsui, born in China, Physics, 1998
Robert F. Furchgott, Physiology or Medicine, 1998
Louis J. Ignarro, Physiology or Medicine, 1998
Ferid Murad, Physiology or Medicine, 1998
Paul D. Boyer, Chemistry, 1997
Robert C. Merton, Economics, 1997
Myron Scholes, born in Canada, Economics, 1997
Jody Williams, Peace, 1997
Steven Chu, Physics, 1997
William D. Phillips, Physics, 1997
Stanley B. Prusiner, Physiology or Medicine, 1997
Richard E. Smalley, Chemistry, 1996
Robert F. Curl Jr., Chemistry, 1996
William Vickrey, born in Canada, Economics, 1996
David M. Lee, Physics, 1996
Douglas D. Osheroff, Physics, 1996
Robert C. Richardson, Physics, 1996
Mario J. Molina, born in Mexico, Chemistry, 1995
F. Sherwood Rowland, Chemistry, 1995
Robert Lucas Jr., Economics, 1995
Martin L. Perl, Physics, 1995
Frederick Reines, Physics, 1995
Edward B. Lewis, Physiology or Medicine, 1995
Eric F. Wieschaus, Physiology or Medicine, 1995
George Andrew Olah, born in Hungary, Chemistry, 1994
John Harsanyi, born in Hungary, Economics, 1994
John Forbes Nash, Economics, 1994
Clifford G. Shull, Physics, 1994
Alfred G. Gilman, Physiology or Medicine, 1994
Martin Rodbell, Physiology or Medicine, 1994
Kary B. Mullis, Chemistry, 1993
Robert W. Fogel, Economics, 1993
Douglass C. North, Economics, 1993
Toni Morrison, Literature, 1993
Russell A. Hulse, Physics, 1993
Joseph H. Taylor Jr., Physics, 1993
Phillip A. Sharp, Physiology or Medicine, 1993
Rudolph A. Marcus, born in Canada, Chemistry, 1992
Gary S. Becker, Economics, 1992
Edmond H. Fischer, born in China, Physiology or Medicine, 1992
Edwin G. Krebs, Physiology or Medicine, 1992
Ronald Coase, born in the United Kingdom, Economics, 1991
Elias James Corey, Chemistry, 1990
Merton H. Miller, Economics, 1990
William F. Sharpe, Economics, 1990
Harry M. Markowitz, Economics, 1990
Jerome I. Friedman, Physics, 1990
Henry W. Kendall, Physics, 1990
Joseph E. Murray, Physiology or Medicine, 1990
E. Donnall Thomas, Physiology or Medicine, 1990
Sidney Altman, born in Canada, Chemistry, 1989
Thomas R. Cech, Chemistry, 1989
Hans G. Dehmelt, born in Germany, Physics, 1989
Norman F. Ramsey, Physics, 1989
J. Michael Bishop, Physiology or Medicine, 1989
Harold E. Varmus, Physiology or Medicine, 1989
Leon M. Lederman, Physics, 1988
Melvin Schwartz, Physics, 1988
Jack Steinberger, born in Germany, Physics, 1988
Gertrude B. Elion, Physiology or Medicine, 1988
George H. Hitchings, Physiology or Medicine, 1988
Charles J. Pedersen, born in Korea, Chemistry, 1987
Donald J. Cram, Chemistry, 1987
Robert M. Solow, Economics, 1987
Joseph Brodsky, born in Russia, Literature, 1987
Dudley R. Herschbach, Chemistry, 1986
Yuan T. Lee, born in Taiwan, Chemistry, 1986
James M. Buchanan, Economics, 1986
Elie Wiesel, born in Romania, Peace, 1986
Stanley Cohen, Physiology or Medicine, 1986
Rita Levi-Montalcini, born in Italy, Physiology or Medicine, 1986
Jerome Karle, Chemistry, 1985
Herbert A. Hauptman, Chemistry, 1985
Franco Modigliani, born in Italy, Economics, 1985
Michael S. Brown, Physiology or Medicine, 1985
Joseph L. Goldstein, Physiology or Medicine, 1985
Bruce Merrifield, Chemistry, 1984
Henry Taube, born in Canada, Chemistry, 1983
Gérard Debreu, born in France, Economics, 1983
William A. Fowler, Physics, 1983
Subrahmanyan Chandrasekhar, born in India, Physics, 1983
Barbara McClintock, Physiology or Medicine, 1983
George J. Stigler, Economics, 1982
Kenneth G. Wilson, Physics, 1982
Roald Hoffmann, born in then Poland, now Ukraine, Chemistry, 1981
James Tobin, Economics, 1981
Nicolaas Bloembergen, born in the Netherlands, Physics, 1981
Arthur L. Schawlow, Physics, 1981
David H. Hubel, born in Canada, Physiology or Medicine, 1981
Roger W. Sperry, Physiology or Medicine, 1981
Walter Gilbert, Chemistry, 1980
Paul Berg, Chemistry, 1980
Lawrence R. Klein, Economics, 1980
Czesław Miłosz, born in then Russian Empire, now Lithuania, Literature, 1980
James Cronin, Physics, 1980
Val Fitch, Physics, 1980
Baruj Benacerraf, born in Venezuela, Physiology or Medicine, 1980
George D. Snell, Physiology or Medicine, 1980
Herbert C. Brown, born in the United Kingdom, Chemistry, 1979
Theodore Schultz, Economics, 1979
Steven Weinberg, Physics, 1979
Sheldon Glashow, Physics, 1979
Allan M. Cormack, born in South Africa, Physiology or Medicine, 1979
Herbert A. Simon, Economics, 1978
Isaac Bashevis Singer, born in then Russian Empire, now Poland, Literature, 1978
Robert Woodrow Wilson, Physics, 1978
Arno Penzias, born in Germany, Physics, 1978
Hamilton O. Smith, Physiology or Medicine, 1978
Daniel Nathans, Physiology or Medicine, 1978
Philip Anderson, Physics, 1977
John H. van Vleck, Physics, 1977
Roger Guillemin, born in France, Physiology or Medicine, 1977
Andrew Schally, born in then Poland, now Lithuania, Physiology or Medicine, 1977
Rosalyn Yalow, Physiology or Medicine, 1977
William Lipscomb, Chemistry, 1976
Milton Friedman, Economics, 1976
Saul Bellow, born in Canada, Literature, 1976
Burton Richter, Physics, 1976
Samuel C. C. Ting, Physics, 1976
Baruch S. Blumberg, Physiology or Medicine, 1976
Daniel Carleton Gajdusek, Physiology or Medicine, 1976
Tjalling C. Koopmans, born in the Netherlands, Economics, 1975
Ben Roy Mottelson*, Physics, 1975
James Rainwater, Physics, 1975
David Baltimore, Physiology or Medicine, 1975
Renato Dulbecco, born in Italy, Physiology or Medicine, 1975
Howard Martin Temin, Physiology or Medicine, 1975
Paul J. Flory, Chemistry, 1974
George E. Palade, born in Romania, Physiology or Medicine, 1974
Wassily Leontief, born in Germany, Economics, 1973
Henry Kissinger, born in Germany, Peace, 1973
Ivar Giaever, born in Norway, Physics, 1973
Christian Anfinsen, Chemistry, 1972
Stanford Moore, Chemistry, 1972
William H. Stein, Chemistry, 1972
Kenneth J. Arrow, Economics, 1972
John Bardeen, Physics, 1972
Leon N. Cooper, Physics, 1972
Robert Schrieffer, Physics, 1972
Gerald Edelman, Physiology or Medicine, 1972
Simon Kuznets, born in then Russia, now Belarus, Economics, 1971
Earl W. Sutherland Jr., Physiology or Medicine, 1971
Paul A. Samuelson, Economics, 1970
Norman Borlaug, Peace, 1970
Julius Axelrod, Physiology or Medicine, 1970
Murray Gell-Mann, Physics, 1969
Max Delbrück, born in Germany, Physiology or Medicine, 1969
Alfred Hershey, Physiology or Medicine, 1969
Salvador Luria, born in Italy, Physiology or Medicine, 1969
Lars Onsager, born in Norway, Chemistry, 1968
Luis Alvarez, Physics, 1968
Robert W. Holley, Physiology or Medicine, 1968
Har Gobind Khorana, born in India, Physiology or Medicine, 1968
Marshall Warren Nirenberg, Physiology or Medicine, 1968
Hans Bethe, born in then Germany, now France, Physics, 1967
Haldan Keffer Hartline, Physiology or Medicine, 1967
George Wald, Physiology or Medicine, 1967
Robert S. Mulliken, Chemistry, 1966
Charles B. Huggins, born in Canada, Physiology or Medicine, 1966
Francis Peyton Rous, Physiology or Medicine, 1966
Robert B. Woodward, Chemistry, 1965
Richard P. Feynman, Physics, 1965
Julian Schwinger, Physics, 1965
Martin Luther King Jr., Peace, 1964
Charles H. Townes, Physics, 1964
Konrad Bloch, born in then Germany, now Poland, Physiology or Medicine, 1964
Maria Goeppert-Mayer, born in then Germany, now Poland, Physics, 1963
Eugene Wigner, born in Hungary, Physics, 1963
John Steinbeck, Literature, 1962
Linus C. Pauling, Peace, 1962
James D. Watson, Physiology or Medicine, 1962
Melvin Calvin, Chemistry, 1961
Robert Hofstadter, Physics, 1961
Georg von Békésy, born in Hungary, Physiology or Medicine, 1961
Willard F. Libby, Chemistry, 1960
Donald A. Glaser, Physics, 1960
Owen Chamberlain, Physics, 1959
Emilio Segrè, born in Italy, Physics, 1959
Arthur Kornberg, Physiology or Medicine, 1959
Severo Ochoa, born in Spain, Physiology or Medicine, 1959
George Beadle, Physiology or Medicine, 1958
Joshua Lederberg, Physiology or Medicine, 1958
Edward Tatum, Physiology or Medicine, 1958
Chen Ning Yang, born in China, Physics, 1957
Tsung-Dao Lee, born in China, Physics, 1957
William B. Shockley, Physics, 1956
John Bardeen, Physics, 1956
Walter H. Brattain, born in China, Physics, 1956
Dickinson W. Richards, Physiology or Medicine, 1956
André F. Cournand, born in France, Physiology or Medicine, 1956
Vincent du Vigneaud, Chemistry, 1955
Willis E. Lamb, Physics, 1955
Polykarp Kusch, born in Germany, Physics, 1955
Linus C. Pauling, Chemistry, 1954
Ernest Hemingway, Literature, 1954
John F. Enders, Physiology or Medicine, 1954
Frederick C. Robbins, Physiology or Medicine, 1954
Thomas H. Weller, Physiology or Medicine, 1954
George C. Marshall, Peace, 1953
Fritz Albert Lipmann, born in then Germany, now Russia, Physiology or Medicine, 1953
E. M. Purcell, Physics, 1952
Felix Bloch, born in Switzerland, Physics, 1952
Selman A. Waksman, born in then Russian Empire, now Ukraine, Physiology or Medicine, 1952
Edwin M. McMillan, Chemistry, 1951
Glenn Theodore Seaborg, Chemistry, 1951
Ralph J. Bunche, Peace, 1950
Philip S. Hench, Physiology or Medicine, 1950
Edward C. Kendall, Physiology or Medicine, 1950
William Giauque, born in Canada, Chemistry, 1949
William Faulkner, Literature, 1949
T. S. Eliot*, Literature, 1948
American Friends Service Committee (The Quakers), Peace, 1947
Carl Ferdinand Cori, born in Austria, Physiology or Medicine, 1947
Gerty Cori, born in Austria, Physiology or Medicine, 1947
Wendell M. Stanley, Chemistry, 1946
James B. Sumner, Chemistry, 1946
John H. Northrop, Chemistry, 1946
Emily G. Balch, Peace, 1946
John R. Mott, Peace, 1946
Percy W. Bridgman, Physics, 1946
Hermann J. Muller, Physiology or Medicine, 1946
Cordell Hull, Peace, 1945
Isidor Isaac Rabi, born in Austria, Physics, 1944
Joseph Erlanger, Physiology or Medicine, 1944
Herbert S. Gasser, Physiology or Medicine, 1944
Otto Stern, born in then Germany, now Poland, Physics, 1943
Edward A. Doisy, Physiology or Medicine, 1943
Ernest Lawrence, Physics, 1939
Pearl S. Buck, Literature, 1938
Clinton Davisson, Physics, 1937
Eugene O'Neill, Literature, 1936
Carl Anderson, Physics, 1936
Harold C. Urey, Chemistry, 1934
George R. Minot, Physiology or Medicine, 1934
William P. Murphy, Physiology or Medicine, 1934
George H. Whipple, Physiology or Medicine, 1934
Thomas H. Morgan, Physiology or Medicine, 1933
Irving Langmuir, Chemistry, 1932
Jane Addams, Peace, 1931
Nicholas M. Butler, Peace, 1931
Sinclair Lewis, Literature, 1930
Frank B. Kellogg, Peace, 1929
Arthur H. Compton, Physics, 1927
Charles G. Dawes, Peace, 1925
Robert A. Millikan, Physics, 1923
Woodrow Wilson, Peace, 1919
Theodore W. Richards, Chemistry, 1914
Elihu Root, Peace, 1912
Albert A. Michelson, born in then Germany, now Poland, Physics, 1907
Theodore Roosevelt, Peace, 1906
Baruj Benacerraf*, Physiology or Medicine, 1980
Lê Đức Thọ, born in French Indochina, Peace, 1973 (declined)
Tawakkol Karman, Peace, 2011
Ivo Andrić, born in the Condominium of Bosnia and Herzegovina, Austria-Hungary (now Bosnia and Herzegovina), Literature, 1961
Record: correct_award_00093 | FactBench | 1 | 19
URL: https://observervoice.com/20-january-remembering-johannes-vilhelm-jensen-on-birthday-33523/
Title: Johannes Vilhelm Jensen: Danish Nobel Laureate in Literature and Master of Prose
Author: OV Digital Desk | Published: 2024-01-17 | Site: Observer Voice | Language: en
Johannes Vilhelm Jensen (20 January 1873 – 25 November 1950) was a Danish author. In 1944, Johannes Vilhelm Jensen was awarded the Nobel Prize in Literature.
Life and Career
Johannes Vilhelm Jensen was born on 20 January 1873, in Farsø, Denmark. He studied at various universities in Denmark, including the University of Copenhagen, where he initially pursued a medical degree but later shifted his focus to literature and philosophy. Johannes Vilhelm Jensen had a distinguished career as a writer:
He began his writing career as a poet and later ventured into various literary genres, including novels, essays, and short stories. Jensen is known for his imaginative and experimental style, which incorporated elements of naturalism, symbolism, and modernism. One of his most famous works is the trilogy “Kongens Fald” (“The Fall of the King,” 1900–1901), which explores the history of Denmark and its transition from paganism to Christianity.
Jensen’s literary works often reflected his interest in history, mythology, and human evolution. Johannes Vilhelm Jensen passed away on November 25, 1950, in Copenhagen, Denmark.
Award and Legacy
Johannes Vilhelm Jensen was awarded the Nobel Prize in Literature in 1944 for his contribution to the development of Scandinavian literature. He was the third Danish author to receive this prestigious award, after Karl Gjellerup and Henrik Pontoppidan, who shared it in 1917. Jensen’s legacy rests primarily on his literary contributions, which are characterized by their unique style and their exploration of themes related to history, mythology, and human evolution.
His “Kongens Fald” trilogy remains a significant work in Danish literature, offering an imaginative and thought-provoking perspective on Denmark’s historical and cultural transformation. Jensen’s innovative approach to writing had an influence on subsequent generations of Danish and Scandinavian writers.
Record: correct_award_00093 | FactBench | 0 | 55
URL: https://proedit.com/word-people-nobel-prize-for-literature-recipients/
Title: Nobel Prize for Literature Recipients
Author: ProEdit | Published: 2014-10-09 | Site: ProEdit | Language: en
There are Word People, and then there are Word Giants! Each year, one writer receives the Nobel Prize for Literature. Take a minute to review this list of all the past recipients of writing’s most prestigious award. Did your favorite author make the grade?
Source: NobelPrize.org
Record: correct_award_00093 | FactBench | 1 | 35
URL: https://www.newworldencyclopedia.org/entry/Thomas_Mann
Title: New World Encyclopedia | Language: en
Paul Thomas Mann (June 6, 1875 - August 12, 1955) was a German novelist, social critic, philanthropist, essayist, and Nobel Prize laureate, lauded principally for a series of highly symbolic and often ironic epic novels and mid-length stories, noted for their insight into the psychology of the artist and intellectual. He is noted for his analysis and critique of the European and German soul in the beginning of the twentieth century. Mann is one of the last notable authors of German Modernism, but one of the first to herald the shattered aesthetic of German literature after the horrors of World War II.
Like those of other German authors who were already in their prime during the rise of Nazism and lived to see the end of the Third Reich and the revelations of the death camps, Mann's writings fall broadly into two categories: those written in a "clean," Realist style before the rise of Nazism, and those written in psychological, modern prose that searches for meaning after the revelations of the Holocaust and Germany's catastrophic defeat in World War II. Mann was a master in both eras. It was on the strength of his early Realist fiction, notably Buddenbrooks, that Mann would win his Nobel Prize[1] and largely cement his fame. The Magic Mountain was a transitional work: the original version was begun in the realist tradition, but the revised work expresses many of Mann's concerns for European society in the run-up to the Second World War. Mann searched in vain for answers to forestall the collapse of the tradition of Western humanism in the face of rising totalitarian forces. Humanism simply proved incomplete as an ideology sufficient to the task.
Finally, in the war's aftermath, Mann created the powerful and moving testament presented in such works as Doktor Faustus, an allegorical story of Germany's fall, and Joseph und seine Brüder, a re-telling of the Biblical story of Joseph, the works for which Mann is best remembered today. Mann was a champion of democracy, and he devoted his life to preventing and recording the disasters that ultimately consumed Germany in the 1930s and 1940s. He is not only one of the finest novelists of the twentieth century, he is also one of its most heroic.
Life
Mann was born in Lübeck, Germany, second son of Thomas Johann Heinrich Mann, a senator and grain merchant, and his wife Júlia da Silva Bruhns, who was born in Brazil and came to Germany when she was seven years old. Mann's father died in 1891, and his trading firm was liquidated. The family subsequently moved to Munich. Mann attended the science division of a Lübeck gymnasium, then spent some time at the University of Munich where, in preparation for a career in journalism, he studied history, economics, art history, and literature. He resided in Munich from 1891 until 1933, with the exception of a year-long stay in Palestrina, Italy, with his older brother Heinrich, also a novelist. Mann's career as a writer began when he joined the writing staff of Simplicissimus, a German literary and political magazine based out of Munich and well-known for its satires of the noble classes. Mann's first short story, Little Herr Friedmann (Der Kleine Herr Friedemann), was published in Simplicissimus in 1898.
In 1905, he married Katia Pringsheim, daughter of a prominent, secular Jewish family of intellectuals. They had six children: Erika, Klaus, Golo, Monika, Elisabeth, and Michael, several of whom became literary and artistic figures in their own right. Mann emigrated from Nazi Germany to Küsnacht near Zürich, Switzerland in 1933, then to the United States in 1939, where he taught at Princeton University, along with such other émigrés as Albert Einstein. In 1942, the family moved to Pacific Palisades, California, where they remained until after the end of World War II. In 1944, he became a naturalized citizen of the United States.
Thomas Mann returned to Europe in 1952, residing in Kilchberg, near Zurich, Switzerland. He was never to live in Germany again, though he traveled there regularly and was widely celebrated. His most important visit to Germany came in 1949, on the occasion of the 200th birthday of Johann Wolfgang von Goethe. He attended celebrations in both Frankfurt am Main and Weimar, making a clear statement that German culture extended beyond the political borders of East and West Germany.
In 1955, at the age of 80, Thomas Mann died of atherosclerosis in a hospital in Zurich.
Political Views
During World War I Mann supported Kaiser Wilhelm II's conservatism, attacking liberalism. In Von Deutscher Republik (1923), as a semi-official spokesman for parliamentary democracy, Mann called upon the German intellectual class to support the new Weimar Republic. After the close of World War I and Kaiser Wilhelm's defeat, Mann was disillusioned, and his political views gradually shifted toward liberal and democratic principles.
In 1930 Mann gave a public address in Berlin titled An Appeal to Reason, in which he strongly denounced Nazism and encouraged resistance by the working class. This was followed by numerous essays and lectures in which he attacked the Nazis. At the same time, he expressed increasing sympathy for socialism and communism. In 1933 when the Nazis came to power, Mann and his wife were vacationing in Switzerland. Due to his very vociferous denunciations of Nazi policies, his son advised him not to return. Later, Mann's books, particularly Buddenbrooks, were amongst the many burnt by Adolf Hitler's regime.
Work
Mann was awarded the Nobel Prize in Literature in 1929, principally in recognition of his popular achievement with the epic Buddenbrooks (1901), which relates the decline of a merchant family in Lübeck (based on Mann's own family) over the course of three generations. His next major novel was The Magic Mountain (Der Zauberberg, 1924), about an engineering student whose plan to visit his tubercular cousin at a Swiss sanatorium for three weeks turns into a seven-year diversion. During his time spent at the sanatorium, the protagonist of The Magic Mountain encounters a variety of characters who play out the ideological conflicts of early twentieth-century European civilization. Other novels included Lotte in Weimar (1939), in which Mann returned to the world of Goethe's novel The Sorrows of Young Werther (1774); Doktor Faustus (1947), a deep meditation on the nature of art and morality, told through the allegorical story of the composer Adrian Leverkühn and his corruption and ensuing madness in the years leading up to World War II, and often considered Mann's greatest work.
Mann himself devoted most of his writing years to producing the immense tetralogy Joseph and His Brothers (Joseph und seine Brüder, 1933 - 1942), a richly imagined retelling of the story of Joseph related in chapters 27-50 of Genesis in the Hebrew Bible. The first volume relates the establishment of the family of Jacob, who becomes the father of Joseph. In the second volume the young Joseph, not yet master of considerable gifts, arouses the enmity of his ten older brothers and is sold by them into slavery in Egypt. In the third volume, Joseph becomes the steward of a high court official, Potiphar, but finds himself thrown into prison after (mindfully) rejecting the advances of Potiphar's wife. In the last volume, the mature Joseph rises to become administrator of Egypt's granaries. Famine drives the sons of Jacob to Egypt, where the unrecognized Joseph adroitly orchestrates a recognition scene that results in the brothers' reconciliation and the reunion of the family.
Mann was a humanist who valued the cumulative achievements of Western culture and believed in the necessity of upholding civilization against the dangers of decay and barbarism. His work is the record of a consciousness of a life of manifold possibilities, and of the tensions inherent in the (more or less enduringly fruitful) responses to those possibilities. In his own summation (upon receiving the Nobel Prize): "The value and significance of my work for posterity may safely be left to the future; for me they are nothing but the personal traces of a life led consciously, that is, conscientiously."
Regarded as a whole, Mann's career is a striking example of the "repeated puberty" which Goethe thought characteristic of the genius. In technique as well as in thought, he experienced far more daringly than is generally realized. In Buddenbrooks he wrote one of the last of the great "old-fashioned" novels, a patient, thorough tracing of the fortunes of a family. —Henry Hatfield in Thomas Mann. [2]
The Magic Mountain
The Magic Mountain (Der Zauberberg) is one of Thomas Mann's longest and most influential works, first published in 1924, and now considered one of the most important works of German literature. Mann started writing what was to become The Magic Mountain in 1912. It began as a much shorter narrative, revisiting in a comic manner aspects of Death in Venice, a brief novel that he was then preparing for publication. The newer work reflected his experiences and impressions during a period when his wife, who was suffering from a lung complaint, was confined to Dr. Friedrich Jessen's Waldsanatorium, a sanatorium in Davos, Switzerland for several months. In May and June of 1912 he paid her a visit and got to know the team of doctors who were treating her. According to Mann, in the afterword that was later included in the English translation, this stay became the foundation of the opening chapter of the novel.
The outbreak of the First World War interrupted work on the book. The conflict and its aftermath led the author to undertake a major re-examination of European bourgeois society, including the sources of the willful, perverse destructiveness displayed by much of civilized humanity. In the wake of this human disaster, Mann felt compelled to radically revise and expand his budding novel. He revisited the original themes of the work—illness, death, and isolation—in order to examine in detail the philosophical, sociological, and artistic causes that set in motion the catastrophe of the First World War. In particular, Mann was concerned explicitly with examining the "zeitgeist' of the pre-war years—a time characterized by a tremendous flood of new ideas and philosophies and great indecisiveness and confusion that would ultimately manifest itself in the "war to end all wars," the most chaotic and violent conflict in human history up to that time.
The Magic Mountain is vast, erudite, subtle and, most of all, ambiguous. The book blends a scrupulous realism with deeper symbolic undertones, giving the reader the paradoxical impression that they are reading an allegory that is nevertheless consummately realistic. In addition to this, interpretations of the novel are complicated all the more by Mann's notorious sense of irony. Mann himself was well aware of his book's elusiveness, but offered few clues about approaches to the text. He later compared it to a symphonic work, orchestrated with a number of themes and, in a playful commentary on the problems of interpretation, recommended that those who wished to understand it should read it through twice.
Plot summary
The narrative opens in the decade before World War I. We are introduced to the central protagonist of the story, Hans Castorp, a young German. We encounter him when he is in his early twenties, about to take up a ship-building career in Hamburg, his home town. Just before beginning this professional career he undertakes a journey to visit his tubercular cousin, Joachim Ziemßen, who is seeking a cure in a sanatorium in Davos, high up in the Swiss Alps. In the opening chapter, Hans is symbolically transported away from the familiar life and mundane obligations he has known to the rarefied mountain air and introspective little world of the sanatorium.
Castorp's departure from the sanatorium is repeatedly delayed by his failing health. What at first looks like a minor bronchial infection develops into the symptoms of tuberculosis, and Hans is persuaded to stay until his health improves.
During his extended stay, Castorp meets and learns from a variety of characters, who together represent a microcosm of pre-war Europe. These include the humanist and encyclopedist Lodovico Settembrini (a student of Giosuè Carducci), the totalitarian Jesuit Leo Naphta, the hedonist Mynheer Peeperkorn, and Castorp's romantic interest, Madame Chauchat.
In the end, Castorp remains in the morbid atmosphere of the sanatorium for seven years. During this time, he has grown immensely, having listened and learned from the endless philosophical arguments going on between Settembrini and Naphta. At the conclusion of the novel, Castorp has rejected radicalism and totalitarianism in favor of Settembrini's humble, moral outlook. Shortly after this epiphany, World War I begins, Castorp is conscripted into the military and sent to battle, where he is almost certainly killed.
Significance
The Magic Mountain can be read both as a classic example of the European bildungsroman—a "novel of education" or "novel of formation"—and as a sly parody of this genre. Many formal elements of this type of fiction are present: like the protagonist of a typical bildungsroman, the immature Castorp leaves his home and learns about art, culture, politics, human frailty and love. Hans Castorp’s stay in the rarefied world of The Magic Mountain thus provides him with a panoramic view of pre-war European civilization and its discontents.
Mann’s descriptions of the subjective experience of serious illness and of the gradual process of medical institutionalization are interesting in themselves, as are his allusions to the irrational forces within the human psyche, which echo Freudian psychoanalysis and its interest in the unconscious mind, then concurrently becoming prominent. These themes relate to the development of Castorp's character over the time-span covered by the novel, a point that the author himself underlined. In the informative afterword written for the 1927 English translation, Mann states that "what [Hans] came to understand is that one must go through the deep experience of sickness and death to arrive at a higher sanity and health…"
Themes
Connection to Death in Venice
According to the author, he originally planned The Magic Mountain as a novella: a humorous, ironic, satirical pendant to Death in Venice, which he had completed in 1912. The atmosphere was to derive from the “mixture of death and amusement” that Mann had encountered whilst visiting his wife in a Swiss sanatorium. This fascination with death, the triumph of ecstatic chaos over a life devoted to order, which he had explored in Death in Venice, was to be transferred to a comedic plane.
Thus, The Magic Mountain contains many contrasts and parallels with the earlier novel. The established author Gustav von Aschenbach can be readily compared to the young, callow Castorp at the start of a humdrum career. The allure of the beautiful Polish boy, Tadzio, corresponds to the Asiatic mystery of the Russian, Madame Chauchat. The setting itself has shifted both geographically and symbolically; switching from the flooded and diseased Italian coast lands to an alpine resort famed for its health-giving properties. Where the city of Venice in Death in Venice was under the threat of a cholera plague, The Magic Mountain takes place in a secluded enclave where disease is ostensibly meant to be cured, although, as the story unfolds, Hans's disease only becomes fully developed at the sanatorium.
Illness and death
The Berghof patients suffer from some form of tuberculosis, which rules the daily routines, thoughts, and conversations of the “Half-a-lung club.” The disease ends fatally for many of the patients, such as the Catholic girl, Barbara Hujus, whose fear of death is heightened in a harrowing Viaticum-scene, and Castorp's cousin, Ziemßen, who leaves the world like an ancient hero. The dialogs between Settembrini and Naphta often revolve around the theme of death viewed from a metaphysical perspective.
Mann commented that, “What Castorp learns to fathom is that all higher health must have passed through illness and death. […] As Hans Castorp once says to Madame Chauchat, there are two ways to life: One is the common, direct, and brave. The other is bad, leading through death, and that is the genial way. This concept of illness and death, as a necessary passage to knowledge, health, and life, makes The Magic Mountain into a novel of initiation."
Time
Closely connected to the themes of life and death is the subjective nature of time, a recurring leitmotif. Thus Chapter VII, entitled "By the Ocean of Time", opens with the narrator asking rhetorically, "Can one tell - that is to say, narrate - time, time itself, as such, for its own sake?" Mann's authorial (and ironic) response to this question is, "That would surely be an absurd undertaking…," before going on to compare storytelling to the act of music making, noting that both "…only present themselves as a flowing, as a succession in time, as one thing after another…". The Magic Mountain, in essence, embodies the author's meditations on the tempo of experience.
The narrative is ordered chronologically, but it accelerates throughout the novel, so that the first five chapters relate only the first of Castorp’s seven years at the sanatorium in great detail; the remaining six years, marked by monotony and routine, are described in the last two chapters. This asymmetry corresponds to Castorp’s own skewed perception of the passage of time.
Mann also meditates upon the interrelationship between the experience of time and space, in which time seems to pass more slowly when one doesn't move in space. This aspect of the novel mirrors aspects of contemporary philosophical and scientific debates, such as Heidegger's writings and Einstein's theory of relativity.
Allegorical characters
Mann uses the novel's main characters to introduce Castorp to the ideas and prevailing ideologies of his time. The author observed that the characters are all “exponents, representatives, and messengers of intellectual districts, principles, and worlds,” hoping that he had not made them mere wandering allegories.
Castorp
According to the author, the protagonist is a questing knight, the “pure fool” looking for the Holy Grail in the tradition of Parzival. However, he remains pale and mediocre, representing a German bourgeois that is torn between conflicting influences—capable of the highest humanistic ideals, yet at the same time prone to both stubborn philistinism and radical ideologies.
In a way, Hans Castorp can be seen as an avatar of the young Weimar Republic: Both humanism and radicalism, represented by Settembrini and Naphta, try to win his favor, but Castorp is unable to decide. His body temperature is a subtle metaphor for his lack of clarity: Following Friedrich Schiller’s theory of fever, Castorp’s temperature is always 37.6°C, which is neither healthy nor ill, but an indeterminate stage. Stuck in this intermediate stage, Castorp is subject to the influences of the various conflicting ideologies of his time, embodied in the other characters in the novel.
Settembrini: Humanism
Settembrini represents the active and positive ideal of the Enlightenment, of Humanism, democracy, tolerance and human rights. He often finds Castorp literally in the dark and switches on the light before their conversations. He compares himself to Prometheus of Greek mythology, who brought fire, and thus enlightenment, to Man. His own mentor, Carducci, has even written a hymn to another lightbringer: to Lucifer, “la forza vindice della ragione.” His ethics are those of bourgeois values and labor. He tries to counter Castorp's morbid fascination with death and disease, warns him against the ill Madame Chauchat, and tries to demonstrate a positive outlook on life.
His antagonist, Naphta, describes him as “Zivilisationsliterat”—a "civilized" member of the literati, in the most pejorative sense. Mann originally constructed Settembrini as a caricature of the liberal-democratic novelist, represented for example by his own brother, Heinrich Mann. However, while the novel was being written, Mann himself became an outspoken supporter of the Weimar Republic and humanism, which may explain why Settembrini, especially in the later chapters, becomes the authorial voice.
Naphta: Radicalism
Settembrini's antagonist, Naphta, represents the forces of decay, of radicalism and extremism. His perspective combines several heterogeneous radical aspects that include fascism, anarchism, and communism. With brilliant intelligence he aims to unmask Settembrini's values and ethics and leads them ad absurdum, “as if to prove that the Sun revolves around Earth.” Settembrini admits that Naphta's sophistry usually prevails in their frequent verbal duels for the favor of their eager student, Castorp. In the end, Castorp sides with Settembrini, based on his benevolence more than the soundness of his arguments.
Naphta was not part of Mann's original draft but was added later, while the Weimar Republic was threatened by radical ideologies from all sides, eventually leading to its collapse. Hans Castorp famously tries to classify Naphta politically and comes to the conclusion that he was just as revolutionary as Settembrini, not in a liberal, but in a conservative way, so he decides that Naphta was a Revolutionär der Erhaltung (revolutionist of conservation). This apparent oxymoron alludes to a heterogeneous movement of right-wing German intellectuals called the Conservative Revolution. The term, probably first adopted by Hugo von Hofmannsthal, was repeatedly used by Mann and is meant to be revolutionary in a reactionary sense: The movement was highly nationalistic and not only fought against the ideals of left-wing socialism, liberalism and Enlightenment, but it also detested the lost Empire’s petty bourgeoisie and the aristocracy. The movement was somewhat elusive, flirting with every radicalism against established views. Thus, Naphta himself is conceived as a living contradiction in terms: An ex-Jewish Jesuit, anti-capitalist, hostile to modernity, freedom, individuality and progress, anarchic and theocratic. Possible inspirations for Naphta are Leon Trotsky and Georg Lukács.
Doktor Faustus
Doktor Faustus is Mann's last complete novel, published in 1947 as Doktor Faustus. Das Leben des deutschen Tonsetzers Adrian Leverkühn, erzählt von einem Freunde (Doctor Faustus: The Life of the German Composer Adrian Leverkühn as Told by a Friend). The novel documents the life of a fictional composer, Adrian Leverkühn, from his early childhood near the turn of the century to his tragic death towards the beginning of World War II. Mann's protagonist, Leverkühn, is intentionally modeled along mythic lines of the German medieval morality tale of Faust, who sold his soul to the devil in exchange for infinite knowledge. As Leverkühn, impassioned by demons, develops artistically toward a fated reckoning day, German society simultaneously moves politically toward its catastrophic fate. That the novel is an allegory for Germany selling its own soul is clear and is suggested not only by Mann but by the novel's mysterious narrator, Serenus Zeitblom. Yet the novel also tackles themes of much greater depth than simply a condemnation of political catastrophes. It examines in particular the growth of German intellectualism, from the early age of Romanticism to the nihilism of such philosophers as Friedrich Nietzsche and Arthur Schopenhauer, and the ultimate disaster of National Socialism; this transition is viewed, from Adrian's point of view, through the development of German music from Romanticism to Modernism, and in this respect the novel is perhaps one of the greatest works ever written on the relation between artistic and political ambition.
Structure
Doktor Faustus consists of a vast array of characters, fables, world events, theories, memories, ideas, and places, sometimes directly and sometimes tangentially linked to the story of Adrian Leverkühn's life. The entire novel is told through the character of Serenus Zeitblom, a childhood friend of Adrian who loves his friend so dearly (and can perceive his genius, even at an early age) that he devotes the rest of his life to following Adrian and writing his biography. That fictional biography is the text of Doktor Faustus. Zeitblom is a fascinating character in and of himself: he is a philologist and schoolteacher, deeply devoted to the philosophy of moral humanism, and he is shy to the point of being comical. He famously begins the first half-dozen chapters of the novel by apologizing for how lengthy the previous chapter had been. He is so unlike the daring artist Adrian that the two are almost antithetical; yet, in complementing one another, they create a complete story of tremendous force and beauty.
Plot summary
Zeitblom begins the novel by recounting his and Leverkühn's shared childhood in the German township of Kaisersaschern. Leverkühn's father is a tinkerer and amateur scientist, and he shows the boys a variety of scientific wonders, such as an amoeba under a microscope. Even at this early stage, however, the difference between the two boys is clear: Zeitblom reacts to the amoeba and other wonders with awe and a slight tinge of fear; young Leverkühn reacts by bursting into fits of uncontrollable laughter.
As the two grow older, Leverkühn's genius becomes undeniable. His parents hire a tutor for him, only to have the tutor conclude in a matter of months that there is nothing more he can teach the boy. Leverkühn finds solace, however, by studying with the local organist, a German-American named Kretzschmar, who happens to have an extensive musical education and who takes the young prodigy under his wing. During these early chapters concerning Kretzschmar and Leverkühn's musical education, the narrator Zeitblom reproduces huge lectures (given by Kretzschmar to Leverkühn, Zeitblom, and a few other listeners) on such topics as Beethoven's sonatas, the ultra-primitive choral music of an extreme isolationist church in Pennsylvania, and the relation between the visual arts and music. Again, as Leverkühn readily absorbs all of Kretzschmar's knowledge, he cannot keep from falling into fits of laughter at each new epiphany.
The novel moves on as the two characters grow up and proceed into their adult lives. When Leverkühn comes of age he surprises his parents and his teacher Kretzschmar by announcing that he will pursue a career in theology, not music; Zeitblom devotedly follows, even though he himself has no interest in theology. Leverkühn enrolls at a theological seminary in Giessen, and only after bouncing around the theology and philosophy departments for a number of years does he at last consider becoming a composer. It is at this crucial point in the book that the central event of the novel occurs:
One night, a pale man, dressed in suit and tie, appears to Leverkühn as he awakens from a mild dream. The man never identifies himself, though as the two converse it readily becomes clear who He is; the closer he comes to Leverkühn, the colder the room seems to become, and as their conversation continues he changes shape, taking on the voices and personas of a variety of people that Leverkühn knows. The man makes a deal with Leverkühn: he will grant him the artistic genius he needs to compose great works, in exchange for his soul. At first, Leverkühn laughs at this offer, but the man continues making clear exactly what he means: when he leaves the room, Leverkühn will be granted musical genius; but in return, he will never again be able to love another human being. Leverkühn agrees to the terms; the man disappears; and the novel is never the same again.
After these events, the novel rapidly picks up speed as Leverkühn begins a repeated process of composing more and more ambitious masterworks while simultaneously falling further and further into insanity. Each new work Leverkühn composes is preceded by a period of intense illness and migraines, which grow more severe as his ambitions widen. As Leverkühn begins to win fame and renown as a major composer, it becomes clear to him just what he's sacrificed. He attempts, desperately, to win the affections of a young woman, only to have those very advances result inadvertently in her own tragic death. Terrified, he flees to a rustic village much like the one he and Zeitblom grew up in and commits himself to composing his greatest work, a massive opera entitled The Damnation of Faust.
After years of total isolation, Leverkühn invites Zeitblom, Kretzschmar, and anyone else who is willing to meet him at his home as he presents the finished manuscript of his opera. Instead of presenting the opera, however, Leverkühn tells the gathered audience of all his sins, of having lived the past three decades of his life in service of the Lord Satan, and of being responsible for the murders of his nephew Nepomuk and a woman he loved. As the ghastly confession continues, all the members of the audience, with the exception of Zeitblom, slowly drift away, and the novel closes with Leverkühn falling into a fit of migraine and lapsing into a coma from which he never recovers.
Themes
The novel is concerned with the intellectual fall of Germany in the time leading up to World War II. Leverkühn's own moods and ideology mimic the change from humanism to irrational nihilism found in Germany's intellectual life in the 1930s. Leverkühn (the name means "live audaciously") becomes increasingly corrupt of body and of mind, ridden by syphilis and insanity. In the novel, all of these thematic threads—Germany's intellectual fall, Leverkühn's spiritual fall, and the physical corruption of his body—directly parallel the political disaster of fascistic Germany. Mann's sense of the inseparable nature of art and politics may be seen in the published version of his 1938 United States lecture tour, The Coming Victory of Democracy, in which he said, "I must regretfully own that in my younger years I shared that dangerous German habit of thought which regards life and intellect, art and politics as totally separate worlds." In Doktor Faustus, Leverkühn's personal history, his artistic development, and the shifting German political climate are tied together by the narrator Zeitblom as he feels out and worries over the moral health of his nation (just as he had worried over the spiritual health of his friend, Leverkühn).
Another central theme is music. In the novel, Adrian Leverkühn develops the twelve-tone technique actually invented by Arnold Schoenberg. Schoenberg, who lived near Mann in Los Angeles as the novel was being written, was very upset that Mann had appropriated the method without attributing it to him, and at his insistence, later editions of the novel included a disclaimer at the end describing Schoenberg's invention of the technique.
Major Influences
The Bible
Fyodor Dostoevsky
Theodor Fontane
Sigmund Freud
Johann Wolfgang von Goethe
Hermann Hesse
E. T. A. Hoffmann
Carl Jung
Gotthold Ephraim Lessing
Martin Luther
Friedrich Nietzsche
Edgar Allan Poe
Friedrich von Schlegel
Arthur Schopenhauer
Richard Wagner
Works
1897 Little Herr Friedemann (Der kleine Herr Friedemann)
1897 The Clown (Der Bajazzo)
1900 The Road to the Churchyard (Der Weg zum Friedhof)
1901 Buddenbrooks (Buddenbrooks - Verfall einer Familie)
1902 Gladius Dei
1903 Tristan
1903 Tonio Kröger
1909 Royal Highness (Königliche Hoheit)
1912 Death in Venice (Der Tod in Venedig)
1918 Reflections of an Unpolitical Man (Betrachtungen eines Unpolitischen)
1922 The German Republic (Von deutscher Republik)
1924 The Magic Mountain (Der Zauberberg)
1926 Disorder and Early Sorrow (Unordnung und frühes Leid)
1930 Mario and the Magician (Mario und der Zauberer)
1933–1943 Joseph and His Brothers (Joseph und seine Brüder)
1933 The Tales of Jacob (Die Geschichten Jaakobs)
1934 The Young Joseph (Der junge Joseph)
1936 Joseph in Egypt (Joseph in Ägypten)
1943 Joseph the Provider (Joseph, der Ernährer)
1937 The Problem of Freedom (Das Problem der Freiheit)
1938 The Coming Victory of Democracy
1939 Lotte in Weimar: The Beloved Returns
1940 The Transposed Heads (Die vertauschten Köpfe - Eine indische Legende)
1942 Listen Germany! (Deutsche Hörer)
1947 Doctor Faustus (Doktor Faustus)
1951 The Holy Sinner (Der Erwählte)
1922/1954 Confessions of Felix Krull, Confidence Man: The Early Years (Bekenntnisse des Hochstaplers Felix Krull. Der Memoiren erster Teil); unfinished
Notes
Sources
Bloom, Harold, ed. The Magic Mountain: Modern Critical Interpretations. Chelsea House, 1986. ISBN 0877549028
The Cambridge Companion to Thomas Mann, Edited by Ritchie Robertson. Cambridge University Press, 2001. ISBN 0521653703
Dowden, Stephen. A Companion to Thomas Mann's Magic Mountain. Camden House, 2002. ISBN 1571132481
Hatfield, Henry. Thomas Mann. Spectrum Books, Ltd. 1962. ASIN: B0013JC61I
Hatfield, Henry, ed. Thomas Mann: A Collection of Critical Essays. Prentice Hall, 1964. ASIN: B000JWBV5M
Mann, Thomas. Thomas Mann Diaries 1918-1939, translated by Richard and Clara Winston. New York: Harry N. Abrams, 1982. ASIN: B000P1JMDU
Mann, Thomas. The Magic Mountain, a new translation into English by John E. Woods. London: Everyman's Library, [1996] 2005. ISBN 1400044219 This won the Helen and Kurt Wolff Prize in 1996.
Nobel Lectures, Literature 1901-1967, edited by Horst Frenz. Amsterdam: Elsevier Publishing Company, 1969. Based on a previous version in French.
Reed, Terence James. Thomas Mann: The Uses of Tradition. Oxford University Press, 1974. ISBN 0198157428
All links retrieved April 30, 2023.
Judging the Nobel
The New Yorker, October 11, 2010
https://www.newyorker.com/books/double-take/judging-the-nobel
In this week’s Comment, Adam Gopnik discusses the Nobel Prize in Literature, which was won last week by the Peruvian writer and political activist Mario Vargas Llosa. (Gopnik answered readers’ questions in a live chat on Monday afternoon.)
After identifying some specious winners in the past, Gopnik writes that this year’s pick seems just:
So all hail Vargas Llosa, whom even his noisier left-wing critics have to regard as exactly the kind of writer the prize ought to go to: one with a host of well-regarded novels (“The Time of the Hero,” “Conversation in the Cathedral,” the screen-adapted “Aunt Julia and the Scriptwriter,” “The Feast of the Goat”) and a sense of social responsibility (he ran seriously for, and lost badly, the Presidency of Peru), not to mention a lively personal life that includes once punching out another future laureate with an equally impressive triple-barrelled moniker, Gabriel García Márquez, reportedly over something to do with Mrs. Vargas Llosa.
The Nobel Prize in Literature has long been controversial; the list of winners seems to be dwarfed by the list of undisputed masters who were snubbed. In 1991, Gopnik wrote a Comment (available free for subscribers) on the Nobel laureates in literature from the beginning of the twentieth century who have disappeared from public consciousness, or as he puts it, “seem to have no rank at all.”
Rudolf Christoph Eucken, Karl Adolph Gjellerup, and Henrik Pontoppidan; Carl Friedrich Georg Spitteler, Carl Gustaf Verner von Heidenstam, and Jacinto Benavente; Frans Eemil Sillanpää, and Johannes Vilhelm Jensen. Who were they, and what did they write?
From his survey of these winners, Gopnik suggests to the aspiring Nobel laureate the following: write long epics in a nationalistic (preferably Scandinavian) vein; start or emerge as the central practitioner of an odd literary movement; or title your works with “arresting, unlooked-for juxtaposition of adjective and noun.”
In 2005, Louis Menand reviewed “The Economy of Prestige,” by James English, a study of the politics and history of literary prizes. Here, Menand distills English’s argument about the core purpose of such awards:
As long as we want to believe that creative achievement is special, that a work of art is not just one more commodity seeking to aggrandize itself in the marketplace at the expense of other works of art, we need prizes so that we can complain about how stupid they are. In this respect, it is at least as important that the prize go to the wrong person as that it go to the right one. No one thinks that Tolstoy was less than a great writer because he failed to win the Nobel. The failure to win the Nobel has become, in the end, a mark of his greatness.
Encyclopedia.com (Nobel Prize Laureates in Literature, Part 2)
https://www.encyclopedia.com/arts/culture-magazines/jensen-johannes-v-20-january-1873-25-november-1950
Johannes V. Jensen (20 January 1873 - 25 November 1950)
Sven Hakon Rossel
University of Vienna
1944 Nobel Prize in Literature Broadcast Presentation
1944 Nobel Prize in Literature Presentation Speech
Jensen: Banquet Speech
Jensen: Autobiographical Statement
Bibliographies
Biographies
References
Papers
This entry was expanded by Rossel from his Jensen entry in DLB 214: Twentieth-Century Danish Writers.
BOOKS: Danskere (Copenhagen: Det Nordiske, 1896);
Einar Elkær: Roman (Copenhagen: Det Nordiske, 1898);
Himmerlandsfolk: Historier (Copenhagen: Det Nordiske, 1898);
Intermezzo: Dolores, Forsvundne Skove, Louison (Copenhagen: Det Nordiske, 1899);
Foraarets Død (Copenhagen: Det Nordiske, 1900);
Den store Sommer (Copenhagen: Det Nordiske, 1900);
Den gotiske Renaissance (Copenhagen: Det Nordiske, 1901);
Vinteren (Copenhagen: Det Nordiske, 1901);
Kongens Fald (Copenhagen: Det Nordiske, 1901)–comprises Foraarets Død, Den store Sommer, and Vinteren; translated by P. T. Federspiel and Patrick Kirwan as The Fall of the King (London: Grayson & Grayson, 1933; New York: Holt, 1933);
Kirken i Farsø: Skitse (Minneapolis & Chicago: C. Rasmussen, 1903);
Madame D’Ora (Copenhagen: Gyldendal, 1904);
Nye Himmerlandshistorier (Copenhagen: Gyldendal, 1904)–includes “Kirstens sidste Rejse,” translated by Lee Marshall as “Kirsten’s Last Journey,” in Anthology of Danish Literature: Bilingual Edition, edited by F.J. Billeskov Jansen and P. M. Mitchell (Carbondale: Southern Illinois University Press, 1971), pp. 300-379;
Skovene (Copenhagen: Gyldendal, 1904);
Hjulet (Copenhagen: Gyldendal, 1905);
Digte (Copenhagen: Gyldendal, 1906); revised and enlarged as Digte: Anden Udgave, 1917; revised and enlarged as Digte: Tredie Udgave, 1921; revised and enlarged as Digte: 1901–1941, 1943; revised and enlarged as Digte, 1948–includes “Ved Frokosten” and “Paa Memphis Station,” translated by Alexander Taylor as “At Lunch” and “At Memphis Station,” in Contemporary Danish Poetry, edited by Line Jensen and others (Copenhagen: Gyldendal, 1958), pp. 91-92;
Myter og Jagter (Copenhagen: Gyldendal, 1907)– includes “Fusijama,” translated by Elias Bredsdorff as “Fujiyama,” in his Contemporary Danish Prose: An Anthology (Copenhagen: Gyldendal, 1958), pp. 85-90;
Den ny Verden: Til international Belysning af nordisk Bondekultur (Copenhagen: Gyldendal, 1907);
Singapore Noveller (Copenhagen: Gyldendal, 1907);
Nye Myter (Copenhagen: Gyldendal, 1908);
Bræen: Myter om Istiden og det første Menneske (Copenhagen: Gyldendal, 1908); translated by Arthur G. Chater in Fire and Ice, volume 1 of The Long Journey (London: Gyldendal, 1922; New York: Knopf, 1923);
Lille Ahasverus (Copenhagen: Gyldendal, 1909);
Myter: Ny Samling (Copenhagen: Gyldendal, 1910);
Himmerlandshistorier: Tredie Samling (Copenhagen: Gyldendal, 1910);
Nordisk Aand: Kroniker og Karakteristiker (Copenhagen: Gyldendal, 1911);
Skibet (Copenhagen: Gyldendal, 1912);
Myter: Fjerde Samling (Copenhagen: Gyldendal, 1912);
Rudyard Kipling (Copenhagen: Gyldendal, 1912);
Olivia Marianne (Copenhagen: Gyldendal, 1915);
Introduktion til vor Tidsalder (Copenhagen: Gyldendal, 1915);
Aarbog l916 (Copenhagen: Gyldendal, 1916);
Eksotiske Noveller (Copenhagen: Gyldendal, 1916)–comprises Singapore Noveller, Lille Ahasverus, and Olivia Marianne;
Aarbog l917 (Copenhagen: Gyldendal, 1917);
Norne-Gæst (Copenhagen: Gyldendal, 1919); translated by Chater in The Cimbrians, volume 2 of The Long Journey (London: Gyldendal, 1923; New York: Knopf, 1923);
Det tabte Land: Mennesket før Istiden (Copenhagen: Gyldendal, 1919); translated by Chater in Fire and Ice, volume 1 of The Long Journey (London: Gyldendal, 1922; New York: Knopf, 1923);
Johannes Larsen og hans Billeder (Copenhagen: Gyldendal, 1920);
Christofer Columbus (Copenhagen: Gyldendal, 1921); translated by Chater as Christopher Columbus, volume 3 of The Long Journey (London: Gyldendal, 1924; New York: Knopf, 1924);
Sangerinden (Madame d’Ora): Drama i fem Akter (Copenhagen: Gyldendal, 1921);
Cimbrernes Tog (Copenhagen: Gyldendal, 1922); translated by Chater in The Cimbrians, volume 2 of The Long Journey (London: Gyldendal, 1923; New York: Knopf, 1923);
Æstetik og Udvikling: Efterskrift til Den lange Rejse (Copenhagen: Gyldendal, 1923);
Aarstiderne, illustrated by Johannes Larsen (Copenhagen: Gyldendal, 1923);
Myter: Tredie Bind, 1914–1924 (Copenhagen: Gyldendal, 1924);
Hamlet: Til Forklaring af Hamletskikkelsen (Copenhagen: Gyldendal, 1924);
Evolution og Moral (Copenhagen: Gyldendal, 1925);
Aarets Højtider (Copenhagen: Gyldendal, 1925);
Verdens Lys: Nye Digte (Copenhagen: Gyldendal, 1926);
Jørgine (Copenhagen: Hage & Clausens Forlag, 1926);
Dyrenes Forvandling: Til Udviklingens Plastik (Copenhagen: Gyldendal, 1927);
Ved Livets Bred og andre Myter (Copenhagen: Gyldendal, 1928);
Aandens Stadier (Copenhagen: Gyldendal, 1928);
Retninger i Tiden: Artikler 1925–30 (Copenhagen: Gyldendal, 1930);
Den jydske Blæst: Digte 1926–1930 (Copenhagen: Gyldendal, 1931);
Form og Sjæl: Portræter og Personligheder (Copenhagen: Gyldendal, 1931);
Paa danske Veje, illustrated by Larsen (Copenhagen: Gyldendal, 1931);
Pisangen (Copenhagen: Gyldendal, 1932);
Kornmarken (Copenhagen: Gyldendal, 1932);
Sælernes Ø (Copenhagen: Gyldendal, 1934);
Det Blivende: Tankens Revolutionering i det 19de Aarhundrede og Tilbagefaldet i det 20de (Copenhagen: Gyldendal, 1934);
Dr. Renaults Fristelser (Copenhagen: Gyldendal, 1935);
Gudrun (Copenhagen: Gyldendal, 1936);
Darduse, Bryllupet i Peking: Eventyrkomedie i fire Akter (Copenhagen: Gyldendal, 1937);
Paaskebadet: Digte 1931–1937 (Copenhagen: Gyldendal, 1937);
Jydske Folkelivsmalere: Dalsgaard, Michael Ancher, Hans Smidth (Copenhagen: Arthur Jensen, 1937);
Den lange Rejse, 2 volumes (Copenhagen: Gyldendal, 1938)–comprises volume 1, Det tabte Land, Bræen, and Norne-Gæst; and volume 2, Cimbrernes Tog, Skibet, and Christofer Columbus;
Thorvaldsen: Haandvœrkeren og Manden (Copenhagen: Arthur Jensen, 1938);
Nordvejen: Indtryk af norsk Natur (Copenhagen: Gyldendal, 1939);
Fra Fristaterne: Rejsebreve, med et Tilbageblik (Copenhagen: Gyldendal, 1939);
Gutenberg: Til Bogtrykkerkunstens Historie, by Jensen and Aage Marcus (Copenhagen: Bianco Lunos Bogtrykkeri, 1939);
Mariehønen (Copenhagen: Gyldendal, 1940);
Udvalgte Prosastykker, edited by Morten Borup and Peter Iløse (Copenhagen: Gyldendal, 1940);
Mindets Tavle: Portrœter og Personligheder (Copenhagen: Gyldendal, 1941);
Vor Oprindelse (Copenhagen: Gyldendal, 1941);
Om Sproget og Undervisningen (Copenhagen: Gyldendal, 1942);
Kvinden i Sagatiden (Copenhagen: Gyldendal, 1942);
Folkeslagene i Østen (Copenhagen: Gyldendal, 1943);
Møllen (Copenhagen: Gyldendal, 1944);
Myter, 2 volumes (Copenhagen: Gyldendal, 1946);
Bogbinderen (Copenhagen: Printed by J. H. Schultz, 1947);
Afrika: Opdagelsesrejserne (Copenhagen: Gyldendal, 1949);
Danske Køretøjer (Copenhagen: Thaning & Appel, 1949);
Swift og Oehlenschläger (Copenhagen: Gyldendal, 1950);
Tilblivelsen (Copenhagen: Gyldendal, 1951);
Mytens ring: Efterladte myter og beskrivelser (Copenhagen: Gyldendal, 1957);
Ungt er endnu Ordet: Portræter og Personligheder, edited by Carl Bergstrøm-Nielsen (Copenhagen: Gyldendal, 1958);
Troth, edited by Sven Hakon Rossel (Copenhagen: Museum Tusculanum, 2002).
Editions and Collections: Den jydske blæst og andre digte, selected by Ole Wivel (Copenhagen: Gyldendal, 1957);
Bræen, edited by Martin Larsen (Copenhagen: Gyldendal, 1963);
Himmerlandshistorier, edited, with an afterword, by Jørgen Elbek, Gyldendals Bibliotek, no. 24 (Copenhagen: Gyldendal, 1963);
Johannes Larsen og Aarstiderne, edited, with a foreword, by Aage Marcus, Gyldendals Uglebøger, no. 56 (Copenhagen: Gyldendal, 1963);
Jordens Kreds, selected by Marcus, introduction by Niels Birger Wamberg (Copenhagen: Gyldendal, 1967);
Myter i Digte i Udvalg, selected by Leif Nedergaard (Copenhagen: Gyldendal, 1969);
Himmerlandshistorier, edited by Povl Marstal (Copenhagen: Gyldendal, 1970);
Mørkets frodighed; Tidlige myter, selected by Wamberg (Copenhagen: Gyldendal, 1973);
12 Himmerlandshistorier, edited by Sven Moller Kristensen (Copenhagen: Gyldendal, 1979);
Tretten Myter: Johannes V. Jensen, selected and illustrated by Jens Jensen (Copenhagen: Gyldendal, 1982);
Himmerlandshistorier: Et udvalg, selected by Sonja Carlberg (Copenhagen: Gyldendal, 1984);
Kender du Johannes V. Jensen, selected by Margit Mørk (Copenhagen: Grafisk, 1986);
Christofer Columbus, foreword by Ib Michael (Copenhagen: Gyldendal, 1992);
Madame D’Ora; Hjulet, edited by Sven Hakon Rossel (Copenhagen: Det Danske Sproge-og Litteraturselskab/Borgen, 1997);
Digte: Johannes V. Jensen, edited by Frits Johansen (Copenhagen: Gyldendal, 1998).
Editions in English: “Ane og Koen,” translated by Victor Folke Nelson as “Ann and the Cow”; “Forsvundne Skove,” translated by Henry Commager as “Lost Forests,” in Denmark’s Best Stories, edited by Hanna Astrup Larsen (New York: American-Scandinavian Foundation/Norton, 1928), pp. 327-340;
Garden Colonies in Denmark, translated by F. Aubrey Rush (Copenhagen: Danske selskab, 1949);
Denmark’s Johannes V. Jensen, translated by Marion L. Nielsen (Logan: Utah State Agricultural College, 1955);
The Waving Rye, selected by C. A. Bodelsen, translated by Ronald Bathgate and others (Copenhagen: Gyldendal, 1958; New York: American-Scandinavian Foundation, 1959);
The Fall of the King, translated by Alan Bower and edited by Sven Hakon Rossel (Seattle: Mermaid Press, 1992; revised edition, Traverse City: Stonehill, 1995).
PLAY PRODUCTIONS: Trods, Chicago, Scandia Hall, 1 February 1903;
Sangerinden, Odense, Odense Teater, 16 November 1923;
Darduse, Bryllupet i Peking: Eventyrkomedie i fire akter, Copenhagen, Royal Theater, 22 January 1937;
Hamlet, translation of William Shakespeare’s play, Copenhagen, Royal Theater, 24 April 1937.
OTHER: Jack London, Naar Naturen kalder, translated by Aslaug Mikkelsen, foreword by Jensen (Copenhagen: Peter Hansen, 1907);
Ditleff von Zeppelin, Fugletræk, edited by Jensen and Otto Gelsted (Copenhagen: Gyldendal, 1916);
Thorvaldsens Portrætbuster, introduction by Jensen, biographical notes by Aage Marcus (Copenhagen: Gyldendal, 1926).
TRANSLATIONS: Frank Norris, Af Hvedens Saga: Polypen, en Bog om Kalifornien (Copenhagen: Gyldendal, 1907);
Rudyard Kipling and Wolcott Balestier, Naulahka: Fortælling fra Vesten og Østen, translated by Jensen and Aslaug Mikkelsen (Copenhagen: V. Pio, 1911);
Kipling, Fribytterbreve; De rædselsfulde Nætters By og andre Skizzer, translated by Jensen and Mikkelsen (Copenhagen: V. Pio, 1912);
Kipling, Fra Hav til Hav, translated by Jensen and Mikkelsen (Copenhagen: V. Pio, 1913);
Kipling, Liv og Drøm, translated by Jensen and Mikkelsen (Copenhagen: V. Pio, 1913);
Kipling, Med Natexpressen Aar 2000 og andre Fortœllinger, translated by Jensen and Mikkelsen (Copenhagen: V. Pio, 1914);
Rudolf Requadt, Krigsflyveren (Copenhagen: Fr. Ravn, 1916);
Adelbert von Chamisso, Peter Schlemihls vidunderlige Historie (Copenhagen: Høst & Søn, 1918);
Walt Whitman, Digte, translated by Jensen and Otto Gelsted (Copenhagen: Nyt Nordisk, 1919);
De islandske Sagaer, 3 volumes, translated by Jensen, Knud Hjortø, and Hans Kyrre (Copenhagen: Gyldendal, 1930–1932);
William Shakespeare, Hamlet (Copenhagen: Gyldendal, 1937);
Egil Skallagrimssons Saga (Copenhagen: Gyldendal, 1943);
Snorre Sturlason, Heimskringla: Norges Kongesagaer, translated and edited by Jensen and Kyrre (Copenhagen: Gyldendal, 1948).
SELECTED PERIODICAL PUBLICATIONS–UNCOLLECTED: Skatten paa Korsøgaard: Fortælling fra Aarhundredets Begyndelse, as Ivar Lykke, Revuen (2 January-21 July 1895);
Dødssejleren: Fortælling, as Lykke, Revuen (24 July 1895–26 January 1896);
Blodfesterne i Arizona: Mexikansk, historisk Fortælling, as Lykke, Revuen (26 January-31 May 1896);
Jim Blacksools Revolver: Roman fra det fjerne Vesten, as Lykke, Revuen (31 May-30 September 1896);
Falskmønterbandens blodige Bog: Kriminal-Roman, as Lykke, Revuen (1 July-13 December 1896);
Nihilistens Ed: En Nutidsroman, as Lykke, Revuen (19 July - 23 September 1896);
Taterkongens ni Sønner og deres Blodhævn: Roman fra Dronning Margrethes Tid, as Lykke, Revuen (9 December 1896–5 May 1897);
Milliontyvenes Høvding eller den røde Tiger: Original illustreret New Yorkerroman, as Lykke, Revuen (9 May-22 September 1897);
Hakon Blodøxes Bedrifter: Roman fra Vikingetiden, as Lykke, Revuen (26 September 1897 – 9 March 1898);
Ridder Tages Dødsridt eller de blodige Sporer: Original historisk Roman fra Valdemar den Stores Tid, as Lykke, Revuen (13 March-20 July 1898).
By revolting against the introspection of Danish turn-of-the-century literature and the psychological and social naturalism of the nineteenth century, Johannes V. Jensen became one of the most prolific, innovative spirits in Danish cultural life. His worship of modern science and technology, the bustling life in the international metropolis, and pragmatic materialism and capitalism made him instrumental in the reorientation of Danish literature away from continental French and German models toward Anglo-American cultural life. His enthusiasm for American literature resulted in the introduction and promotion in Denmark of the works of many American writers.
Charles Darwin’s theories were an important source of inspiration for Jensen’s depictions of nature scenes and animals but had a disastrous impact on his questionable evolutionary and anthropological ideas. Despite his scientific and anti-idealistic orientation, Jensen was never able to let go of his deep-rooted dependence on the Jutland peasant traditions of his childhood and an equally deep-rooted fascination with Christian metaphysics. This contradiction or split resulted in an existential insecurity that manifests itself in a fragmentation both with regard to content and form, which gives Jensen’s fictional work continuing relevance. It places him in the modernist tradition of the twentieth century, adding to some of his texts a vibrant and eclectic, almost postmodern, quality. However, when Jensen was able to bridge this split in his myths and poetry, he created pieces of timeless art in which observation and vision, present time and eternity, reality and dream are seamlessly merged.
Johannes Vilhelm Jensen was born on 20 January 1873 in Farsø in the northern Jutland province of Himmerland. His parents were both of peasant stock. His mother, Marie Kirstine Jensen, had a prosaic and practical view of life, but she also possessed a vivid imagination and a hot temper; his father, Hans Jensen, was the district veterinarian and was an expert in botany and zoology. He inspired Jensen’s later studies of nature and discovery of Darwin’s evolutionary theories. Although the family was strongly antireligious, in the late 1880s Jensen’s father became interested in spiritualism, an interest that became lifelong for Jensen’s sister, the writer Thit Jensen. Johannes V. Jensen later criticized this occult interest, although it was undoubtedly one of the causative factors for the longing for eternity and spiritual expansion that became an essential feature in his writing. Jensen also had one younger brother, who became a painter under the name Hans Deuvs.
In the autobiographical sketch Kirken i Farsø: Skitse (1903, The Church in Farsø: Sketch), Jensen described his boyhood with his siblings and friends from the small town and the neighborhood farms, emphasizing his boldness and extroversion. A somewhat different description comes from one of Jensen’s friends, Peder Bach, quoted in Oluf Friis’s 1974 biography of Jensen: “Han var en besynderlig Dreng, ikke som de andre, men for det meste tavs og indesluttet, og han gik gerne og saa ned i jorden i sine egne dybe Tanker; men til Tider kunde han vaagne op, og da husker jeg at hans Snebolde blev temmelig haarde” (He was a strange boy, not like the others, but mostly silent and reticent, and he usually walked around looking down deep in his thoughts; but at times he could wake up, and then I remember that his snowballs became rather hard). A characteristic trait was his joy in reading. Jensen himself, in Mytens ring: Efterladte myter og beskrivelser (1957, The Ring of the Myth: Posthumous Myths and Descriptions), recalled his first books: children’s readers; accounts by Captain James Cook, Henry Morton Stanley, and David Livingstone of their expeditions; a history of Denmark; and Hans Christian Andersen’s tales–all works that he found on his father’s bookshelves.
After two years at Farsø School, followed by private tutoring, Jensen went in 1890 to Viborg Katedralskole (cathedral school) for three years, where he became familiar with the humanistic, bourgeois educational tradition that for hundreds of years had formed the basis for spiritual life in Denmark. Jensen’s years in Viborg, depicted in the first two chapters of his first novel, Danskere (1896, Danes), were not harmonious, and his opposition was nourished in particular by reading the German poet Heinrich Heine, whom Jensen called, in his monograph Rudyard Kipling (1912), “denne Dynamitsjæl” (this explosive spirit), and the British author Rudyard Kipling, in whose works Jensen experienced a new world of activity and international settings. Although classes in Viborg did not include modern literature, Jensen read works by contemporary writers in private. Both the neo-Romantic Danish poet Johannes Jørgensen and the Norwegian Knut Hamsun, whom Jensen regarded as the first to break with the older realism and naturalism–what Jensen called “den galliske knirkende Fornuftspoesi” (the Gallic, creaking literature of reason)–became additional models for the future writer.
Jensen passed his university entrance exams in 1893 and began the study of medicine at the University of Copenhagen in the fall. In January 1894 he published four poems, basically derivative of neo-Romantic literature, that include motifs foreshadowing his later settlement with the introverted and spiritual attitude that was prevalent in Danish literature at the time of his debut and remained part of Jensen’s own personality. Likewise without artistic quality are ten serial novels written under the pseudonym Ivar Lykke and published in the Copenhagen journal Revuen between 1895 and 1898. In Mytens ring, Jensen hints at the main ingredients of these serial novels, remarking that “Hvert Kapitel havde sit horrible Mord” (Each chapter contained a horrible murder). The novels hold no trace of his later mastery of style but are not without importance, as in them Jensen introduces motifs that he took up again later.
Jensen had his true literary debut in 1896 with the novel Danskere. After he abandoned his medical studies in 1898, he published Einar Elkær: Roman (1898, Einar Elkær: Novel). Between the publication of the two novels he took a brief trip to New York City in 1896, the first of many travels that, altogether, brought Jensen to the United States six times. Both the character Buris in the first novel and Einar Elkær in the second are students from the provinces who are confronted with the modern metropolis, Copenhagen. They are obsessed by paralyzing self-absorption that prevents them from establishing a spontaneous rapport with other people, in particular with women. Whereas Jensen hints at the possibility that the disintegration of Buris’s personality may stop, Einar constantly lapses into his ravings and dreams and dies at a mental hospital, where “Sektionen viste blød Hjærne” (the autopsy showed a soft brain). Even though the two novels are strongly dependent on literary models–their melancholy atmosphere, big-city sceneries, and self-reflective protagonists can be found in the early works of Jørgensen and Hamsun–their rebellion against both the fin-de-siècle spirit of the 1890s and literary and philosophical authority in general has a genuine ring. Jensen later excluded his first two books from lists of his works, perhaps because he recognized too much of his own introverted personality in his two protagonists. They are desperate outsiders, whose longing for happiness finds no fulfillment. This longing, which in so many of Jensen’s characters manifests itself as a longing to travel, is in reality a longing for the expansion of the soul.
Introspection remains a major issue in the two travelogues, Intermezzo: Dolores, Forsvundne Skove, Louison (1899, Intermezzo: Dolores, Lost Forests, Louison) and Skovene (1904, The Woods), both written under the influence of Heine’s capricious, ironic style. The first was based on Jensen’s two visits as a reporter for the liberal newspaper Politiken to Spain, Germany, and France in 1898 and also includes the first of his many attacks on the works of Friedrich Nietzsche, whose theories of the superman constituted the “bad Darwinism” that Jensen later saw as the indirect cause of the two world wars. The second book was inspired by Jensen’s five-week stay in Singapore and on the Malay Peninsula during his first trip around the world in 1902 and 1903, which also took him to China, Japan, and the United States. In his description of a tiger hunt Jensen incorporates lyrical and satirical passages, witticisms, and brilliantly executed, precise but poetic descriptions of animals and nature. The strong stylistic contrasts in the volume reflect the self-ironic and didactic approach Jensen takes toward his own glorification of the primitive, against which he sets his homesickness and longing.
The reworked travel letters from his visit to Spain in 1898 and another to the World’s Exhibition in Paris in 1900, which Jensen incorporated into his Den gotiske Renaissance (1901, The Gothic Renaissance), on the other hand, include an enthusiastic endorsement of progress and reality: “Det tyvende Aarhundrede suser over Hovedet. Jeg bekender mig til Virkeligheden, jeg bekender” (The twentieth century roars above our heads. I profess to reality, I profess). This work glorifies the expansive spirit of the Gothic, that is, Anglo-Saxon, race, the fullest expression of which Jensen found in the American pragmatic and progressive view of life as it brought liberation from the decadence of the previous century. The volume climaxes in Walt Whitman–inspired prose hymns to progress and technology. Jensen’s theory was that the Gothic race had its origin in his home region, Himmerland. Its nature and people are portrayed in the realistic short stories in Himmerlandsfolk: Historier (1898, Himmerland People: Stories), which constitute a counterbalance to Jensen’s introspective writings from the same period, and in the two collections Nye Himmerlandshistorier (1904, New Himmerland Stories) and Himmerlandshistorier: Tredie Samling (1910, Himmerland Stories: Third Collection). The early texts are marked by Jensen’s preoccupation with the meaninglessness of day-to-day existence and death. Later stories include masterful character studies of grotesque, roaming eccentrics, heroic accounts of man’s stubborn fight against either sordid surroundings or the callous forces of nature, and comic or satiric exposures of human folly. Jensen’s intimate knowledge of the flaws and meanness of his characters is balanced by a profound veneration for the old peasant traditions they also represent. In such texts Jensen has distanced himself from the uncritical glorification of technology and progress in Den gotiske Renaissance, and these stories are far above traditional regional literature.
In 1900 and 1901 Jensen published an historical novel in three parts: Foraarets Død (1900, Spring’s Death), Den store Sommer (1900, The Full Summer), and Vinteren (1901, Winter). Jensen combined the works into a single volume under the title of Kongens Fald (1901; translated as The Fall of the King, 1933). With Kongens Fald, Jensen successfully merged the extrovert/naturalistic and introvert/spiritual elements in his writing into a splendid mythic composition. In Jensen’s works there are many attempts at defining “the mythic.” In his 1932 article on Jensen, Aage Marcus reports Jensen as saying: “Leave out the plot, concentrate on those short flashes of the essence of things that illumine man and time, and you have the myth.” As Jensen writes in his Aarbog 1916 (Yearbook 1916), his point of departure is generally a concrete observation from which an expansion in time and space takes place, a technique that entails a revelation, “ingen lang møjsommelig Opregning men et Spring ind i et Billede” (rather a leap into an image than a long painstaking account). Crucial components of the myth are the tensions between the close and the distant, the tangible and the transcendental, and the concrete and the inexplicable, frequently establishing a perspective of time in which present, past, and future are bound together.
Kongens Fald can be read as a purely historical novel, attacking the passivity and indecisiveness that Jensen perceived as a major component of Danish mentality. These negative qualities are embodied in the Renaissance king Christian II and his companion, the mercenary Mikkel Thøgersen. However, Jensen ignores both historical accuracy and a structured psychological character delineation. Instead, by mingling dream-like passages of poetic beauty with harsh, naturalistic scenes of violence and destruction, he creates magnificent, deeply pessimistic visions of man’s inability to reach happiness. Like Buris in Danskere, the introverted outsider Mikkel completely lacks the ability to devote himself to enjoying the present: he can only act when his anguish turns into hatred and blind destructiveness. He rapes Ane Mette, the fiancée of his rival Otte Iversen, and many years later he kills Otte’s son, the carefree and spontaneous Axel, whose success with women stirs Mikkel’s feelings of alienation, envy, and hatred. He assaults the unarmed Axel, who later, without any bitterness, dies of blood poisoning, fever-stricken and dreaming that he is sailing into “den store Sommers Land, Dødens Land” (the land of full summer, the land of death). Axel’s “fall,” one of the lyrical highlights of the novel, corresponds on the historical level to the king’s “fall” during the fateful night in 1523 when Christian II, accompanied by Mikkel, irresolutely sails back and forth between two regions of Denmark wondering whether or not to take up the fight against the rebellious nobility–a dramatic highlight and at the same time a penetrating analysis of the paralysis of action by doubt. However, of crucial importance is not the outward fall as demonstrated in the king’s destiny. Jensen’s masterpiece must be read as a book about the inner fall, about man’s impermanence, and about death as his inexorable destiny in the midst of burgeoning life, illustrated in Axel’s fate.
Kongens Fald is a book about the total absurdity of life and love, illustrated in Mikkel’s person. All of humanity is subject to the law of the fall, and only death brings the desired peace.
On his first trip around the world Jensen crossed the Pacific Ocean from Japan, stopping in the Hawaiian Islands and disembarking in San Francisco on 26 October 1902. His stay in the United States, primarily in Chicago and New York during the winter and spring of 1902–1903, provided him with the scenery for the two novels Madame D’Ora (1904) and Hjulet (1905, The Wheel), with which Jensen–only in part successfully–intended to continue the antimetaphysical trend in his authorship. Intentionally he disregards the rules of the traditional, naturalistic novel as he sets out to portray not individuals but various stages in man’s evolution within the framework of a fierce Darwinian struggle for the “survival of the fittest.” In Madame D’Ora this struggle takes place between “the missing link”–the cynical lay preacher, charlatan, and murderer Evanston–and the scientist Edmund Hall, a Faustian character fascinated with the transcendental, a neurasthenic dreamer who not only fails to reciprocate the passionate and unconditional feelings of the opera singer Leontine D’Ora but also falls an easy prey to Evanston because of his preoccupation with spiritual experiences.
In the sequel, Hjulet, the young poet Lee, Jensen’s alter ego, kills Evanston, who has now changed his name to the symbolic Cancer and has become an even clearer example of the Nietzschean vulgarization of Darwinism that Jensen had earlier attacked in Intermezzo. Through his struggle Lee overcomes his earlier passivity and turns into a man of action who condemns all aesthetics as nothing but “en Sygdom i Sansen for Virkeligheden” (a disease in the perception of reality). Passages parodying the detective novels of Sir Arthur Conan Doyle alternate with lengthy monologues (in particular by the constantly talking Cancer), congenial translations of poems by Whitman, and Lee’s sweeping visions, ranging from prehistoric, evolutionary stages in man’s development to his view of Christopher Columbus as a man of Nordic descent and a bridge builder between Europe’s past and America’s present. These visions–indeed prose poems of compelling poetic force–as well as the totally negative portrayal of the representative of pure materialism, Evanston/Cancer, prove that Jensen could not let go of the aesthetics that he lets Lee reject. The enthusiasm for the United States expressed in these two novels foreshadows later works of fiction and nonfiction, as Jensen returned to both theories of Columbus and Faustian motifs.
After Jensen returned to Copenhagen from New York City in the summer of 1903, he published a series of newspaper articles. In these articles his violent criticism of Danish superficiality and provincialism demonstrates how difficult it was for him to resign himself to staying home. He was particularly irritated by Danish literary life, partly because of his own aggressive nature, which isolated him among colleagues, and partly because he rejected both the neo-Romantics and the radical circle around the influential critic Georg Brandes, whom he had earlier admired. He was also unhappy because his books received mostly negative reviews and sold poorly.
Nevertheless, the period from 1904 to 1906 was characterized by a hectic productivity: two novels, a travelogue, a new volume of stories from Himmerland, and finally an epoch-making collection of poetry. All these publications were projects that Jensen felt he wanted to finish and move on. Digte (1906, Poems; revised and enlarged, 1917, 1921, 1943, 1948) includes almost all of his youthful poetry, except for those poems that he had published during his first year at the university, which have never been collected. Two of Jensen’s earliest prose poems from 1901, “Interferens” (Interference) and “Ved Bordet” (At the Table), published in a newspaper, were revised and incorporated in Digte. Prose poetry came to dominate this volume, especially with the inclusion of the Whitman translations, first published in the novel Hjulet. This preponderance of free verse has contributed to the popular but erroneous view that Jensen’s poetry consists mostly of prose poetry, with Whitman as the predominant model. The truth is rather that Jensen’s prose poems, modeled after the free verse of Johann Wolfgang von Goethe and Heine, belong to the period of 1901 to 1906, after which he increasingly devoted himself to poems in regular meters with either Old Norse alliterative verse or the classical rhymed stanza as models.
Digte is a milestone in the development of Danish lyric poetry. Its highly developed, bold imagery is filled with contrasts and tension both in content and in style, a style that veers from cynical statement to ecstatic exclamation to heart-rending simplicity and tenderness–the heritage of the 1890s. This metaphoric language is based on a sharp sensory perception that often takes the shape of a merciless self-analysis at the same time as it incorporates images from modern technology and everyday life.
At the center of the volume are three texts, the two prose poems “Interferens” and “Paa Memphis Station” (At Memphis Station), and the ballad-like “Christofer Columbus.” In “Interferens,” Jensen seeks, as he does in several of the poems, to reconcile the clash between extroversion and introversion, optimism and pessimism, belief in progress and wish for death–the two poles in his writing. When they intersect or rather become fused in one single state of mind, the myth emerges, as in “Christofer Columbus,” originally published in Madame D’Ora, where it was a warning to Hall to abandon his insatiable ambitions to transgress the boundaries of empirical science. It can, however, also be read as an anticipation of the novel Christofer Columbus (1921; translated as Christopher Columbus, 1924), as a portrait of Jensen’s tragic alter ego, Columbus, who is invoked in the poem “Afsked” (Departure). Here, another crucial theme from Jensen’s novels, the longing to travel, is introduced in an attack on humdrum everyday life in provincial Denmark. The poem was written in 1902 just before Jensen’s voyage around the world. Inspired by the voyage itself is “Paa Memphis Station,” a commitment to a reality that must encompass even the experience of death. This realization ignites the poet’s zest for a life that must be conquered through travel. Thus Columbus must move on, but the outcome of his travel turns out to be tragic, as Jensen points out in the succeeding poem, “Hverdagene” (Everyday Life). Its concluding request to “gaa frygtløst ind i Hverdagene” (enter into everyday life fearlessly) is preceded by lines about the merciless passage of time and unavoidable death:
Somren slaar sine Kister i.
Unge er vi saa ikke mere.
Men har Haabet ikke beskæmmet os tilstrækkeligt?
Nu kommer vor rige Dødstid, Broder.
(Summer is closing its chests.
So we are young no longer.
But has hope not shamed us sufficiently?
Now comes a plentiful dying time, brother.)
Such lines point ahead to perhaps the most difficult of all lyrical genres, the memorial poem, a genre that Jensen mastered to a degree not reached by any other Danish writer. In the second edition of Jensen’s collected poems, Digte: Anden Udgave (1917, Poems: Second Edition), there are two such poems, “Leonora Christine” and “Bjørnstjerne Bjørnson,” written in traditional iambic meter. The first is a portrait of Leonora Christine, a Danish Renaissance princess who spent twenty-two years in prison and whose tragic destiny Jensen perceives in a mythic perspective. The second poem is a glorification of the active and extroverted Norwegian writer, written on the occasion of Bjørnson’s death in 1910. The poem is pervaded with pantheism; however, Jensen concludes with lines negating that pantheism: “O Solopgang paa Bjergets Sne – / ham skal du aldrig mere se” (Oh sunrise on the mountain’s snow– / you shall never see him again). The last poem in the volume is “Envoi,” which Jensen kept revising until it received its final form in the third edition of his collected poems. This poem is noteworthy for its haiku-like simplicity, with a perfect form embracing time and space, a myth concentrated around the eternity of love placed in the cycle of nature.
Jensen’s productivity after his return from his journey around the world in 1902 and 1903 was also caused by the need to make a living for his family. On 15 April 1904 he married twenty-six-year-old Else Marie Ulrik, with whom he had three sons, Jens, Villum, and Emmerick. On 2 July 1906 Jensen began his own newspaper, Pressen (The Press), with John Martin. Modeled on contemporary American tabloids, Pressen was filled with sensational news, cartoons, and many advertisements. As Jensen did not have any political or cultural program for the newspaper, he did not find any readers, and Pressen lasted only until 31 July. This project was preceded by several trips to Himmerland and Berlin, and at the same time Jensen began translating Frank Norris’s novel The Octopus (1901), eventually published as Af Hvedens Saga: Polypen, en Bog om Kalifornien (1907, From the Wheat’s Saga: The Octopus, a Book about California). By writing the foreword to Naar Naturen kalder, a translation in 1907 of Jack London’s The Call of the Wild (1903), Jensen called attention in Denmark to another American writer; he introduced a third American writer, Ernest Hemingway, to the Danish public with an essay in Politiken (30 May 1930).
As a result of a 1905 trip to New York City, Jensen began writing a series of articles in March 1906 for Politiken about journalism, literature, and social issues in the United States. These essays were republished in the collection Den ny Verden: Til international Belysning af nordisk Bondekultur (1907, The New World: For an International Illustration of Nordic Peasant Culture). The heroes of the book are the American reporter Norris, who successfully portrayed the hectic pioneer spirit during the growth of the United States, and Theodore Roosevelt, who is seen as the epitome of American civilization because of his dynamic and extroverted nature. From the fall of 1906 to the summer of 1907 Jensen was preoccupied with polemical exchanges with various Danish writers and critics. For this reason the collection of prose Myter og Jagter (Myths and Hunts), from 1907, includes primarily texts that had previously been published. Singapore Noveller (Singapore Stories), also from 1907, was likewise based on older material. Together with the texts in Lille Ahasverus (1909, Little Ahasuerus) and Olivia Marianne (1915), the stories in Singapore Noveller were collected in Eksotiske Noveller (1916, Exotic Stories). They form an exotic counterpart to the stories about Himmerland. The United States provides the setting for some of the stories, but most of them are set in Java and China, inspired by Jensen’s Far East trip in the summer and fall of 1902, and owe their quality mainly to the exquisitely drawn scenery and linguistic virtuosity. In their somewhat simplistic view of the life of Europeans among the natives, the stories are an example of the strong influence Kipling had on Jensen’s early writings.
In the summer of 1907 Jensen, tired of literary disputes–although he had provoked them himself– made a trip to Norway, and in the following winter he went to Sweden. These visits resulted in several realistic hunting descriptions for Politiken, later included in the volume Nye Myter (1908, New Myths), which also includes one of Jensen’s best prose texts, “Darwin og Fuglen” (Darwin and the Bird). It opens with a magnificent spring scene, then is extended into a portrait of Darwin, the man whose evolutionary theories influenced Jensen’s writing for the remaining forty-three years of his life. With this volume and Myter og Jagter, Jensen had begun to create a series of brief, somewhat uneven prose texts, so-called myter or myths, which usually were first printed in newspapers; from 1910 to 1944 nine additional volumes were published. In essays and sketches based on reading and traveling, Jensen incorporates
Øjebliksbilleder fra Gaden, indre dæmrende Erindringer omspændende alle fem Verdensdele, Historien, Urtiden og en fjern Barndom... mellem hinanden, men sandt til Hobe forsaa vidt som det hænger sammen og har Tone og Farve.
(snapshots from the street, inner dawning memories encompassing all five continents, history, the earliest times and a distant childhood... intermingled, but every bit of it true in so far as it has coherence and resonance and color.)
Jensen presents his basic ideas in a symbolic, concentrated form: a full acceptance of present reality as the source and final goal of all longing as in “Fusijama” (1907; translated as “Fujiyama,” 1958) and a belief in eternity as it is found in the cyclic reappearance of the seasons as in “Nordisk Foraar” (1912, Nordic Spring). Jensen’s myths are based on his belief in the necessity of placing oneself in a meaningful context with nature as in “Haren” (1908, The Hare) and creating links to the most distant memories from history and prehistory as in “Dansk Natur” (1910, Danish Nature). This myth gives a superb description of Denmark seen in the light of the country’s past, incarnated in ancient monuments and in visions of the life of Stone Age people. It is actually one of several myths exemplifying the impossibility of seizing and preserving the present moment in isolation and thus–characteristic of the split in Jensen between materialism and spirituality–partially contradicting the “Fusijama” myth. Directly dealing with man’s quest for the indefinite and eternal are the myths “Moderen og Barnet” (1917, Mother and Child) and “Den store Kristoffer” (1917, The Great Christopher). “Moderen og Barnet,” which in his Æstetik og Udvikling: Efterskrift til Den lange Rejse (1923, Aesthetics and Evolution: Postscript to The Long Journey) Jensen called “nok det gyldigste jeg har gjort” (probably the most valid thing I have ever written), is based on the Roman Catholic concept of the Madonna and child, “Livets skønneste Symbol, Slægten i et Afbillede, som var Slægten selv, det højeste Under, og samtidigt den højeste Moral” (life’s most beautiful symbol, the family in a single image that is the family itself, the highest miracle and at the same time the highest ethics). In his retelling of the legend of St. Christopher, Jensen makes a Northerner of him.
His staff, which changed into a palm, is interpreted mythically as the Northerner’s longing for the South, which is finally satisfied when the South comes to him in the person of the infant Jesus, whom Christopher carries across the river and into the North, an achievement that is duplicated when his namesake, Columbus, brings Christianity to the New World.
These two myths, as well as several others, were reworked and incorporated into the six books that became Den lange Rejse (1938; translated as The Long Journey, 3 volumes, 1922-1924). This multivolume novel comprises an evolutionary history or rather a fantasy of mankind. It was intended as a scientific counterpart to the biblical legends but turned out to be a collection of legends itself. Jensen sees the challenge of nature as the driving force of progress that brings about evolution. Bræen: Myter om Istiden og det første Menneske (1908, The Glacier: Myths of the Ice Age and of the First Man; translated in Fire and Ice, 1922) is the first written in the series and also its most popular. Here the Glacial Age has forced the humans to migrate southward; only Dreng (Boy) turns in defiance to the north and founds, together with the woman Moa (Mother), a large family. In a memorable scene Dreng rediscovers fire by striking sparks from flint, while Moa collects seeds and begins to till the soil.
Det tabte Land: Mennesket før Istiden (1919, The Lost Land: Man Before the Ice Age; translated in Fire and Ice) is a Darwinian myth of creation about the transition from animal to Homo sapiens in the preglacial rain forests of Jutland. The major character of the volume is Fyr (Fire), who climbs a volcano and, like Prometheus, steals the fire. Up on the mountain he sees the ocean in the distance, and the feeling of longing is stirred in him for the first time.
In Norne-Gæst (1919; translated in The Cimbrians, 1923), Jensen follows the lives of the Northerners from the Glacial Age to the Great Migration. The opening lines describe the newborn title character’s first glimpse of the blue sky between the leaves. This vision is to become crucial for his insatiable longing to travel that will drive Norne-Gæst around the world, encountering, for instance, Greek and Roman civilization, and up through the Bronze and Iron Ages. Cimbrernes Tog (1922, The Raids of the Cimbri; translated in The Cimbrians), the last volume of the series to be published, is also mainly set abroad. It opens with Norne-Gæst wandering up through Jutland in order to attend the spring festivals in Himmerland, home of the Cimbrians. Suddenly climatic deterioration sets in with floods and famine, and the Cimbrians set off under the command of Bojerik, a name modeled after a Boiorix mentioned in one of Jensen’s sources, Plutarch’s Parallel Lives. The narrative then follows the everyday life of the Cimbrians as they raid down through Europe, with intervening mythic scenes as observed and commented upon by the omnipresent Norne-Gæst. Other tribes join the Cimbrians, and together they win their first battles against the Romans. Here the Northerner and the Southerner, separated in Bræen, meet again. Eventually, the Cimbrians, having acquired the decadence of the foreign lands, are defeated in a bloody battle; yet, the volume concludes on a conciliatory note in which the spirits of antiquity and of the North merge harmoniously.
In Skibet (1912, The Ship) the Nordic longing to travel is embodied in the Vikings and their raids to the Mediterranean. In the North, Christianity is introduced through a monk, Brother Parvus. Jensen’s description of Parvus’s works of charity belongs among his best passages, forming an essential correlation to his frequently stressed anti-Christian attitude. The first church is erected by turning the Viking ship upside down. In the deepest sense the forest itself becomes a cathedral, while the longing to go abroad takes on a religious dimension.
Longing as the basic trait of the Nordic people becomes personified by the title character of the 1921 novel Christofer Columbus–whom Jensen had previously described as a reddish-blond Northerner in Hjulet– and his voyages of discovery. Jensen’s view of the defiant and struggling individual as the creator of culture, fundamental to Den lange Rejse, is paralleled by his concept of the basic trait of the Nordic people–“the Gothic race”–as being the dream about warmth and sun. This dream, which is Jensen’s explanation of the religious sentiment, is expressed through a longing for distant places, in the final account a longing for paradise that becomes embodied in the structures of the ship and the upward-reaching Gothic cathedral. The Viking migrations, the “raids of the Cimbri,” were a result of this longing, as was the voyage of the Goth, Columbus. His attempt at finding legendary lands resulted, however, in the discovery of America, of reality; and yet, at the conclusion of his life, Columbus realizes that his new discovery has brought him nothing but homelessness and loneliness. Now the initial optimism turns into tragic resignation, as Columbus subsequently chooses not a metaphysical solution but a return to his memories of the past. He does not realize that he must let his journey continue toward the eternal as it is depicted in the myth, “Ave Stella,” that concludes the volume.
Den lange Rejse is not a novel with a traditional plot centered around the adventures of a hero, nor should the work be read as a scientifically correct depiction of various cultural stages. The fact that so many of its theories are contrary to modern history, anthropology, and archeology is irrelevant and cannot weaken the work as literature. Rather, Den lange Rejse is a vision written by an artist with a formidable ability to identify with other periods and conditions. The outcome proves the impossibility of creating a meaningful coherence based on evolution alone, and Jensen’s project defies any organizing structure; however, the six volumes form a grandiose and gripping artistic work that is outstanding as a depiction of the ages of history and of mankind.
Jensen gradually turned away from the writing of fiction in favor of a growing involvement in current cultural and scientific issues; this reorientation was accompanied by a focusing on the feature article and the essay with the purpose of popularizing the theories of evolution. In Jensen’s collection of essays Nordisk Aand: Kroni-ker og Karakteristiker (1911, Nordic Spirit: Chronicles and Characteristics), the American society that he had glorified in Den ny Verden was analyzed further and seen as an implementation of the program in Den gotiske Renaissance, indeed as the true expression of the Nordic character; the Scandinavian prototype of this character is Bjørnson, who is portrayed with several other Danish and Norwegian writers. The volume concludes with a fierce attack on contemporary Danish literary critics for not appreciating Jensen’s work.
In April 1911 Jensen traveled with his wife to Paris and London via Berlin and Cologne. His fascination with the cathedral in Cologne found powerful expression in his next essay collection; and in a 10 May 1911 travel letter to Politiken from Normandy, where he looked for traces of the ancient Nordic population, can be seen the first impulses for the novel Skibet, which is set during the Viking Age. In 1912 Jensen also published a monograph, Rudyard Kipling, written in connection with several translations of various Kipling stories that he and Aslaug Mikkelsen had begun in 1911. Although Jensen was strongly influenced by the English writer during the composition of Singapore Noveller and the two additional volumes of short stories from 1909 and 1915, he is strongly critical of Kipling’s imperialistic attitude and deprecation of women.
In the fall of 1912 Jensen began his second great journey to Asia, from which he returned the following year. His reflections en route were later included in the philosophical travelogue Introduktion til vor Tidsalder (1915, Introduction to Our Epoch). “I det indiske Hav” (In the Indian Ocean) displays pure poetry in some passages, where Jensen describes the voyage until the vessel sails into the Ceylonese port of Colombo in December. Singapore is revisited, and the homesickness that Jensen had experienced a decade earlier overtook him again. In China he found the theme for a lyrical short story, “Darduses Myndlinge” (Darduse’s Wards), which was later adapted for the stage as the comedy Darduse, Bryllupet i Peking: Eventyrkomedie i fire akter (1937,
Darduse, Wedding in Peking: Fairy Tale Comedy in Four Acts). From Manchuria, Jensen continued on his trip, describing in the chapter “fra Østen til Evropa” (from the East toward Europe) his experiences traveling on the Trans-Siberian Railroad, seeing again with joy the European peasant culture, and describing jubilantly his return to Scandinavia. In its analysis of the Northerner’s longing, the book forms a striking link between Den gotiske Renaissance and Christofer Columbus. At the same time, Jensen’s attempt to give religious feelings a purely physiological explanation is one more bit of evidence that the question of immortality kept troubling him.
Jensen planned to make still another journey to Asia, by way of the United States, the following summer, but did not complete his trip. In March 1914 he embarked for New York City, but the joy of rediscovery was moderate. Even though Jensen still admired the American press and the hectic and progressive atmosphere in the country, he sharply attacked what he saw as the childishness and bigotry of the Americans. In mid April, Jensen decided to return to Copenhagen, and a few months later World War I broke out. Since Denmark remained neutral, Jensen could leave for Berlin in August in order to negotiate with his German publisher, Samuel Fischer. During his visit he became strongly critical of the bellicosity that was shared by all of the German political parties, and he bitterly regretted the clash between Germany and Britain, since both nations had the same Gothic origin.
During the war Jensen was mainly occupied with writing the last volumes of Den lange Rejse. He also prepared a greatly enlarged edition of his collected poems, Digte: Tredie Udgave (1921, Poems: Third Edition). It includes–in addition to several memorial poems and other portraits–nine poems from Den lange Rejse. Of these, the alliterative “Drengs Gravsang” (Dreng’s Elegy) from Bræen binds up the experience of love and boundless longing in a mythic vision. “Vor Frue” (Our Lady), structured on the prosody of the medieval hymn “Dies irae,” and the ballad-like “Den sørgeligste Vise” (The Saddest Song) from Christofer Columbus treat other recurrent themes in Jensen’s writing: the first glorifies woman as a saving force and culminates in an apotheosis of the Madonna, while the second is one of the most overwhelming interpretations in Danish poetry of the futility of life. Disillusion and resignation also characterize the majority of remaining poems in the volume, many of which are in alliterative form.
Negative reviews of Cimbrernes Tog and the Den lange Rejse project as a whole by a critic for Politiken led to another of Jensen’s many literary feuds and culminated in a break with the newspaper that lasted until 1926. Shortly after the break, in December 1922, Jensen began the publication of his own periodical, Forum: Tidsskrift for Litteratur, Biologi og Samfundsspørgsmål (Forum: Periodical of Literature, Biology, and Social Issues), which survived for only one and a half years. In the first issue of Forum he published his most succinct and best-written settlement with the misuse of Darwinism, “Den daarlige Darwinisme” (The Bad Darwinism). In addition, Forum included a few insignificant poems about birds that, together with other animal poems illustrated by Jensen’s close friend the painter Johannes Larsen, were collected in the volume Aarstiderne (1923, The Seasons). Jensen’s interest in this painter and in the visual arts in general–he was himself a painter and was also an accomplished sculptor–resulted in several art books, two of which deal with the renowned Danish sculptor Bertel Thorvaldsen. Related to these works is the collection of newspaper articles about art, Form og Sjæl: Portræter af Personligheder (1931, Form and Soul: Portraits of Personalities), one more expression of Jensen’s love for classicism and of sculpture.
Essential for an understanding of Jensen’s fiction is his 1923 work Æstetik og Udvikling. The book is a sort of afterword to Den lange Rejse, in which Jensen also launches his “gradus” theory, a conception designating the gliding steps of evolution that he wanted to stress rather than the static species–further developed in later volumes, such as Aandens Stadier (1928, Stages of the Mind). In Æstetik og Udvikling, Jensen writes: “Den lange Rejse handler ikke eksklusivt om en Race, den handler om Udviklingstrin. Den ene Race er Udviklingstrinet af den anden” (Den lange Rejse does not deal exclusively with a race, it deals with stages of evolution. One race is the evolutionary stage of another). At the same time Jensen sharply attacks novels that focus exclusively on individuals as being pre-Darwinian and thus hero-worshiping and outdated. The concept of “bad Darwinism” is discussed again in the obscure and insignificant collection of previously published articles about Darwinism, Evolution og Moral (1925, Evolution and Ethics).
At the end of 1925, Jensen once again set out on a long journey; this time the goal was the Egyptian health resort of Helwan. From Egypt and later from Palestine, he sent several travel letters home to the newspaper Social-Demokraten, revised versions of which were included in Aandens Stadier.
Jensen continued to write memorial poems, and his perfection of this genre can be seen in the undervalued collection Verdens Lys (1926, The Light of the World). Verdens Lys includes six alliterative poems from Cimbrernes Tog that–in a mythic perspective–juxtapose and celebrate the union between North and South. Distinct among Jensen’s works in its focus on artistic expression is “Graven i Sne” (The Snow-Covered Grave), a memorial poem to the Danish Romantic poet Adam Oehlenschläger. It is both an idealized portrait of Oehlenschläger, Jensen’s beloved model as both a harmonious artist and human being–“To Gange gav han Livet Form, / i Livet selv, i ædel Norm” (Twice he gave to life a form, / in life itself, in noble norm)–and a poem about the eternal value of art in spite of the inevitability of death. It becomes a glorification of Jensen’s own poetic art, as it is expressed in a structure that combines observation, vision, and reflection into a perfect artistic entity that may turn out to be the only way to overcome the absurdity of life. The poems in memory of Jensen’s father offer an affectionate portrait of a man who lived in close intimacy with the miracles of life in nature, a closeness to nature inherited by the poet although not without discord. A much more somber tone is heard in the obituary for Jensen’s mother, the concluding poem of the volume, “Ved min Moders Død” (At My Mother’s Death), as well as in the hymn-like “Kirken i Hardanger” (The Church in Hardanger), in which the reader can perceive, behind the stoic resignation that results from placing oneself in the hands of the cycle of nature, the author’s fear of annihilation, which is so powerful that it threatens to break up the poetic form.
In 1927 Dyrenes Forvandling: Til Udviklingens Plastik (The Transformation of Animals: A Contribution to the Plasticity of Evolution) was published; it was another presentation of evolutionary theories but without the fierce attacks on Christianity that characterize some of his earlier writings. Jensen attempts–and the task appears scientifically absurd–to describe the animal soul and ethics as they change through all stages of evolution. Trips to Madeira and Rome followed in 1928, and impressions from both trips were likewise included in the anthropological study of human origin and development, Aandens Stadier.
In May 1929 Jensen received an honorary doctorate from the University of Lund. At the same time Jensen began to focus again on Nordic issues and started his translation of Egil’s Saga for a planned edition of De islandske Sagaer (The Icelandic Sagas), which was published in three volumes from 1930 to 1932. At the same time he returned to his preoccupation with Nordic archaeology in a series of articles subsequently published in the volume Paa danske Veje (1931, On Danish Roads) with his own photos and drawings by Larsen. With this work Jensen got involved in a fierce public debate about the preservation of the ancient burial mounds and stirred up so much political attention that stringent conservation laws were put into effect in 1937. His introduction to De islandske Sagaer, in which he discusses the sagas as products of a genuine Nordic mentality untouched by the traditions of antiquity and Christianity, points ahead to his book about women in the Viking Age, Kvinden i Sagatiden (1942, The Woman in the Saga Period), which also includes retellings of selected saga texts. For a new edition of Snorre Sturlason’s Heimskringla, published in 1948, Jensen took upon himself the difficult task of translating all 539 stanzas.
The transcendental aspect in his earlier poetry is not found in Den jydske Blæst: Digte 1926–1930 (1931, The Jutland Wind: Poems 1926-1930), Jensen’s last important poetry collection. Whereas death in, for instance, the memorial poem “Knut Hamsun” is still accepted as a pantheistic amalgamation into nature, most of the texts are structured on the tragic contrast between active life and the corruption of death, as in the memorial poem “Otto Benzon,” and now nature brings no consolation: “I Kammerdøren peb Vindens Røst, / en ensom Jammer, ingen Trøst” (The voice of the wind whistled in the chamber door / a lonesome lamentation, no comfort). When the memorial poem over others becomes a poem about Jensen himself, as in the concluding title poem, one finds the same death motif, the portrayal of death as man’s tragic but only certain verity. Jensen’s writing has this motif in common with Baroque literature, albeit without Heaven as the final destination.
On his sixtieth birthday, 20 January 1933, Jensen had reached such an esteemed position that a torchlight procession was held in his honor in Copenhagen and a festschrift published, Unge Digteres Hyldest til Johannes V. Jensen (Young Writers’ Homage to Johannes V. Jensen). In response to virulent public criticism of his lack of political commitment, Jensen, in an article published in Politiken on his birthday, emphasized–as he had done throughout his career–his independence as a freelance writer outside the political parties. Nevertheless, in a review of Hartvig Frisch’s Pest over Europa (1933, Plague over Europe) in the same newspaper on 3 December, he once again attacked Nietzsche’s philosophy and its consequences in the Germany of the early 1930s. Here Jensen strongly dissociated himself from contemporary political developments in Europe toward dictatorship, and, in the 1938 article “Hagekorset” (The Swastika), he publicly expressed his disgust with anti-Semitism.
In 1930 Jensen published the collection of essays and articles Retninger i Tiden: Artikler 1925-30 (Trends of the Times: Articles 1925-30); however, during the first part of the 1930s Jensen turned away from the essay form, with the exception of his short history of ideas, Det Blivende: Tankens Revolutionering i det 19de Aarhundrede og Tilbagefaldet i det 20de (1934, The Permanent: The Revolution of Thought in the 19th Century and the Backslide in the 20th), written in a more concise and concentrated style than his other philosophical works. With the underrated novel Dr. Renaults Fristelser (1935, The Temptations of Dr. Renault) he again took up fiction, reworking the Faust motif into a plot that, in contrast to the version presented by Goethe, lets the title character win over Mephistopheles because he is ready to fully accept the present. As in Hjulet, aestheticism is regarded as a barrier between man and reality, and in a significant scene Dr. Renault throws a valuable statue of Aphrodite into the sea, so that nothing will stand between him and life.
In October 1936 Jensen went on a short trip to the United States in order to collect material for a sequel to Dr. Renaults Fristelser, in spite of the negative reception that the book had received. The continuation never materialized. Instead, Jensen published the novel Gudrun (1936), a realization of a much older project: a contemporary novel of the Copenhagen woman, and thus also a novel about the city of Copenhagen, but completely different from Jensen’s first two novels. The city is no longer seen through the eyes of a student from the provinces. Now a citizen of Copenhagen for many years–and a matured artist–Jensen delivers a deeply intimate tribute to this city as a swarming, animated organism. Most of Jensen’s poems written in the 1930s were collected in 1937 as Paaskebadet: Digte 1931-1937 (The Easter Bath: Poems 1931-1937).
That same year, Jensen’s play adaptation of Darduse, Bryllupet i Peking, a “Fairy Tale Comedy in Four Acts,” had its premiere on 22 January at the Royal Theater in Copenhagen and was performed sixteen times. It was a weak play, carried by excellent acting and Knudåge Riisager’s rousing music. Jensen’s relationship with the theater was, on the whole, marked by a lack of success. During his stay in Chicago in the winter of 1902–1903 he had his own dramatization of some of his Himmerland stories performed by Danish-American amateurs with the title Trods (2002, Defiance). The one and only performance was a resounding fiasco. A dramatization of Madame D’Ora, titled Sangerinden (The Singer) and published as Sangerinden (Madame d’Ora): Drama i fem Akter (1921, The Singer [Madame d’Ora]: Drama in Five Acts), was performed unsuccessfully in 1923, premiering at the Odense Teater on 16 November; and when, on 24 April 1937, Jensen’s translation of William Shakespeare’s Hamlet was performed at the Royal Theater, it provoked such fierce criticism that the production had to be canceled.
Shortly before the outbreak of World War II, Jensen managed to visit Norway, described in the travelogue Nordvejen: Indtryk af norsk Natur (1939, The Way North: Impressions of Norwegian Nature), and then the United States; in March 1939 he left Denmark for his sixth journey to the New World. His travel letters were first printed in Politiken and subsequently as a book, titled Fra Fristaterne: Rejsebreve, med et Tilbageblik (1939, From the Free States: Travel Letters with a Retrospect). This time Jensen’s encounter with the United States was marked by ambivalence. In the chapter “Fra Stillehavet til Atlanten” (From the Pacific Ocean to the Atlantic), which is among the best sections of the book, the impressions from the trip are summed up. Although his overall impressions were still positive, Jensen was disappointed by what he saw as the increasing vulgarity and materialism of American society, where technology had become an end in itself, not an expression of man’s inventiveness and ingenuity. After having arrived on the East Coast, Jensen became ill and had to return home earlier than planned. Back in Copenhagen in May 1939 he added a concluding chapter on Thomas Jefferson, meant as a counterweight to the current antidemocratic trends in Europe.
Jensen had planned a tour to France for September 1939 in order to visit the regions where traces of prehistoric man had been discovered, but this plan was thwarted by the outbreak of World War II and had to be postponed until spring 1948. When the Germans occupied Denmark in April 1940, Jensen, apparently fearing arrest, took the precaution of burning his diaries from the previous thirty years, together with all personal letters written to him. In this way much data related to the writing of his books and–most important–notes from his many travels were lost for posterity. Nevertheless, he continued to write throughout the occupation from 1940 to 1945. In 1941 came Mindets Tavle: Portrœter og Personligheder (Plaque of Commemoration: Portraits and Personalities) with portraits of Nordic and British explorers, scientists, writers–and Darwin once again–and in 1943 the collection of some unimportant ethnographic articles, Folkeslagene i østen (The People of the Orient). Of greater quality is the study Vor Oprindelse (1941, Our Origin), describing humanity’s gradual acquisition of civilization. Again Jensen’s myth-creating fantasy bloomed in a visionary description of cultural progress. At the same time he offers a subtle analysis of his authorship, clearly drawing up a balance sheet and expressing his feeling that he was at the end of the road, a feeling that is also expressed in the essay “Tak til Sproget” (Thanks to Language), Jensen’s farewell to literature; this essay was included, together with some linguistic studies, in a small book, Om Sproget og Undervisningen (1942, About Language and Teaching).
After having been nominated several times for the Nobel Prize in Literature, Jensen finally received it on 9 November 1944. The Nobel Committee announced that the prize was awarded in recognition of “the remarkable force and richness of his poetic imagination, combined with a wide-ranging intellectualism and bold, innovative sense of style,” and Den lange Rejse appeared to be a decisive factor for the committee. In his speech of thanks at the City Hall of Stockholm on 10 December 1945, the writer once again paid tribute to Darwin and identified himself with the scientist: “To him [Darwin], evolution was not only the subject of a life’s study but the very essence of life, proof of the inexhaustible richness and wonder of nature, revealed each day and taken to heart.”
Since 1925 a large number of Danish, Norwegian, and Swedish intellectuals, literary scholars, and scientists had been proposing Jensen for the prize. Initially, the proponents pointed to his earlier works, in particular his myths, and his distinctive handling of style and language as well as his depictions of the Danish mind and Danish nature as qualifying him for the prize. Judged negatively, however, were Jensen’s anthropological essays for their lack of speculative content as well as their polemical tone.
The publication in 1938 of the two-volume edition of Den lange Rejse once again drew attention to Jensen’s major series of novels, and the positive attitude was strengthened further in 1943 with the publication of his poetry collection Digte: 1901-1941. Also, the German occupation of Denmark during World War II probably had an impact on the decision to award Jensen the prize in 1944.
In Denmark the Nobel Prize was welcomed by the entire press, which published large articles on Jensen on their front pages and had his friends and colleagues comment on the significance of the award as well as on Jensen’s authorship in general. The underground press also noticed the prize, even though Jensen was not an active member of the resistance. Because of the war, however, the event did not attract much attention abroad.
Jensen definitely appreciated having been awarded the Nobel Prize and expressed a certain amount of pride and self-esteem in his banquet speech in Stockholm; but he was not dependent on the money, and the award did not have any visible impact on his personal life or his writing. Neither did it contribute to increase his reputation abroad, which had reached its zenith in the interwar years, particularly in Germany and the Scandinavian countries; he never became a popular writer in the Anglo-American countries.
During the last years of Jensen’s life his productivity decreased significantly. After an operation in September 1948 he managed to finish a book about the great explorers, Afrika: Opdagelsesrejserne (1949, Africa: Journeys of Discovery), demonstrating that one of the favorite topics of his childhood reading was still inspiring him. Otherwise, Jensen was primarily occupied with the preparation of a combined, revised edition of the three books Dyrenes Forvandling, Aandens Stadier, and Vor Oprindelse, but he managed to complete only the first volume, published posthumously in 1951 in the volume Tilblivelsen (Genesis). Troubled by an ear disease, he also suffered from shingles during the summer of 1949 and was hospitalized. He still managed to write a few more articles; the most valuable of these, “Adam Oehlenschläger 1779-1850,” published in the book Swift og Oehlenschläger (1950, Swift and Oehlenschläger), is a finely drawn portrait of his artistic model as a man of simple nature, without stiltedness, who calmly accepted death as Jensen had described in the poem “Graven i Sne.” On 31 January 1950 Jensen commented on his second great model, Darwin, in a short article, “Træk fra vor Oprindelse” (Traits from Our Origin). These two personalities, the poet Oehlenschläger and the scientist Darwin, perfectly symbolize the two facets of Jensen’s authorship. In his later years natural science came to dominate, but the first element stands as the most valuable, the one that will survive.
Johannes V. Jensen died on 25 November 1950. Brandes once claimed that as a thinker and preacher Jensen could not be taken seriously. Brandes had a point: the content of Jensen’s many collections of essays and articles dealing with natural science, archaeology, and anthropology are often based on dubious scientific theories and deductions. In addition, in these volumes Jensen linguistically turns from lyrical expressiveness to a terse, matter-of-fact diction. But one must not fail to notice that in spite of the scientific topic, his stylistic mastery often breaks forth in evocative passages that can be read as sublime prose poetry. Jensen’s critics have also frequently overlooked that he was brilliant as a journalistic writer. He was unusually well-informed about current trends, and his knowledge of American society and literature was unique for a Dane of his time. Neither should his contributions as a translator be overlooked. Besides his accomplished translations from Old Norse, his outstanding rendering of Whitman’s poetry in Danish must be acknowledged.
Jensen reached perfection as a lyrical poet and a creator of myth, both in his collections of poetry and myths and when he succeeded in combining the two genres, as in Kongens Fald, Skovene, Den lange Rejse, and several of his stories of Himmerland. With his debut collection of poems, Jensen introduced modernism in Danish poetry, and he became the writer who, arguably, has had the strongest impact on twentieth-century Danish literature.
Bibliographies
Frits Johansen and Aage Marcus, Johannes V. Jensen: En Bibliografi, 2 volumes (Copenhagen: Gyldendal, 1933-1951);
Aage Jørgensen, Litteratur om Johannes V. Jensen En bibliografi (Odense: Odense University Press, 1998).
Biographies
K. K. Nicolaisen, Johannes V. Jensen: Bidrag til hans Biografi og Karakteristik (Aalborg: Viggo Madsens Boghandel, 1914);
Oluf Friis, Den unge Johannes V. Jensen 1873-1902, 2 volumes (Copenhagen: Gad, 1974);
Villum Jensen, Min fars hus: Erindringer om Johannes V. Jensen og hans miljø (Copenhagen: Gyldendal, 1976);
Leif Nedergaard, Johannes V. Jensen, third edition (Copenhagen: C. A. Reitzel, 1993);
Lars Handesten, Johannes V. Jensen: Liv og Værk (Copenhagen: Gyldendal, 2000).
References
Harry Andersen, Afhandlinger om Johannes V. Jensen (Rødovre: Rolv, 1982);
Andersen, Studier i Johannes V. Jensens Lyrik: Verseteknik, Sprog og Stil (Copenhagen: Levin & Munksgaard, 1936);
Jørgen Elbek, Johannes V. Jensen (Copenhagen: Gyldendal, 1966);
Otto Gelsted, Johannes V. Jensen: Kurven i hans Udvikling (Copenhagen: Arthur Jensen, 1938);
Alf Henriques, Johannes V. Jensen (Copenhagen: H. Hirschsprung, 1938);
Iben Holk, Jartegn: Et essay om Johannes V. Jensens myter (Copenhagen: C. A. Reitzel, 2000);
Poul Houe, Johannes V. Jensens lange rejse: En postmoderne myte (Copenhagen: Museum Tusculanum, 1996);
Niels Ingwersen, “America as Setting and Symbol in Johannes V. Jensen’s Early Works,” American Norvegica, 3 (1971): 272-293;
Bent Haugaard Jeppesen, Johannes V. Jensen og den hvide mands byrde: Eksotisme og imperialisme (Copenhagen: Rhodos, 1984);
Aage Jørgensen, “Johannes V. Jensen (Literature): ‘. . . a good enough poet and; nowadays, a good enough human being. . . .’” in Neighbouring Nobel: The History of Thirteen Danish Nobel Prizes, edited by Henry Nielsen and Keld Nielsen (Aarhus: Aarhus University Press, 2001), pp. 207-243;
Jørgensen and Helene Kragh-Jacobsen, eds., Columbus fra Himmerland (Farsø: Farsø Bibliotek, 1994);
Jørgensen and Anders Thyrring Andersen, eds., Et spring ind i et billede: Johannes V. Jensens mytedigtning (Odense: Odense Universitetsforlag, 2000);
Aage Marcus, “Johannes V. Jensen,” American-Scandinavian Review, 20 (1932): 339-347;
Felix Nørgaard and Aage Marcus, eds., Johannes V. Jensen. 1873-20. Januar–1943 (Copenhagen: Gyldendal, 1943);
Sven Hakon Rossel, “Andersen og Jensen–Eventyret og myten,” in Hvad Fatter gjør... Boghistoriske, litterære og musikalske essays tilegnet Erik Dal, edited by Henrik Glahn and others (Herning: Poul Kristensen, 1982), pp. 392-402;
Rossel, Johannes V. Jensen (Boston: Twayne, 1984);
Aage Schiøttz-Christensen, Om sammenhængen i Johannes V. Jensens forfatterskab (Copenhagen: Borgen, 1956);
Henrik Wivel, Den titaniske eros: Drifts- og karakterfortolkning i Johannes V. Jensens forfatterskab (Copenhagen: Gyldendal, 1982).
Papers
The major collection of correspondence and manuscripts is in the Johannes V. Jensen Archives at Det kongelige Bibliotek (The Royal Library), Copenhagen. Additional material is located at Statsbiblioteket, Århus, and the Farsø Bibliotek. Forty-three letters from Jensen to various Norwegian writers and friends are located at the University Library, Oslo.
|
|||||
correct_award_00093
|
FactBench
|
1
| 74
|
http://www.esp.org/timeline/GEV-vs-ALL_1940-1949.html
|
en
|
ESP Timeline: Genetics, Development, and Evolution vs All Other Categories (1940-1949)
|
Frank Morton Carpenter collects a 2.5-foot wing from a dragonfly-like giant insect that lived in Oklahoma during the Permian Period.
Rebecca wins Academy Award for best picture. The darkest aspects of the Daphne du Maurier novel were cleaned up a little, but the film was popular then and still remains a fan favorite. It's the only Alfred Hitchcock film to take the top Academy Award.
Between 1941 and 1945, the desperate need for labor in US defense plants and shipyards leads to the migration of 1.2 million African-Americans from the South to the North and West. This migration transforms American politics as blacks increasingly vote in their new homes and put pressure on Congress to protect civil rights throughout the nation. Their activism lays much of the foundation for the national civil rights movement a decade later.
On June 25, Pres. Franklin Roosevelt issues Executive Order 8802, which desegregates US defense plants and shipyards and creates the Fair Employment Practices Committee.
The US Army creates the Tuskegee Air Squadron (the 99th Pursuit Squadron) — an all African-American flying unit.
Zuse Z3 machine completed
Anthropologist E. T. Hall excavates the ruins of a dwelling in New Mexico occupied between 700 and 900 AD. He finds two fossil jawbones of Eocene mammals that were deliberately carried to the dwelling by Paleo-Indians.
German paleontologist H. Kirchner suggests that dinosaur tracks in the Rhine Valley might have inspired the legend of Siegfried slaying the dragon Fafnir.
07 DEC 1941: Pearl Harbor bombed by Japanese The US immediately declares war on Japan. Germany quickly declares war on the United States. The US is now a full participant in World War II.
08 DEC 1941: The US responds to Pearl Harbor. President to address joint session of Congress. Declaration of War expected.
09 DEC 1941: The United States formally declares War on Japan.
How Green Was My Valley wins Academy Award for best picture. The drama was set in a Welsh mining village, with director John Ford winning one of the film's five Oscars. Even then, many were surprised that the movie beat Orson Welles' Citizen Kane, but a Variety reporter at the time attributed that to the 6,000 movie extras who voted on the Oscars' top prize: "The mob prefers a regular guy to a genius."
The Atanasoff-Berry Computer is completed
Kodacolor, the first color film that yields negatives for making chromogenic color prints on paper. Roll films for snapshot cameras only, 35 mm not available until 1958.
4-7 JUN 1942: The Battle of Midway occurs. Less than six months after Pearl Harbor the Japanese navy attempts to lure the remnants of the US Navy into a decisive battle at Midway Island. The Japanese plan backfires, as the battle proves to be a huge victory for US forces and the turning point in the war in the Pacific.
Mrs. Miniver wins Academy Award for best picture. The film, about a British family during the early days of World War II, came at just the right time, reassuring Americans that their newish war effort was the right decision.
Painting by Piet Mondrian: Broadway Boogie Woogie was completed in 1943, shortly after Mondrian moved to New York in 1940. Compared to his earlier work, the canvas is divided into a much larger number of squares. Although he spent most of his career creating abstract work, this painting is inspired by clear real-world examples: the city grid of Manhattan, and the Broadway boogie woogie, a type of music Mondrian loved. The painting was bought by the Brazilian sculptor Maria Martins for the price of $800 at the Valentine Gallery in New York City, after Martins and Mondrian both exhibited there in 1943. Martins later donated the painting to the Museum of Modern Art in New York City.
The Colossus Mark 1 computer is delivered to Bletchley Park
The First Computing Journal
Work begins on ENIAC
Casablanca wins Academy Award for best picture. The WWII drama represents the studio system at its best, where all the talent (behind and in front of the camera) worked at their peak. Humphrey Bogart and Ingrid Bergman were not the studios first choices for their roles, but they remain one of the screens all-time great romantic pairings.
Johannes Vilhelm Jensen awarded the Nobel Prize in Literature "for the rare strength and fertility of his poetic imagination with which is combined an intellectual curiosity of wide scope and a bold, freshly creative style".
Triptych by Francis Bacon: Three Studies for Figures at the Base of a Crucifixion comprises three canvasses that are based on the Eumenides — or Furies — of Aeschylus's Oresteia, and that depict three writhing anthropomorphic creatures set against a flat burnt orange background. It was executed in oil paint and pastel on Sundeala fibre board and completed within two weeks. The triptych summarises themes explored in Bacon's previous work, including his examination of Picasso's biomorphs and his interpretations of the Crucifixion and the Greek Furies. The Three Studies are generally considered Bacon's first mature piece. When the painting was first exhibited in 1945 it caused a sensation and established him as one of the foremost post-war painters. Remarking on the cultural significance of Three Studies, the critic John Russell observed in 1971 that "there was painting in England before the Three Studies, and painting after them, and no one ... can confuse the two".
On April 3, the United States Supreme Court in Smith vs. Allright declares white-only political primaries unconstitutional.
First Harvard Mark 1 shipped
D-Day landing On June 6th, the largest amphibious force ever assembled, led by General Dwight D. Eisenhower, successfully attacks and establishes a landing on the coast of France at Normandy.
Going My Way wins Academy Award for best picture. Writer-director Leo McCarey once again proved his ability to balance tears and laughs, in this tale of a rule-breaking priest (Oscar winner Bing Crosby) taking over a New York parish from a retiring priest. In the latter role, Barry Fitzgerald was oddly nominated as both lead and supporting actor, winning in the latter category.
13 APR 1945: President Roosevelt dies in office.
Harry S. Truman becomes thirty-third president of the United States.
28 APR 1945: US and Russian troops meet. Germany split in two.
30 APR 1945: Press reports Mussolini killed by Italian partisans, his body abused. Hitler commits suicide by gunshot while hiding in his Führerbunker, but news of his death will not surface for a few days.
02 MAY 1945: Hitler reported dead..
08 MAY 1945: Germany surrenders unconditionally. The war in Europe is over.
22 JUN 1945: Okinawa falls after 82 days of fierce fighting.
16 JUL 1945: The Manhattan Project yields results — the world's first atomic bomb is secretly tested in New Mexico.
27 JUL 1945: Churchill is defeated in British elections. Potsdam Declaration is reported, calling for Japan to surrender unconditionally or face "prompt and utter destruction."
06 AUG 1945: the first atomic bomb used in combat is dropped on Hiroshima, Japan.
07 AUG 1945: The world learns about the atomic bomb. President Truman announces "The force from which the sun draws its power has been loosed against those who brought war to the Far East" and he calls upon Japan to immediately accept the terms of the Potsdam Declaration or expect "a rain of ruin from the air, the like of which has never been seen on this earth."
09 AUG 1945: The second atomic bomb used in combat is dropped on Nagasaki, Japan. The primary target for this mission was actually the city of Kokura, but the bomber crew moved on to the secondary target of Nagasaki when Kokura proved to be too obscured by smoke to get a clear view for the bombsight. Russia declares war on Japan.
15 AUG 1945: In the afternoon of August 15th (Japanese time), Japan announces its unconditional surrender. World War II is finally over. More than 60 million people have died as a result of the conflict.
Gabriela Mistral awarded the Nobel Prize in Literature "for her lyric poetry which, inspired by powerful emotions, has made her name a symbol of the idealistic aspirations of the entire Latin American world".
Col. Benjamin O. Davis, Jr, is named commander of Godman Field, Kentucky. He is the first African-American to command a United States military base.
Grace Hopper recorded the first actual computer "bug"
Patent is Filed for the Harvard Mark I
Vannevar Bush publishes his ideas for MEMEX, a proto-hypertext system and forerunner to the World Wide Web
The Lost Weekend wins Academy Award for best picture. The film, directed by Billy Wilder, was widely admired even though it was controversial: Though movies had featured drunks since the silent days, it was considered daring to address the subject head-on.
Hermann Hesse awarded the Nobel Prize in Literature "for his inspired writings which, while growing in boldness and penetration, exemplify the classical humanitarian ideals and high qualities of style".
The United States Supreme Court, in Morgan vs Virginia, rules that segregation in interstate bus travel is unconstitutional.
Alan Turing Proposal For 'ACE' Automatic Computing Engine
ENIAC Unveiled
ENIAC, short for Electronic Numerical Integrator And Computer was announced
Frederick Williams Receives Patent for RAM device
Along the Kolyma River in northeastern Siberia, Gulag prisoners discover a nest with three frozen, mummified ground squirrel carcasses. They turn the carcasses over to the Gulag camp geologist, Yuriy Popov, who relays them to other Soviet scientists. Seventy years later, radiocarbon dating and DNA analysis of the ground squirrel mummies will identify their age at over 30,000 years old, and indicate that they are not direct ancestors of modern ground squirrels in the region.
Geologist Reg Sprigg discovers fossils near the Ediacara Hills in Australia. The fossils are of multicellular organisms that predated the Cambrian Period, making them the oldest complex fossils yet discovered. At least some of the fossils are generally assumed to be related to modern cnidarians like jellyfish and corals.
The Best Years of Our Lives wins Academy Award for best picture. The world population was just adjusting to life after World War II and some film executives feared that audiences wanted escapism, not a movie reflecting their lives. But it was a huge hit, and its honesty in dealing with civilian changes and vulnerabilities are still powerful. It was produced by Samuel Goldwyn, directed by William Wyler.
André Paul Guillaume Gide awarded the Nobel Prize in Literature "for his comprehensive and artistically significant writings, in which human problems and conditions have been presented with a fearless love of truth and keen psychological insight".
Painting by Jackson Pollock: Reflection of the Big Dipper, consisting of built up layers of paint with dripped enamel as the final touch, concluding the composition. It was around 1947 that Jackson Pollock traded in his brushes for sticks, trowels and knives and began adding foreign matter, such as sand, broken glass, nails, coins, paint-tube tops and bottle caps to his canvases. Reflection of the Big Dipper was exhibited at the Betty Parsons Gallery in 1948, along with sixteen other paintings by Jackson Pollock. The show received positive reviews. Pollock's works from this time are a transitional step between a more traditional handling of paint and his revolutionary technique of dripping paint on canvases of a large scale.
On April 10, Jackie Robinson of the Brooklyn Dodgers becomes the first African-American to play major league baseball in the 20th century.
J Lyons executives report on the potential of computers to automate clerical work
The Williams tube won the race for a practical random-access memory
Dennis Gabor invents holography.
Harold Edgerton develops the Rapatronic camera for the U.S. government.
American Museum of Natural History curator Edwin Colbert finds a massive quarry of Coelophysis dinosaurs in New Mexico and concludes from their skeletons that these Triassic dinosaurs were swift runners with a bird-like posture. Later examination of two fossils will lead Colbert to conclude they were also cannibals, but the "last meals of juvenile coelophyses" will eventually prove to be crocodilian.
Rudolph Zallinger completes The Age of Reptiles mural in the Yale Peabody Museum. His image of slow-moving dinosaurs will prevail until the 1960s.
Gentleman's Agreement wins Academy Award for best picture. The Elia Kazan-directed drama, starring Gregory Peck, was another hot-button winner, as it addressed the topic of anti-Semitism.
Thomas Stearns Eliot awarded the Nobel Prize in Literature "for his outstanding, pioneer contribution to present-day poetry".
Painting by Barnett Newman: Onement I features the first full incarnation of what Newman later called a 'zip', a vertical band of color. This motif would play a central role in many of his subsequent paintings. The painting's title is an archaic derivation of the word 'atonement', meaning, "the state of being made into one."
On July 26, Pres. Harry Truman issues Executive Order 9981, directing the desegregation of the armed forces.
The United States Supreme Court, in Shelley vs Kraemer, rules that racially restrictive covenants are legally unenforceable.
IBM´s Selective Sequence Electronic Calculator was built
The Manchester Baby, the world's first stored program computer, ran its first program
Edwin H. Land introduces the first Polaroid instant camera.
The Hasselblad 1600F camera is introduced.
Mary Leakey finds the skull of the ape Proconsul, about 16 million years old. Although a very significant find, it does little to bolster Louis and Mary Leakey's meager research funding.
Hamlet wins Academy Award for best picture. The black and white Shakespeare adaptation, from the U.K.'s J. Arthur Rank-Two Cities, was the first non-Hollywood film to take the top award. And Laurence Olivier became the first person to direct himself to a best-actor win.
|
||||||
correct_award_00093
|
FactBench
|
2
| 21
|
https://www.poemine.com/Johannes-Vilhelm-Jensen/biography/
|
en
|
Johannes Vilhelm Jensen biography on Poemine.com
|
Biography and life of Johannes Vilhelm Jensen
Biography
Other info : Career | Bibliography
Johannes Vilhelm Jensen (in Denmark always called Johannes V. Jensen) was a Danish author, often considered the first great Danish writer of the 20th century. He was awarded the Nobel Prize in Literature in 1944. One of his sisters, Thit Jensen, was also a well-known writer and a very vocal, and occasionally controversial, early feminist. He was born in Farsø, a village in North Jutland, Denmark, as the son of a veterinary surgeon,[1] and he grew up in a rural environment. While studying medicine at the University of Copenhagen he worked as a writer to fund his studies. After three years of study he chose to change careers and devote himself fully to literature.
The first phase of his work as an author was influenced by fin-de-siècle pessimism. His career began with the publication of Himmerland Stories (1898–1910), comprising a series of tales set in the part of Denmark where he was born. During 1900 and 1901 he wrote his first masterpiece, Kongens Fald (eng. transl. 1933 The Fall of the King), a modern historical novel centred around King Christian II. Literary critic Martin Seymour-Smith said it is an "indictment of Danish indecision and lack of vitality, which Jensen saw as a national disease. Apart from this aspect of it, it is a penetrating study of sixteenth-century people." [2]
In 1906 Jensen created his greatest literary achievement: the collection of verses Digte 1906 (i.e. Poems 1906), which introduced the prose poem to Danish literature. He also wrote poetry, a few plays, and many essays, chiefly on anthropology and the philosophy of evolution.
He developed his theories of evolution in a cycle of six novels, Den lange rejse (1908–22), translated into English as The Long Journey (1923–24), which was published in a two-volume edition in 1938.[3] This is often considered his main work in prose, a daring and often impressive attempt to create a Darwinian alternative to the Biblical Genesis myth. In this work we see the development of mankind from the Ice Age to the times of Columbus, focusing on pioneering individuals.
Like his compatriot Hans Christian Andersen, he travelled extensively; a trip to the United States inspired a poem of his, "Paa Memphis Station" [At the train station, Memphis, Tennessee], which is well known in Denmark. Walt Whitman was among the writers who influenced Jensen.
|
||||||
correct_award_00093
|
FactBench
|
1
| 23
|
https://tarnmoor.com/2016/01/29/not-the-nobel-prize-for-literature/
|
en
|
[Not] The Nobel Prize for Literature
|
2016-01-29T00:00:00
|
I don’t have too much good to say about the Swedish Academy, which decides who will receive the Nobel Prize for Literature. If you look at the list of its recipients, it would not take too much effort to produce a list of as great as or even greater literary figures who have not received…
|
en
|
Tarnmoor
|
https://tarnmoor.com/2016/01/29/not-the-nobel-prize-for-literature/
|
I don’t have too much good to say about the Swedish Academy, which decides who will receive the Nobel Prize for Literature. If you look at the list of its recipients, it would not take too much effort to produce a list of as great as or even greater literary figures who have not received the laureate. Let me take a stab at it:
Kobo Abe (Japan), Woman in the Dunes
Chinua Achebe (Nigeria), Things Fall Apart
Ryunosuke Akutagawa (Japan), Rashomon
Jorge Amado (Brazil), Gabriela: Clove and Cinnamon
W. H. Auden (UK), Poetry
Georges Bernanos (France), Mouchette
Jorge Luis Borges (Argentina), Ficciones
Joseph Conrad (UK/Poland), Nostromo
Richard Flanagan (Australia), The Narrow Road to the Deep North
Graham Greene (UK), The Heart of the Matter
Vassili Grossman (Russia), Life and Fate
Henry James (US/UK), The Ambassadors
James Joyce (Ireland), Ulysses
Yashar Kemal (Turkey), Memed, My Hawk
Gyula Krúdy (Hungary), The Red Post Coach
Stanislaw Lem (Poland), Solaris
Osip Mandelstam (Russia), Poetry
Vladimir Nabokov (US/Russia), Lolita
Fernando Pessoa (Portugal), The Book of Disquiet
Marcel Proust (France), In Search of Lost Time
Arkady and Boris Strugatsky (Russia), Roadside Picnic
Italo Svevo (Italy), Confessions of Zeno
Leo Tolstoy (Russia), Novels and Stories
Mark Twain (US), Novels and Stories
Evelyn Waugh (UK), Brideshead Revisited
Virginia Woolf (UK), Mrs Dalloway
As you can see, I have not overloaded the list with the names of American authors, in the interests of being fair. If I wanted to, I can add names like Philip Roth, Ursula K. Le Guin, Joyce Carol Oates, Philip K. Dick, Cormac McCarthy, James Baldwin, Ralph Ellison, and a few others.
These can replace such figures as the following, whose reputations have not kept up with the times: Bjornstjerne Bjornson, José Echegaray, Giosue Carducci, Rudolf Christoph Eucken, Paul von Heyse, Verner von Heidenstam, Karl Adolph Gjellerup, Henrik Pontoppidan, Carl Spitteler, Jacinto Benavente, Grazia Deledda, Erik Axel Karlfeldt, Pearl S. Buck, Frans Eemil Sillanpaa [SIC], Johannes Vilhelm Jensen, Earl Russell, and a few dozen others—mostly Scandinavian nonentities which at one time were highly thought of by a couple dozen mouldy Swedish academics. (Please forgive me for being lax about the diacritical marks in the above names.)
|
|||||
correct_award_00093
|
FactBench
|
0
| 80
|
https://pantheon.world/profile/person/Johannes_V._Jensen
|
en
|
Johannes V. Jensen Biography
|
https://pantheon.world/api/screenshot/person?id=65134
|
https://pantheon.world/api/screenshot/person?id=65134
|
[
"https://pantheon.world/images/icons/icon-nav.svg",
"https://pantheon.world/images/logos/logo_pantheon.svg",
"https://pantheon.world/images/icons/icon-search.svg",
"https://pantheon.world/images/profile/people/65134.jpg",
"https://pantheon.world/images/ui/profile-w.svg",
"https://pantheon.world/images/ui/tri-left-b.svg",
"https://pantheon.world/images/profile/people/163466.jpg",
"https://pantheon.world/images/profile/people/66569.jpg",
"https://pantheon.world/images/profile/people/409517.jpg",
"https://pantheon.world/images/profile/people/146780.jpg",
"https://pantheon.world/images/profile/people/12787.jpg",
"https://pantheon.world/images/profile/people/161420.jpg",
"https://pantheon.world/images/profile/people/65134.jpg",
"https://pantheon.world/images/profile/people/194342.jpg",
"https://pantheon.world/images/profile/people/162145.jpg",
"https://pantheon.world/images/profile/people/287047.jpg",
"https://pantheon.world/images/profile/people/420409.jpg",
"https://pantheon.world/images/profile/people/330857.jpg",
"https://pantheon.world/images/profile/people/28722.jpg",
"https://pantheon.world/images/ui/tri-right-b.svg",
"https://pantheon.world/images/ui/tri-left-b.svg",
"https://pantheon.world/images/profile/people/99340.jpg",
"https://pantheon.world/images/profile/people/338256.jpg",
"https://pantheon.world/images/profile/people/67892.jpg",
"https://pantheon.world/images/profile/people/65134.jpg",
"https://pantheon.world/images/profile/people/6928.jpg",
"https://pantheon.world/images/profile/people/196238.jpg",
"https://pantheon.world/images/profile/people/153290.jpg",
"https://pantheon.world/images/profile/people/1051.jpg",
"https://pantheon.world/images/profile/people/703040.jpg",
"https://pantheon.world/images/profile/people/2047549.jpg",
"https://pantheon.world/images/profile/people/70178.jpg",
"https://pantheon.world/images/profile/people/1869.jpg",
"https://pantheon.world/images/ui/tri-right-b.svg",
"https://pantheon.world/images/ui/tri-left-b.svg",
"https://pantheon.world/images/profile/people/11891.jpg",
"https://pantheon.world/images/profile/people/12855.jpg",
"https://pantheon.world/images/profile/people/104658.jpg",
"https://pantheon.world/images/profile/people/15827.jpg",
"https://pantheon.world/images/profile/people/65134.jpg",
"https://pantheon.world/images/profile/people/63148.jpg",
"https://pantheon.world/images/profile/people/207509.jpg",
"https://pantheon.world/images/profile/people/163283.jpg",
"https://pantheon.world/images/profile/people/609222.jpg",
"https://pantheon.world/images/profile/people/767317.jpg",
"https://pantheon.world/images/profile/people/553525.jpg",
"https://pantheon.world/images/profile/people/59659.jpg",
"https://pantheon.world/images/ui/tri-right-b.svg",
"https://pantheon.world/images/ui/tri-left-b.svg",
"https://pantheon.world/images/profile/people/40406.jpg",
"https://pantheon.world/images/profile/people/72708.jpg",
"https://pantheon.world/images/profile/people/29236.jpg",
"https://pantheon.world/images/profile/people/150916.jpg",
"https://pantheon.world/images/profile/people/150846.jpg",
"https://pantheon.world/images/profile/people/48918.jpg",
"https://pantheon.world/images/profile/people/65134.jpg",
"https://pantheon.world/images/profile/people/151814.jpg",
"https://pantheon.world/images/profile/people/150891.jpg",
"https://pantheon.world/images/profile/people/335248.jpg",
"https://pantheon.world/images/profile/people/196670.jpg",
"https://pantheon.world/images/profile/people/186955.jpg",
"https://pantheon.world/images/profile/people/17917.jpg",
"https://pantheon.world/images/profile/people/13550.jpg",
"https://pantheon.world/images/profile/people/65133.jpg",
"https://pantheon.world/images/profile/people/236625.jpg",
"https://pantheon.world/images/profile/people/29236.jpg",
"https://pantheon.world/images/profile/people/65134.jpg",
"https://pantheon.world/images/profile/people/335248.jpg",
"https://pantheon.world/images/profile/people/206391.jpg",
"https://pantheon.world/images/profile/people/322851.jpg",
"https://pantheon.world/images/profile/people/303926.jpg",
"https://pantheon.world/images/profile/people/324080.jpg",
"https://pantheon.world/images/profile/people/1440425.jpg",
"https://pantheon.world/images/profile/people/27204494.jpg"
] |
[] |
[] |
[
""
] | null |
[] | null |
en
|
/images/favicon.ico
| null |
Johannes V. Jensen
Johannes Vilhelm Jensen (Danish pronunciation: [joˈhænˀəs ˈvilhelˀm ˈjensn̩]; 20 January 1873 – 25 November 1950) was a Danish author, known as one of the great Danish writers of the first half of the 20th century. He was awarded the 1944 Nobel Prize in Literature "for the rare strength and fertility of his poetic imagination with which is combined an intellectual curiosity of wide scope and a bold, freshly creative style". Read more on Wikipedia
Since 2007, the English Wikipedia page of Johannes V. Jensen has received more than 120,371 page views. His biography is available in 80 different languages on Wikipedia (up from 77 in 2019). Johannes V. Jensen is the 260th most popular writer (up from 345th in 2019), the 27th most popular biography from Denmark (up from 35th in 2019) and the 5th most popular Danish Writer.
Johannes V. Jensen is best known for his novel The Fall of the King (Kongens Fald) and his six-volume epic The Long Journey (Den lange rejse).
|
||||
correct_award_00093
|
FactBench
|
2
| 76
|
https://math.ucr.edu/home/baez/physics/Administrivia/nobel.html
|
en
|
The Nobel Prize for Physics
|
[] |
[] |
[] |
[
""
] | null |
[] | null | null |
[Physics FAQ] - [Copyright]
Originally compiled by Scott Chase, Phil Gibbs, and Johan Wevers.
Nobel Prizes for Physics, 1901–2021
The following is a complete listing of Nobel Prize awards, from the first award in 1901. Prizes were not awarded in every year. The date in brackets is the approximate date of the work. The description following the names is an abbreviation of the official citation.
The Physics prize is announced near the beginning of October each year. One of the quickest ways to get the announcement is to watch the Nobel Foundation web site at http://nobelprize.org.
1901 [1895] Wilhelm Konrad Röntgen: Discovery of X rays.
1902 [1896] Hendrik Antoon Lorentz, Pieter Zeeman: Magnetism in radiation phenomena.
1903 [1896] Antoine Henri Becquerel, [1898] Pierre Curie, Marie Sklodowska-Curie: Spontaneous radioactivity.
1904 [1894] Lord Rayleigh (a.k.a. John William Strutt): Density of gases and discovery of argon.
1905 [1899] Philipp Eduard Anton von Lenard: Cathode rays.
1906 [1897] Joseph John Thomson: Conduction of electricity by gases.
1907 Albert Abraham Michelson: Precision optical instruments and metrological investigations.
1908 Gabriel Lippmann: Reproducing colors photographically, based on the phenomenon of interference.
1909 [1901] Carl Ferdinand Braun, Guglielmo Marconi: Wireless telegraphy.
1910 [1873] Johannes Diderik van der Waals: Equation of state of fluids.
1911 [1896] Wilhelm Wien: Laws of radiation of heat.
1912 [1909] Nils Gustaf Dalén: Automatic gas flow regulators.
1913 [1911] Heike Kamerlingh Onnes: Matter at low temperature.
1914 [1912] Max von Laue: Crystal diffraction of X rays.
1915 [1913] William Henry Bragg, William Lawrence Bragg: X-ray analysis of crystal structure.
1916 No award.
1917 [1911] Charles Glover Barkla: Characteristic X-ray spectra of elements.
1918 [1900] Max Planck: Energy quanta.
1919 [1913] Johannes Stark: Splitting of spectral lines in electric fields.
1920 Charles-Edouard Guillaume: Anomalies in nickel–steel alloys.
1921 [1905] Albert Einstein: Photoelectric effect.
1922 [1913] Niels Bohr: Structure of atoms.
1923 [1909] Robert Andrews Millikan: Elementary charge of electricity.
1924 Karl Manne Georg Siegbahn: X-ray spectroscopy.
1925 [1914] James Franck, Gustav Hertz: Impact of an electron on an atom.
1926 Jean Baptiste Perrin: Sedimentation equilibrium.
1927 [1924] Arthur Holly Compton: Compton effect. [1912] Charles Thomson Rees Wilson: Invention of the cloud chamber.
1928 [1903] Owen Willans Richardson: Thermionic phenomena, Richardson's Law.
1929 [1923] Prince Louis-Victor de Broglie: Wave nature of electrons.
1930 [1928] Sir Chandrasekhara Venkata Raman: Scattering of light, Raman effect.
1931 No award.
1932 [1925] Werner Heisenberg: Quantum mechanics.
1933 [1926] Erwin Schrödinger, [1928] Paul Dirac: Atomic theory.
1934 No award.
1935 [1932] James Chadwick: The neutron.
1936 [1932] Carl Anderson: The positron. [1911] Victor Franz Hess: Cosmic rays.
1937 [1925] Clinton Joseph Davisson, George Paget Thomson: Crystal diffraction of electrons.
1938 [1935] Enrico Fermi: New radioactive elements.
1939 [1929] Ernest Orlando Lawrence: Invention of the cyclotron.
1940–1942 No award.
1943 [1933] Otto Stern: Proton magnetic moment.
1944 [1935] Isidor Isaac Rabi: Magnetic resonance in atomic nuclei.
1945 [1924] Wolfgang Pauli: The exclusion principle.
1946 [1925] Percy Williams Bridgman: Production of extremely high pressures.
1947 [1924] Sir Edward Victor Appleton: Physics of the upper atmosphere.
1948 [1932] Patrick Maynard Stuart Blackett: Cosmic ray showers in cloud chambers.
1949 [1935] Hideki Yukawa: Prediction of mesons.
1950 [1947] Cecil Frank Powell: Photographic emulsion for meson studies.
1951 [1932] Sir John Douglas Cockcroft, Ernest Thomas Sinton Walton: Artificial acceleration of atomic particles and transmutation of nuclei.
1952 [1946] Felix Bloch, Edward Mills Purcell: Nuclear magnetic precision methods.
1953 [1935] Frits Zernike: Phase-contrast microscope.
1954 [1926] Max Born: Fundamental research in quantum mechanics. [1925] Walther Bothe: Coincidence counters.
1955 [1947] Polykarp Kusch: Electron magnetic moment. [1947] Willis Eugene Lamb: Hydrogen fine structure.
1956 [1948] William Shockley, John Bardeen, Walter Houser Brattain: Transistors.
1957 [1956] Tsung Dao Lee, Chen Ning Yang: Parity violation.
1958 [1934] Pavel Aleksejevic Cerenkov, [1937] Il'ja Mickajlovic Frank, Igor' Evgen'evic Tamm: Interpretation of the Cerenkov effect.
1959 [1955] Owen Chamberlain, Emilio Gino Segrè: The antiproton.
1960 [1952] Donald Arthur Glaser: The bubble chamber.
1961 [1953] Robert Hofstadter: Electron scattering on nucleons. Rudolf Ludwig Mössbauer: Resonant absorption of photons.
1962 [1941] Lev Davidovic Landau: Theory of liquid helium.
1963 [1931] Eugene Wigner: Fundamental symmetry principles. [1949] Hans Jensen, Maria Goeppert Mayer: Nuclear shell structure.
1964 [1958] Nikolai Basov, Alexander Prochorov, Charles Townes: Maser–laser principle.
1965 [1948] Richard Feynman, Julian Schwinger, Sin-Itiro Tomonaga: Quantum electrodynamics.
1966 [1950] Alfred Kastler: Study of hertzian resonance in atoms.
1967 [1938] Hans Albrecht Bethe: Energy production in stars.
1968 [1955] Luis W. Alvarez: Discovery of many-particle resonances.
1969 [1964] Murray Gell-Mann: Quark model for particle classification.
1970 [1942] Hannes Alfvén: Magneto-hydrodynamics in plasma physics. [1932] Louis Néel: Ferromagnetism and antiferromagnetism.
1971 [1947] Dennis Gabor: Principles of holography.
1972 [1957] John Bardeen, Leon Cooper, Robert Schrieffer: Theory of superconductivity.
1973 [1960] Leo Esaki, Ivar Giaever: Tunneling in superconductors. [1962] Brian Josephson: Super-current through tunnel barriers.
1974 [1974] Antony Hewish: Discovery of pulsars. [1958] Sir Martin Ryle: Pioneering radioastronomy work.
1975 [1950] Aage Bohr, Ben Mottelson, James Rainwater: Structure of the atomic nucleus.
1976 [1974] Burton Richter, Samuel Chao Chung Ting: Discovery of the J/Psi particle.
1977 [1958] Philip Warren Anderson, [1967] Nevill Francis Mott, John Hasbrouck Van Vleck: Electronic structure of magnetic and disordered solids.
1978 [1932] Pyotr Kapitsa: Liquefaction of helium. [1965] Arno Penzias, Robert Wilson: Cosmic microwave background radiation.
1979 [1961] Sheldon Glashow, [1967] Steven Weinberg, [1968] Abdus Salam: Electroweak theory, especially weak neutral currents.
1980 [1964] James Cronin, Val Fitch: Discovery of CP violation in the asymmetric decay of neutral K mesons.
1981 Kai Siegbahn: High-resolution electron spectroscopy. [1962] Nicolaas Bloembergen, Arthur Schawlow: Laser spectroscopy.
1982 [1972] Kenneth Wilson: Critical phenomena in phase transitions.
1983 [1935] Subrahmanyan Chandrasekhar, [1957] William Fowler: Evolution of stars.
1984 [1970] Simon van der Meer: Stochastic cooling for colliders. [1983] Carlo Rubbia: Discovery of W and Z particles.
1985 [1977] Klaus von Klitzing: Discovery of the (integer) quantum Hall effect.
1986 [1981] Gerd Binnig, Heinrich Rohrer: Scanning tunneling microscopy. [1932] Ernst August Friedrich Ruska: Electron microscopy.
1987 [1986] Georg Bednorz, Alex Müller: High-temperature superconductivity.
1988 [1962] Leon Max Lederman, Melvin Schwartz, Jack Steinberger: Discovery of the muon neutrino, leading to classification of particles into families.
1989 Hans Georg Dehmelt: Penning trap for charged particles. Wolfgang Paul: Paul trap for charged particles. Norman Ramsey: Control of atomic transitions by the separated oscillatory fields method.
1990 [1972] Jerome Isaac Friedman, Henry Way Kendall, Richard Edward Taylor: Deep inelastic scattering experiments leading to the discovery of quarks.
1991 Pierre-Gilles de Gennes: Order–disorder transitions in liquid crystals and polymers.
1992 Georges Charpak: Multiwire proportional chamber.
1993 [1974] Russell Hulse, Joseph Taylor: Discovery of the first binary pulsar and subsequent tests of general relativity.
1994 [1960] Bertram Brockhouse, [1946] Clifford Shull: Neutron scattering experiments.
1995 [1975] Martin Perl: Discovery of the tau lepton. [1953] Frederick Reines: Detection of the neutrino.
1996 David Lee, Douglas Osheroff, Robert Richardson: Superfluidity in helium-3.
1997 [1985] Steven Chu, Claude Cohen-Tannoudji, William Phillips: Development of methods to trap and cool atoms with laser light.
1998 [1982] Robert Laughlin, Horst Störmer, Daniel Tsui: Discovery and theory of the fractional quantum Hall effect.
1999 [1972] Gerard 't Hooft, Martin Veltman: Development of a renormalisation scheme for non-abelian gauge theories.
2000 [1957] Herbert Kroemer: Growing of heterostructures. [1963] Zhores Alferov: Semiconductor laser based on heterostructures. [1958] Jack Kilby: Invention of the integrated circuit.
2001 Eric Cornell, Carl Wieman, Wolfgang Ketterle: Bose–Einstein condensation of alkali metals.
2002 Raymond Davis Jr, Masatoshi Koshiba: Detection of cosmic neutrinos. Riccardo Giacconi: Detection of cosmic X rays.
2003 Alexei Abrikosov, [1950] Vitaly Ginzburg, [1970] Anthony Leggett: Pioneering contributions to the theory of superconductors and superfluids.
2004 [1973] David Gross, David Politzer, Frank Wilczek: Discovery of asymptotic freedom in the theory of the strong interaction.
2005 Roy Glauber: Quantum theory of optical coherence. John Hall, Theodor Hänsch: Development of ultra-high precision measurements of light.
2006 John Mather, George Smoot: Study of the early universe, and developing the Cosmic Background Explorer (COBE) experiment.
2007 Albert Fert, Peter Grünberg: Discovery of giant magnetoresistance.
2008 Yoichiro Nambu: Discovery of the mechanism of spontaneous symmetry breaking. Makoto Kobayashi, Toshihide Maskawa: Discovery of the origin of symmetry breaking.
2009 Charles Kao: Achievements concerning transmission of light in optical fibres. Willard Boyle, George Smith: Invention of the charge-coupled device (CCD).
2010 Andre Geim, Konstantin Novoselov: Experiments in graphene.
2011 Saul Perlmutter, Brian Schmidt, Adam Riess: Discovery of the accelerating expansion of the universe.
2012 Serge Haroche, David Wineland: New experimental methods for studying individual quantum systems.
2013 François Englert, Peter Higgs: Theory of the Higgs mechanism.
2014 Isamu Akasaki, Hiroshi Amano, Shuji Nakamura: Invention of efficient blue light-emitting diodes.
2015 Takaaki Kajita, Arthur McDonald: Discovery of neutrino oscillations.
2016 David Thouless, F. Haldane, J. Kosterlitz: Discoveries involving topological phase transitions and topological phases of matter.
2017 Rainer Weiss, Barry Barish, Kip Thorne: Contributions to the LIGO detector and the observation of gravitational waves.
2018 Arthur Ashkin, Gérard Mourou, Donna Strickland: Inventions in the field of laser physics.
2019 James Peebles: Contributions to our understanding of the evolution of the universe and Earth's place in the cosmos. Michel Mayor, Didier Queloz: Discovery of an exoplanet orbiting a solar-type star.
2020 Roger Penrose: The discovery that black hole formation is a robust prediction of general relativity. Reinhard Genzel, Andrea Ghez: The discovery of a supermassive compact object at the centre of our galaxy.
2021 Syukuro Manabe, Klaus Hasselmann: The physical modelling of Earth's climate, quantifying variability and reliably predicting global warming. Giorgio Parisi: The discovery of the interplay of disorder and fluctuations in physical systems, from atomic to planetary scales.
|
||||||||
correct_award_00093
|
FactBench
|
0
| 18
|
https://librivox.org/author/17141
|
en
|
LibriVox
|
[
"https://librivox.org/images/librivox-logo.png",
"https://librivox.org/images/public-domain-license.gif"
] |
[] |
[] |
[
""
] | null |
[] | null |
LibriVox
|
en
|
https://librivox.org/favicon.ico
| null |
Johannes V. Jensen (1873 - 1950)
Johannes Vilhelm Jensen was a Danish author. He was awarded the Nobel Prize in Literature in 1944 "for the rare strength and fertility of his poetic imagination with which is combined an intellectual curiosity of wide scope and a bold, freshly creative style".
|
|||||
correct_award_00093
|
FactBench
|
1
| 54
|
http://www.esp.org/timeline/ALL-vs-ART_1940-1949.html
|
en
|
ESP Timeline: All Other Categories vs Arts and Culture (1940–1949)
|
[
"http://www.esp.org/images/ESP-new-banner-5.png",
"http://www.esp.org/images/ESP-new-banner-4sm.png",
"http://www.esp.org/images/timeline/best-picture-1940-250.jpg",
"http://www.esp.org/images/timeline/tusk-airmen-250.jpg",
"http://www.esp.org/images/timeline/pearl-harbor.jpg",
"http://www.esp.org/images/timeline/nty-1941-12-08-250.jpg",
"http://www.esp.org/images/timeline/nyt-1941-12-09-250.jpg",
"http://www.esp.org/images/timeline/best-picture-1941-250.jpg",
"http://www.esp.org/images/timeline/midway-250.jpg",
"http://www.esp.org/images/timeline/best-picture-1942-250.jpg",
"http://www.esp.org/images/timeline/broadway-250.jpg",
"http://www.esp.org/images/timeline/best-picture-1943-250.jpg",
"http://www.esp.org/images/timeline/gen-eisenhower-80.jpg",
"http://www.esp.org/images/timeline/three-studies-250.jpg",
"http://www.esp.org/images/timeline/best-picture-1944-250.jpg",
"http://www.esp.org/images/timeline/nyt-1945-04-13-250.jpg",
"http://www.esp.org/images/timeline/h-truman-80.jpg",
"http://www.esp.org/images/timeline/nyt-1945-04-28-250.jpg",
"http://www.esp.org/images/timeline/nyt-1945-04-30-250.jpg",
"http://www.esp.org/images/timeline/nyt-1945-05-02-250.jpg",
"http://www.esp.org/images/timeline/nyt-1945-05-08-250.jpg",
"http://www.esp.org/images/timeline/nyt-1945-06-22-250.jpg",
"http://www.esp.org/images/timeline/trinity-250.jpg",
"http://www.esp.org/images/timeline/nyt-1945-07-27-250.jpg",
"http://www.esp.org/images/timeline/hiroshima.jpg",
"http://www.esp.org/images/timeline/nyt-1945-08-07-250.jpg",
"http://www.esp.org/images/timeline/nagasaki.jpg",
"http://www.esp.org/images/timeline/nyt-1945-08-15-250.jpg",
"http://www.esp.org/images/timeline/davis-benjamin-o.jpg",
"http://www.esp.org/images/timeline/se-luria-80.jpg",
"http://www.esp.org/images/timeline/best-picture-1945-250.jpg",
"http://www.esp.org/images/timeline/hj-muller-02-80.jpg",
"http://www.esp.org/images/timeline/best-picture-1946-250.jpg",
"http://www.esp.org/images/timeline/robinson-j.jpg",
"http://www.esp.org/images/timeline/big-dipper-250.jpg",
"http://www.esp.org/images/timeline/best-picture-1947-250.jpg",
"http://www.esp.org/images/timeline/hasselblad-1600F-250.jpg",
"http://www.esp.org/images/timeline/hj-muller-02-80.jpg",
"http://www.esp.org/images/timeline/onement-250.jpg",
"http://www.esp.org/images/timeline/best-picture-1948-250.jpg",
"http://www.esp.org/images/timeline/three-men-250.jpg",
"http://www.esp.org/images/timeline/best-picture-1949-250.jpg",
"http://www.esp.org/images/of-interest/new-science.png",
"http://www.esp.org/images/of-interest/old-science.jpg",
"http://www.esp.org/images/of-interest/weird-science.jpg",
"http://www.esp.org/images/of-interest/policy-funding.png",
"http://www.esp.org/images/of-interest/biodiversity.png",
"http://www.esp.org/images/of-interest/symbiosis.png",
"http://www.esp.org/images/of-interest/paleo.png",
"http://www.esp.org/images/of-interest/astronomy.png",
"http://www.esp.org/images/of-interest/climate-change.png",
"http://www.esp.org/images/of-interest/big-data.png",
"http://www.esp.org/images/of-interest/anthro.png",
"http://www.esp.org/images/of-interest/wtf.png"
] |
[] |
[] |
[
""
] | null |
[] | null |
en
|
img/favicon.ico
| null |
Between 1941 and 1945, the desperate need for labor in US defense plants and shipyards leads to the migration of 1.2 million African-Americans from the South to the North and West. This migration transforms American politics as blacks increasingly vote in their new homes and put pressure on Congress to protect civil rights throughout the nation. Their activism lays much of the foundation for the national civil rights movement a decade later.
On June 25, Pres. Franklin Roosevelt issues Executive Order 8802, which desegregates US defense plants and shipyards and creates the Fair Employment Practices Committee.
The US Army creates the Tuskegee Air Squadron (the 99th Pursuit Squadron) — an all African-American flying unit.
Zuse Z3 machine completed
G. W. Beadle and E. L. Tatum publish their classic study on the biochemical genetics of Neurospora and promulgate the ONE-GENE, ONE-ENZYME theory.
K. Mather coins the term polygenes and describes polygenic traits in various organisms.
Anthropologist E. T. Hall excavates the ruins of a dwelling in New Mexico occupied between 700 and 900 AD. He finds two fossil jawbones of Eocene mammals that were deliberately carried to the dwelling by Paleo-Indians.
German paleontologist H. Kirchner suggests that dinosaur tracks in the Rhine Valley might have inspired the legend of Siegfried slaying the dragon Fafnir.
07 DEC 1941: Pearl Harbor bombed by Japanese The US immediately declares war on Japan. Germany quickly declares war on the United States. The US is now a full participant in World War II.
08 DEC 1941: The US responds to Pearl Harbor. President to address joint session of Congress. Declaration of War expected.
09 DEC 1941: The United States formally declares War on Japan.
The Atanasoff-Berry Computer is completed
Kodacolor, the first color film that yields negatives for making chromogenic color prints on paper. Roll films for snapshot cameras only, 35 mm not available until 1958.
4-7 JUN 1942: The Battle of Midway occurs. Less than six months after Pearl Harbor the Japanese navy attempts to lure the remnants of the US Navy into a decisive battle at Midway Island. The Japanese plan backfires, as the battle proves to be a huge victory for US forces and the turning point in the war in the Pacific.
Ernst Mayr publishes Systematics and the Origin of Species, and Julian Huxley publishes Evolution: The Modern Synthesis. Both books are significant contributions to the neo-Darwinian synthesis, combining elements of natural selection, genetics, mutation, population biology, and paleontology.
S. E. Luria and T. F. Anderson publish the first electron micrographs of bacterial viruses. T2 has a polyhedral body and a tail.
Painting by Piet Mondrian: Broadway Boogie Woogie was completed in 1943, shortly after Mondrian moved to New York in 1940. Compared to his earlier work, the canvas is divided into a much larger number of squares. Although he spent most of his career creating abstract work, this painting is inspired by clear real-world examples: the city grid of Manhattan, and the Broadway boogie woogie, a type of music Mondrian loved. The painting was bought by the Brazilian sculptor Maria Martins for the price of $800 at the Valentine Gallery in New York City, after Martins and Mondrian both exhibited there in 1943. Martins later donated the painting to the Museum of Modern Art in New York City.
Casablanca wins Academy Award for best picture. The WWII drama represents the studio system at its best, where all the talent (behind and in front of the camera) worked at their peak. Humphrey Bogart and Ingrid Bergman were not the studios first choices for their roles, but they remain one of the screens all-time great romantic pairings.
On April 3, the United States Supreme Court in Smith vs. Allright declares white-only political primaries unconstitutional.
First Harvard Mark 1 shipped
D-Day landing: On June 6th, the largest amphibious force ever assembled, led by General Dwight D. Eisenhower, successfully attacks and establishes a landing on the coast of France at Normandy.
Theoretical physicist Erwin Schrödinger publishes What is Life? arguing that living organisms store and pass along information, perhaps using something like Morse code. This book will inspire James Watson, Francis Crick and Maurice Wilkins, who will share the Nobel prize for discovering the structure of DNA.
O. T. Avery, C. M. MacLeod, and M. McCarty describe the pneumococcus transforming principle. The fact that it is rich in DNA suggests that DNA and not protein is the hereditary chemical.
Johannes Vilhelm Jensen awarded the Nobel Prize in Literature "for the rare strength and fertility of his poetic imagination with which is combined an intellectual curiosity of wide scope and a bold, freshly creative style".
Triptych by Francis Bacon: Three Studies for Figures at the Base of a Crucifixion comprises three canvasses that are based on the Eumenides — or Furies — of Aeschylus's Oresteia, and that depict three writhing anthropomorphic creatures set against a flat burnt orange background. It was executed in oil paint and pastel on Sundeala fibre board and completed within two weeks. The triptych summarises themes explored in Bacon's previous work, including his examination of Picasso's biomorphs and his interpretations of the Crucifixion and the Greek Furies. The Three Studies are generally considered Bacon's first mature piece. When the painting was first exhibited in 1945 it caused a sensation and established him as one of the foremost post-war painters. Remarking on the cultural significance of Three Studies, the critic John Russell observed in 1971 that "there was painting in England before the Three Studies, and painting after them, and no one ... can confuse the two".
Going My Way wins Academy Award for best picture. Writer-director Leo McCarey once again proved his ability to balance tears and laughs, in this tale of a rule-breaking priest (Oscar winner Bing Crosby) taking over a New York parish from a retiring priest. In the latter role, Barry Fitzgerald was oddly nominated as both lead and supporting actor, winning in the latter category.
13 APR 1945: President Roosevelt dies in office.
Harry S. Truman becomes thirty-third president of the United States.
28 APR 1945: US and Russian troops meet. Germany split in two.
30 APR 1945: Press reports Mussolini killed by Italian partisans, his body abused. Hitler commits suicide by gunshot while hiding in his Führerbunker, but news of his death will not surface for a few days.
02 MAY 1945: Hitler reported dead.
08 MAY 1945: Germany surrenders unconditionally. The war in Europe is over.
22 JUN 1945: Okinawa falls after 82 days of fierce fighting.
16 JUL 1945: The Manhattan Project yields results — the world's first atomic bomb is secretly tested in New Mexico.
27 JUL 1945: Churchill is defeated in British elections. Potsdam Declaration is reported, calling for Japan to surrender unconditionally or face "prompt and utter destruction."
06 AUG 1945: the first atomic bomb used in combat is dropped on Hiroshima, Japan.
07 AUG 1945: The world learns about the atomic bomb. President Truman announces that "the force from which the sun draws its power has been loosed against those who brought war to the Far East" and calls upon Japan to immediately accept the terms of the Potsdam Declaration or expect "a rain of ruin from the air, the like of which has never been seen on this earth."
09 AUG 1945: The second atomic bomb used in combat is dropped on Nagasaki, Japan. The primary target for this mission was actually the city of Kokura, but the bomber crew moved on to the secondary target of Nagasaki when Kokura proved to be too obscured by smoke to get a clear view for the bombsight. Russia declares war on Japan.
15 AUG 1945: In the afternoon of August 15th (Japanese time), Japan announces its unconditional surrender. World War II is finally over. More than 60 million people have died as a result of the conflict.
Col. Benjamin O. Davis, Jr, is named commander of Godman Field, Kentucky. He is the first African-American to command a United States military base.
Grace Hopper recorded the first actual computer "bug"
Patent is Filed for the Harvard Mark I
Vannevar Bush publishes his ideas for MEMEX, a proto-hypertext system and forerunner to the World Wide Web
S. E. Luria demonstrates that mutations occur in bacterial viruses.
The United States Supreme Court, in Morgan vs Virginia, rules that segregation in interstate bus travel is unconstitutional.
Alan Turing Proposal For 'ACE' Automatic Computing Engine
ENIAC Unveiled
ENIAC, short for Electronic Numerical Integrator And Computer, was announced
Frederick Williams Receives Patent for RAM device
James B. Sumner, John H. Northrop, and Wendell M. Stanley share a Nobel Prize in Chemistry for Sumner's discovery that enzymes can be crystallized and for Northrop and Stanley's preparation of enzymes and virus proteins in a pure form.
Nobel Prize in Medicine awarded to H. J. Muller for his contributions to radiation genetics
Genetic recombination in bacteriophage is demonstrated by M. Delbrück and W. T. Bailey and by A. D. Hershey.
J. Lederberg and E. L. Tatum demonstrate genetic recombination in bacteria.
Along the Kolyma River in northeastern Siberia, Gulag prisoners discover a nest with three frozen, mummified ground squirrel carcasses. They turn the carcasses over to the Gulag camp geologist, Yuriy Popov, who relays them to other Soviet scientists. Seventy years later, radiocarbon dating and DNA analysis of the ground squirrel mummies will identify their age at over 30,000 years old, and indicate that they are not direct ancestors of modern ground squirrels in the region.
Geologist Reg Sprigg discovers fossils near the Ediacara Hills in Australia. The fossils are of multicellular organisms that predated the Cambrian Period, making them the oldest complex fossils yet discovered. At least some of the fossils are generally assumed to be related to modern cnidarians like jellyfish and corals.
On April 15, Jackie Robinson of the Brooklyn Dodgers becomes the first African-American to play major league baseball in the 20th century.
J Lyons executives report on the potential of computers to automate clerical work
The Williams tube won the race for a practical random-access memory
Dennis Gabor invents holography.
Harold Edgerton develops the Rapatronic camera for the U.S. government.
American Museum of Natural History curator Edwin Colbert finds a massive quarry of Coelophysis dinosaurs in New Mexico and concludes from their skeletons that these Triassic dinosaurs were swift runners with a bird-like posture. Later examination of two fossils will lead Colbert to conclude they were also cannibals, but the "last meals of juvenile coelophyses" will eventually prove to be crocodilian.
Rudolph Zallinger completes The Age of Reptiles mural in the Yale Peabody Museum. His image of slow-moving dinosaurs will prevail until the 1960s.
André Paul Guillaume Gide awarded the Nobel Prize in Literature "for his comprehensive and artistically significant writings, in which human problems and conditions have been presented with a fearless love of truth and keen psychological insight".
Painting by Jackson Pollock: Reflection of the Big Dipper, consisting of built-up layers of paint with dripped enamel as the final touch, concluding the composition. It was around 1947 that Jackson Pollock traded in his brushes for sticks, trowels and knives and began adding foreign matter, such as sand, broken glass, nails, coins, paint-tube tops and bottle caps to his canvases. Reflection of the Big Dipper was exhibited at the Betty Parsons Gallery in 1948, along with sixteen other paintings by Jackson Pollock. The show received positive reviews. Pollock's works from this time are a transitional step between a more traditional handling of paint and his revolutionary technique of dripping paint on canvases of a large scale.
Gentleman's Agreement wins Academy Award for best picture. The Elia Kazan-directed drama, starring Gregory Peck, was another hot-button winner, as it addressed the topic of anti-Semitism.
|
||||||
correct_award_00093
|
FactBench
|
1
| 0
|
https://www.britannica.com/biography/Johannes-V-Jensen
|
en
|
Johannes V. Jensen | Nobel Prize, Novels, Poetry
|
[
"https://cdn.britannica.com/mendel/eb-logo/MendelNewThistleLogo.png",
"https://cdn.britannica.com/30/129330-131-2C73ED00/The-Lamb-edition-Songs-of-Innocence-William.jpg?w=200&h=200&c=crop",
"https://cdn.britannica.com/76/168476-131-42654A68/letter-Sun-Rose-Dickinson-Emily-A-Life.jpg?w=200&h=200&c=crop",
"https://cdn.britannica.com/91/185691-131-0ACC5DF8/Emily-Dickinson-1847.jpg?w=200&h=200&c=crop",
"https://cdn.britannica.com/23/181923-131-76FC0CA5/poet-clothing-poetry-festival-Heian-Kyoto-shrine.jpg?w=200&h=200&c=crop",
"https://cdn.britannica.com/55/142355-131-EFF621AF/books-Stack-literature-pile-reading-entertainment-society-2010.jpg?w=200&h=200&c=crop",
"https://cdn.britannica.com/52/242652-050-EB37F33D/summer-reading-bingo-card.jpg?w=200&h=200&c=crop",
"https://cdn.britannica.com/01/116101-131-5557E1FF/Abergavenny-Monmouthshire-Wales.jpg?w=200&h=200&c=crop",
"https://cdn.britannica.com/48/195548-131-20D9FBCC/Girl-Reading-Turquoise-Couch.jpg?w=200&h=200&c=crop",
"https://cdn.britannica.com/38/114738-131-1EC2D535/assassination-Pres-John-Wilkes-Booth-Abraham-Lincoln-April-14-1865.jpg?w=200&h=200&c=crop",
"https://cdn.britannica.com/68/220368-131-C835E48E/United-States-electoral-college-votes-by-state.jpg?w=200&h=200&c=crop",
"https://cdn.britannica.com/18/418-131-A96118F7/Eagle-Grumman-lunar-module-Apollo-11-footpads-July-20-1969.jpg?w=200&h=200&c=crop",
"https://cdn.britannica.com/93/173193-131-3EE3B458/Nelson-Mandela.jpg?w=200&h=200&c=crop",
"https://cdn.britannica.com/43/193443-131-17ABE1C9/Union-Jack-flag-Great-Britain-united-kingdom.jpg?w=200&h=200&c=crop",
"https://cdn.britannica.com/02/90702-131-2EC3F987/The-Time-Machine-Morlocks-film-version-George-1960.jpg?w=200&h=200&c=crop",
"https://cdn.britannica.com/15/95015-131-5E505098/statues-Moai-Easter-Island.jpg?w=200&h=200&c=crop",
"https://cdn.britannica.com/30/129330-131-2C73ED00/The-Lamb-edition-Songs-of-Innocence-William.jpg"
] |
[] |
[] |
[
"Johannes V. Jensen",
"encyclopedia",
"encyclopeadia",
"britannica",
"article"
] | null |
[
"The Editors of Encyclopaedia Britannica"
] |
1999-01-14T00:00:00+00:00
|
Johannes V. Jensen was a Danish novelist, poet, essayist, and writer of many myths, whose attempt, in his later years, to depict man’s development in the light of an idealized Darwinian theory caused his work to be much debated. He received the Nobel Prize for Literature in 1944. Of old peasant
|
en
|
/favicon.png
|
Encyclopedia Britannica
|
https://www.britannica.com/biography/Johannes-V-Jensen
|
Johannes V. Jensen (born Jan. 20, 1873, Farsø, Den.—died Nov. 25, 1950, Copenhagen) was a Danish novelist, poet, essayist, and writer of many myths, whose attempt, in his later years, to depict man’s development in the light of an idealized Darwinian theory caused his work to be much debated. He received the Nobel Prize for Literature in 1944.
Of old peasant stock and the son of a veterinarian, Jensen went to Copenhagen to study medicine but turned to writing. He first made an impression as a writer of tales. These works fall into three groups: tales from the Himmerland, tales from Jensen’s travels in the Far East (for which he was called Denmark’s Kipling), and more than 100 tales published under the recurrent title Myter (“Myths”). His early writings also include a historical trilogy, Kongens Fald (1900–01; The Fall of the King, 1933), a fictional biography of King Christian II of Denmark. Shortly thereafter, as a result of his travels in the United States, came his Madame d’Ora (1904) and Hjulet (1905; “The Wheel”). In 1906 he published a volume of poems, and late in life he returned to poetry, his Digte, 1901–43 being the result.
|
||||
correct_award_00093
|
FactBench
|
1
| 15
|
https://knowledgezone.co.in/resources/gk/history/5c43182c936cc70008a9d2ab
|
en
|
Your Gateway to Knowledge
|
[] |
[] |
[] |
[
"Knowledge",
"Career",
"Job",
"Scholarsip",
"Admission",
"Course",
"Bookmarks",
"Quiz",
"Examination"
] | null |
[] | null |
Knowledge Zone - Social Knowledge Sharing Platform
|
en
|
/assets/images/icons/kzone_icon.png
|
Knowledge Zone
|
https://knowledgezon.co.in/
| |||||
correct_award_00093
|
FactBench
|
2
| 17
|
https://www.encyclopedia.com/arts/culture-magazines/jensen-johannes-v-20-january-1873-25-november-1950
|
en
|
Jensen, Johannes V. (20 January 1873 - 25 November 1950)
|
[
"https://www.encyclopedia.com/themes/custom/trustme/images/header-logo.jpg"
] |
[] |
[] |
[
"Johannes V. Jensen (20 January 1873 - 25 November 1950)Sven Hakon Rossel University of Vienna1944 Nobel Prize in Literature Broadcast Presentation"
] | null |
[] | null |
Johannes V. Jensen (20 January 1873 - 25 November 1950)Sven Hakon Rossel University of Vienna1944 Nobel Prize in Literature Broadcast Presentation Source for information on Jensen, Johannes V. (20 January 1873 - 25 November 1950): Nobel Prize Laureates in Literature, Part 2 dictionary.
|
en
|
/sites/default/files/favicon.ico
|
https://www.encyclopedia.com/arts/culture-magazines/jensen-johannes-v-20-january-1873-25-november-1950
|
Johannes V. Jensen (20 January 1873 - 25 November 1950)
Sven Hakon Rossel
University of Vienna
1944 Nobel Prize in Literature Broadcast Presentation
1944 Nobel Prize in Literature Presentation Speech
Jensen: Banquet Speech
Jensen: Autobiographical Statement
Bibliographies
Biographies
References
Papers
This entry was expanded by Rossel from his Jensen entry in DLB 214: Twentieth-Century Danish Writers.
BOOKS: Danskere (Copenhagen: Det Nordiske, 1896);
Einar Elkær: Roman (Copenhagen: Det Nordiske, 1898);
Himmerlandsfolk: Historier (Copenhagen: Det Nordiske, 1898);
Intermezzo: Dolores, Forsvundne Skove, Louison (Copenhagen: Det Nordiske, 1899);
Foraarets Død (Copenhagen: Det Nordiske, 1900);
Den store Sommer (Copenhagen: Det Nordiske, 1900);
Den gotiske Renaissance (Copenhagen: Det Nordiske, 1901);
Vinteren (Copenhagen: Det Nordiske, 1901);
Kongens Fald (Copenhagen: Det Nordiske, 1901)–comprises Foraarets Død, Den store Sommer, and Vinteren; translated by P. T. Federspiel and Patrick Kirwan as The Fall of the King (London: Grayson & Grayson, 1933; New York: Holt, 1933);
Kirken i Farsø: Skitse (Minneapolis & Chicago: C. Rasmussen, 1903);
Madame D’Ora (Copenhagen: Gyldendal, 1904);
Nye Himmerlandshistorier (Copenhagen: Gyldendal, 1904)–includes “Kirstens sidste Rejse,” translated by Lee Marshall as “Kirsten’s Last Journey,” in Anthology of Danish Literature: Bilingual Edition, edited by F.J. Billeskov Jansen and P. M. Mitchell (Carbondale: Southern Illinois University Press, 1971), pp. 300-379;
Skovene (Copenhagen: Gyldendal, 1904);
Hjulet (Copenhagen: Gyldendal, 1905);
Digte (Copenhagen: Gyldendal, 1906); revised and enlarged as Digte: Anden Udgave, 1917; revised and enlarged as Digte: Tredie Udgave, 1921; revised and enlarged as Digte: 1901–1941, 1943; revised and enlarged as Digte, 1948–includes “Ved Frokosten” and “Paa Memphis Station,” translated by Alexander Taylor as “At Lunch” and “At Memphis Station,” in Contemporary Danish Poetry, edited by Line Jensen and others (Copenhagen: Gyldendal, 1958), pp. 91-92;
Myter og Jagter (Copenhagen: Gyldendal, 1907)– includes “Fusijama,” translated by Elias Bredsdorff as “Fujiyama,” in his Contemporary Danish Prose: An Anthology (Copenhagen: Gyldendal, 1958), pp. 85-90;
Den ny Verden: Til international Belysning of nordisk Bondekultur (Copenhagen: Gyldendal, 1907);
Singapore Noveller (Copenhagen: Gyldendal, 1907);
Nye Myter (Copenhagen: Gyldendal, 1908);
Bræen: Myter om Istiden og det første Menneske (Copenhagen: Gyldendal, 1908); translated by Arthur G. Chater in Fire and Ice, volume 1 of The Long Journey (London: Gyldendal, 1922; New York: Knopf, 1923);
Lille Ahasverus (Copenhagen: Gyldendal, 1909);
Myter: Ny Samling (Copenhagen: Gyldendal, 1910);
Himmerlandshistorier: Tredie Samling (Copenhagen: Gyldendal, 1910);
Nordisk Aand: Kroniker og Karakteristiker (Copenhagen: Gyldendal, 1911);
Skibet (Copenhagen: Gyldendal, 1912);
Myter: Fjerde Samling (Copenhagen: Gyldendal, 1912);
Rudyard Kipling (Copenhagen: Gyldendal, 1912);
Olivia Marianne (Copenhagen: Gyldendal, 1915);
Introduktion til vor Tidsalder (Copenhagen: Gyldendal, 1915);
Aarbog l916 (Copenhagen: Gyldendal, 1916);
Eksotiske Noveller (Copenhagen: Gyldendal, 1916)–comprises Singapore Noveller, Lille Ahasverus, and Olivia Marianne;
Aarbog l917 (Copenhagen: Gyldendal, 1917);
Norne-Gæst (Copenhagen: Gyldendal, 1919); translated by Chater in The Cimbrians, volume 2 of The Long Journey (London: Gyldendal, 1923; New York: Knopf, 1923);
Det tabte Land: Mennesket før Istiden (Copenhagen: Gyldendal, 1919); translated by Chater in Fire and Ice, volume 1 of The Long Journey (London: Gyldendal, 1922; New York: Knopf, 1923);
Johannes Larsen og hans Billeder (Copenhagen: Gyldendal, 1920);
Christofer Columbus (Copenhagen: Gyldendal, 1921); translated by Chater as Christopher Columbus, volume 3 of The Long Journey (London: Gyldendal, 1924; New York: Knopf, 1924);
Sangerinden (Madame d’Ora): Drama i fem Akter (Copenhagen: Gyldendal, 1921);
Cimbrernes Tog (Copenhagen: Gyldendal, 1922); translated by Chater in The Cimbrians, volume 2 of The Long Journey (London: Gyldendal, 1923; New York: Knopf, 1923);
Æstetik og Udvikling: Efterskrift til Den lange Rejse (Copenhagen: Gyldendal, 1923);
Aarstiderne, illustrated by Johannes Larsen (Copenhagen: Gyldendal, 1923);
Myter: Tredie Bind, 1914–1924 (Copenhagen: Gyldendal, 1924);
Hamlet: Til Forklaring of Hamletskikkselen (Copenhagen: Gyldendal, 1924);
Evolution og Moral (Copenhagen: Gyldendal, 1925);
Aarets Højtider (Copenhagen: Gyldendal, 1925);
Verdens Lys: Nye Digte (Copenhagen: Gyldendal, 1926);
Jørgine (Copenhagen: Hage & Clausens Forlag, 1926);
Dyrenes Forvandling: Til Udviklingens Plastik (Copenhagen: Gyldendal, 1927);
Ved Livets Bred og andre Myter (Copenhagen: Gyldendal, 1928);
Aandens Stadier (Copenhagen: Gyldendal, 1928);
Retninger i Tiden: Artikler 1925–30 (Copenhagen: Gyldendal, 1930);
Den jydske Blæst: Digte 1926–1930 (Copenhagen: Gyldendal, 1931);
Form og Sjæl: Portræter og Personligheder (Copenhagen: Gyldendal, 1931);
Paa danske Veje, illustrated by Larsen (Copenhagen: Gyldendal, 1931);
Pisangen (Copenhagen: Gyldendal, 1932);
Kornmarken (Copenhagen: Gyldendal, 1932);
Sœlernes Ø (Copenhagen: Gyldendal, 1934);
Det Blivende: Tankens Revolutionering i det 19de Aarhundrede og Tilbagefaldet i det 20de (Copenhagen: Gyldendal, 1934);
Dr. Renaults Fristelser (Copenhagen: Gyldendal, 1935);
Gudrun (Copenhagen: Gyldendal, 1936);
Darduse, Bryllupet i Peking: Eventyrkomedie i fire Akter (Copenhagen: Gyldendal, 1937);
Paaskebadet: Digte 1931–1937 (Copenhagen: Gyldendal, 1937);
Jydske Folkelivsmalere: Dalsgaard, Michael Ancher, Hans Smidth (Copenhagen: Arthur Jensen, 1937);
Den lange Rejse, 2 volumes (Copenhagen: Gyldendal, 1938)–comprises volume 1, Det tabte Land, Bræen, and Norne-Gæst; and volume 2, Cimbrernes Tog, Skibet, and Christofer Columbus;
Thorvaldsen: Haandvœrkeren og Manden (Copenhagen: Arthur Jensen, 1938);
Nordvejen: Indtryk af norsk Natur (Copenhagen: Gyldendal, 1939);
Fra Fristaterne: Rejsebreve, med et Tilbageblik (Copenhagen: Gyldendal, 1939);
Gutenberg: Til Bogtrykkerkunstens Historie, by Jensen and Aage Marcus (Copenhagen: Bianco Lunos Bogtrykkeri, 1939);
Mariehønen (Copenhagen: Gyldendal, 1940);
Udvalgte Prosastykker, edited by Morten Borup and Peter Iløse (Copenhagen: Gyldendal, 1940);
Mindets Tavle: Portrœter og Personligheder (Copenhagen: Gyldendal, 1941);
Vor Oprindelse (Copenhagen: Gyldendal, 1941);
Om Sproget og Undervisningen (Copenhagen: Gyldendal, 1942);
Kvinden i Sagatiden (Copenhagen: Gyldendal, 1942);
Folkeslagene i Østen (Copenhagen: Gyldendal, 1943);
Møllen (Copenhagen: Gyldendal, 1944);
Myter, 2 volumes (Copenhagen: Gyldendal, 1946);
Bogbinderen (Copenhagen: Printed by J. H. Schultz, 1947);
Afrika: Opdagelsesrejserne (Copenhagen: Gyldendal, 1949);
Danske Køretøjer (Copenhagen: Thaning & Appel, 1949);
Swift og Oehlenschläger (Copenhagen: Gyldendal, 1950);
Tilblivelsen (Copenhagen: Gyldendal, 1951);
Mytens ring: Efterladte myter og beskrivelser (Copenhagen: Gyldendal, 1957);
Ungt er endnu Ordet: Portræter og Personligheder, edited by Carl Bergstrøm-Nielsen (Copenhagen: Gyldendal, 1958);
Troth, edited by Sven Hakon Rossel (Copenhagen: Museum Tusculanum, 2002).
Editions and Collections: Den jydske blæst og andre digte, selected by Ole Wivel (Copenhagen: Gyldendal, 1957);
Bræen, edited by Martin Larsen (Copenhagen: Gyldendal, 1963);
Himmerlandshistorier, edited, with an afterword, by Jørgen Elbek, Gyldendals Bibliotek, no. 24 (Copenhagen: Gyldendal, 1963);
Johannes Larsen og Aarstiderne, edited, with a foreword, by Aage Marcus, Gyldendals Uglebøger, no. 56 (Copenhagen: Gyldendal, 1963);
Jordens Kreds, selected by Marcus, introduction by Niels Birger Wamberg (Copenhagen: Gyldendal, 1967);
Myter i Digte i Udvalg, selected by Leif Nedergaard (Copenhagen: Gyldendal, 1969);
Himmerlandshistorier, edited by Povl Marstal (Copenhagen: Gyldendal, 1970);
Mørkets frodighed; Tidlige myter, selected by Wamberg (Copenhagen: Gyldendal, 1973);
12 Himmerlandshistorier, edited by Sven Moller Kristensen (Copenhagen: Gyldendal, 1979);
Tretten Myter: Johannes V. Jensen, selected and illustrated by Jens Jensen (Copenhagen: Gyldendal, 1982);
Himmerlandshistorier: Et udvalg, selected by Sonja Carlberg (Copenhagen: Gyldendal, 1984);
Kender du Johannes V. Jensen, selected by Margit Mørk (Copenhagen: Grafisk, 1986);
Christofer Columbus, foreword by Ib Michael (Copenhagen: Gyldendal, 1992);
Madame D’Ora; Hjulet, edited by Sven Hakon Rossel (Copenhagen: Det Danske Sproge-og Litteraturselskab/Borgen, 1997);
Digte: Johannes V. Jensen, edited by Frits Johansen (Copenhagen: Gyldendal, 1998).
Editions in English: “Ane og Koen,” translated by Victor Folke Nelson as “Ann and the Cow”; “Forsvundne Skove,” translated by Henry Commager as “Lost Forests,” in Denmark’s Best Stories, edited by Hanna Astrup Larsen (New York: American-Scandinavian Foundation/Norton, 1928), pp. 327-340;
Garden Colonies in Denmark, translated by F. Aubrey Rush (Copenhagen: Danske selskab, 1949);
Denmark’s Johannes V. Jensen, translated by Marion L. Nielsen (Logan: Utah State Agricultural College, 1955);
The Waving Rye, selected by C. A. Bodelsen, translated by Ronald Bathgate and others (Copenhagen: Gyldendal, 1958; New York: American-Scandinavian Foundation, 1959);
The Fall of the King, translated by Alan Bower and edited by Sven Hakon Rossel (Seattle: Mermaid Press, 1992; revised edition, Traverse City: Stonehill, 1995).
PLAY PRODUCTIONS: Trods, Chicago, Scandia Hall, 1 February 1903;
Sangerinden, Odense, Odense Teater, 16 November 1923;
Darduse, Bryllupet i Peking: Eventyrkomedie i fire akter, Copenhagen, Royal Theater, 22 January 1937;
Hamlet, translation of William Shakespeare’s play, Copenhagen, Royal Theater, 24 April 1937.
OTHER: Jack London, Naar Naturen kalder, translated by Aslaug Mikkelsen, foreword by Jensen (Copenhagen: Peter Hansen, 1907);
Ditleff von Zeppelin, Fugletrak, edited by Jensen and Otto Gelsted (Copenhagen: Gyldendal, 1916);
Thorvaldsens Portrœtbuster, introduction by Jensen, biographical notes by Aage Marcus (Copenhagen: Gyldendal, 1926).
TRANSLATIONS: Frank Norris, Af Hvedens Saga: Polypen, en Bog om Kalifornien (Copenhagen: Gyldendal, 1907);
Rudyard Kipling and Wolcott Balestier, Nauhlaka: Forædling fra Vesten og Østen, translated by Jensen and Aslaug Mikkelsen (Copenhagen: V. Pio, 1911);
Kipling, Fribytterbreve; De rædselsfulde Nætters By og andre Skizzer, translated by Jensen and Mikkelsen (Copenhagen: V. Pio, 1912);
Kipling, Fra Hav til Hav, translated by Jensen and Mikkelsen (Copenhagen: V. Pio, 1913);
Kipling, Liv og Drøm, translated by Jensen and Mikkelsen (Copenhagen: V. Pio, 1913);
Kipling, Med Natexpressen Aar 2000 og andre Fortœllinger, translated by Jensen and Mikkelsen (Copenhagen: V. Pio, 1914);
Rudolf Requadt, Krigsflyveren (Copenhagen: Fr. Ravn, 1916);
Adelbert von Chamisso, Peter Schlemihls vidunderlige Historie (Copenhagen: Høst & Søn, 1918);
Walt Whitman, Digte, translated by Jensen and Otto Gelsted (Copenhagen: Nyt Nordisk, 1919);
De islandske Sagaer, 3 volumes, translated by Jensen, Knud Hjortø, and Hans Kyrre (Copenhagen: Gyldendal, 1930–1932);
William Shakespeare, Hamlet (Copenhagen: Gyldendal, 1937);
Egil Skallagrimssons Saga (Copenhagen: Gyldendal, 1943);
Snorre Sturlason, Heimskringla: Norges Kongesagaer, translated and edited by Jensen and Kyrre (Copenhagen: Gyldendal, 1948).
SELECTED PERIODICAL PUBLICATIONS–UNCOLLECTED: Skatten paa Korsøgaard: Fortælling fra Aarhundredets Begyndelse, as Ivar Lykke, Revuen (2 January-21 July 1895);
Dødssejleren: Fortælling, as Lykke, Revuen (24 July 1895–26 January 1896);
Blodfesterne i Arizona: Mexikansk, historisk Fortælling, as Lykke, Revuen (26 January-31 May 1896);
Jim Blacksools Revolver: Roman fra det fjerne Vesten, as Lykke, Revuen (31 May-30 September 1896);
Falskmønterbandens blodige Bog: Kriminal-Roman, as Lykke, Revuen (1 July-13 December 1896);
Nihilistens Ed: En Nutidsroman, as Lykke, Revuen (19 July - 23 September 1896);
Taterkongens ni Sønner og deres Blodhœvn: Roman fra Dronning Margrethes Tid, as Lykke, Revuen (9 December 1896–5 May 1897);
Milliontyvenes Høvding eller den røde Tiger: Original illustreret New Yorkerroman, as Lykke, Revuen (9 May-22 September 1897);
Hakon Blodøxes Bedrifter: Roman fra Vikingetiden, as Lykke, Revuen (26 September 1897 – 9 March 1898);
Ridder Tages Dødsridt eller de blodige Sporer: Original historisk Roman fra Valdemar den Stores Tid, as Lykke, Revuen (13 March-20 July 1898).
By revolting against the introspection of Danish turn-of-the-century literature and the psychological and social naturalism of the nineteenth century, Johannes V. Jensen became one of the most prolific, innovative spirits in Danish cultural life. His worship of modern science and technology, the bustling life in the international metropolis, and pragmatic materialism and capitalism made him instrumental in the reorientation of Danish literature away from continental French and German models toward Anglo-American cultural life. His enthusiasm for American literature resulted in the introduction and promotion in Denmark of the works of many American writers.
Charles Darwin’s theories were an important source of inspiration for Jensen’s depictions of nature scenes and animals but had a disastrous impact on his questionable evolutionary and anthropological ideas. Despite his scientific and anti-idealistic orientation, Jensen was never able to let go of his deep-rooted dependence on the Jutland peasant traditions of his childhood and an equally deep-rooted fascination with Christian metaphysics. This contradiction or split resulted in an existential insecurity that manifests itself in a fragmentation both with regard to content and form, which gives Jensen’s fictional work continuing relevance. It places him in the modernist tradition of the twentieth century, adding to some of his texts a vibrant and eclectic, almost postmodern, quality. However, when Jensen was able to bridge this split in his myths and poetry, he created pieces of timeless art in which observation and vision, present time and eternity, reality and dream are seamlessly merged.
Johannes Vilhelm Jensen was born on 20 January 1873 in Farsø in the northern Jutland province of Himmerland. His parents were both of peasant stock. His mother, Marie Kirstine Jensen, had a prosaic and practical view of life, but she also possessed a vivid imagination and a hot temper; his father, Hans Jensen, was the district veterinarian and was an expert in botany and zoology. He inspired Jensen’s later studies of nature and discovery of Darwin’s evolutionary theories. Although the family was strongly antireligious, in the late 1880s Jensen’s father became interested in spiritualism, an interest that became lifelong for Jensen’s sister, the writer Thit Jensen. Johannes V. Jensen later criticized this occult interest, although it was undoubtedly one of the causative factors for the longing for eternity and spiritual expansion that became an essential feature in his writing. Jensen also had one younger brother, who became a painter under the name Hans Deuvs.
In the autobiographical sketch Kirken i Farsø: Skitse (1903, The Church in Farsø: Sketch), Jensen described his boyhood with his siblings and friends from the small town and the neighborhood farms, emphasizing his boldness and extroversion. A somewhat different description comes from one of Jensen's friends, Peder Bach, quoted in Oluf Friis's 1974 biography of Jensen: "Han var en besynderlig Dreng, ikke som de andre, men for det meste tavs og indesluttet, og han gik gerne og saa ned i jorden i sine egne dybe Tanker; men til Tider kunde han vaagne op, og da husker jeg at hans Snebolde blev temmelig haarde" (He was a strange boy, not like the others, but mostly silent and reticent, and he usually walked around looking down deep in his thoughts; but at times he could wake up, and then I remember that his snowballs became rather hard). A characteristic trait was his joy in reading. Jensen himself, in Mytens ring: Efterladte myter og beskrivelser (1957, The Ring of the Myth: Posthumous Myths and Descriptions), recalled his first books: children's readers; accounts by Captain James Cook, Henry Morton Stanley, and David Livingstone of their expeditions; a history of Denmark; and Hans Christian Andersen's tales–all works that he found on his father's bookshelves.
After two years at Farsø School, followed by private tutoring, Jensen went in 1890 to Viborg Katedralskole (cathedral school) for three years, where he became familiar with the humanistic, bourgeois educational tradition that for hundreds of years had formed the basis for spiritual life in Denmark. Jensen's years in Viborg, depicted in the first two chapters of his first novel, Danskere (1896, Danes), were not harmonious, and his opposition was nourished in particular by reading the German poet Heinrich Heine, whom Jensen called, in his monograph Rudyard Kipling (1912), "denne Dynamitsjæl" (this explosive spirit), and the British author Rudyard Kipling, in whose works Jensen experienced a new world of activity and international settings. Although classes in Viborg did not include modern literature, Jensen read works by contemporary writers in private. Both the neo-Romantic Danish poet Johannes Jørgensen and the Norwegian Knut Hamsun, whom Jensen regarded as the first to break with the older realism and naturalism–what Jensen called "den galliske knirkende Fornuftspoesi" (the Gallic, creaking literature of reason)–became additional models for the future writer.
Jensen passed his university entrance exams in 1893 and began the study of medicine at the University of Copenhagen in the fall. In January 1894 he published four poems, basically derivative of neo-Romantic literature, that include motifs foreshadowing his later settlement with the introverted and spiritual attitude that was prevalent in Danish literature at the time of his debut and remained part of Jensen’s own personality. Likewise without artistic quality are ten serial novels written under the pseudonym Ivar Lykke and published in the Copenhagen journal Revuen between 1895 and 1898. In Mytens ring, Jensen hints at the main ingredients of these serial novels, remarking that “Hvert Kapitel havde sit horrible Mord” (Each chapter contained a horrible murder). The novels hold no trace of his later mastery of style but are not without importance, as in them Jensen introduces motifs that he took up again later.
Jensen had his true literary debut in 1896 with the novel Danskere. After he abandoned his medical studies in 1898, he published Einar Elkær: Roman (1898, Einar Elkær: Novel). Between the publication of the two novels he took a brief trip to New York City in 1896, the first of many travels that, altogether, brought Jensen to the United States six times. Both the character Buris in the first novel and Einar Elkær in the second are students from the provinces who are confronted with the modern metropolis, Copenhagen. They are obsessed by paralyzing self-absorption that prevents them from establishing a spontaneous rapport with other people, in particular with women. Whereas Jensen hints at the possibility that the disintegration of Buris's personality may stop, Einar constantly lapses into his ravings and dreams and dies at a mental hospital, where "Sektionen viste blød Hjærne" (the autopsy showed a soft brain). Even though the two novels are strongly dependent on literary models–their melancholy atmosphere, big-city sceneries, and self-reflective protagonists can be found in the early works of Jørgensen and Hamsun–their rebellion against both the fin-de-siècle spirit of the 1890s and literary and philosophical authority in general has a genuine ring. Jensen later excluded his first two books from lists of his works, perhaps because he recognized too much of his own introverted personality in his two protagonists. They are desperate outsiders, whose longing for happiness finds no fulfillment. This longing, which in so many of Jensen's characters manifests itself as a longing to travel, is in reality a longing for the expansion of the soul.
Introspection remains a major issue in the two travelogues, Intermezzo: Dolores, Forsvundne Skove, Louison (1899, Intermezzo: Dolores, Lost Forests, Louison) and Skovene (1904, The Woods), both written under the influence of Heine's capricious, ironic style. The first was based on Jensen's two visits as a reporter for the liberal newspaper Politiken to Spain, Germany, and France in 1898 and also includes the first of his many attacks on the works of Friedrich Nietzsche, whose theories of the superman constituted the "bad Darwinism" that Jensen later saw as the indirect cause of the two world wars. The second book was inspired by Jensen's five-week stay in Singapore and on the Malay Peninsula during his first trip around the world in 1902 and 1903, which also took him to China, Japan, and the United States. In his description of a tiger hunt Jensen incorporates lyrical and satirical passages, witticisms, and brilliantly executed, precise but poetic descriptions of animals and nature. The strong stylistic contrasts in the volume reflect the self-ironic and didactic approach Jensen takes toward his own glorification of the primitive, against which he sets his homesickness and longing.
The reworked travel letters from his visit to Spain in 1898 and another to the World's Exhibition in Paris in 1900, which Jensen incorporated into his Den gotiske Renaissance (1901, The Gothic Renaissance), on the other hand, include an enthusiastic endorsement of progress and reality: "Det tyvende Aarhundrede suser over Hovedet. Jeg bekender mig til Virkeligheden, jeg bekender" (The twentieth century roars above our heads. I profess to reality, I profess). This work glorifies the expansive spirit of the Gothic, that is, Anglo-Saxon, race, the fullest expression of which Jensen found in the American pragmatic and progressive view of life as it brought liberation from the decadence of the previous century. The volume climaxes in Walt Whitman–inspired prose hymns to progress and technology. Jensen's theory was that the Gothic race had its origin in his home region, Himmerland. Its nature and people are portrayed in the realistic short stories in Himmerlandsfolk: Historier (1898, Himmerland People: Stories), which constitute a counterbalance to Jensen's introspective writings from the same period, and in the two collections Nye Himmerlandshistorier (1904, New Himmerland Stories) and Himmerlandshistorier: Tredie Samling (1910, Himmerland Stories: Third Collection). The early texts are marked by Jensen's preoccupation with the meaninglessness of day-to-day existence and death. Later stories include masterful character studies of grotesque, roaming eccentrics, heroic accounts of man's stubborn fight against either sordid surroundings or the callous forces of nature, and comic or satiric exposures of human folly. Jensen's intimate knowledge of the flaws and meanness of his characters is balanced by a profound veneration for the old peasant traditions they also represent. In such texts Jensen has distanced himself from the uncritical glorification of technology and progress in Den gotiske Renaissance, and these stories are far above traditional regional literature.
In 1900 and 1901 Jensen published an historical novel in three parts: Foraarets Død (1900, Spring’s Death), Den store Sommer (1900, The Full Summer), and Vinteren (1901, Winter). Jensen combined the works into a single volume under the title of Kongens Fald (1901; translated as The Fall of the King, 1933). With Kongens Fald, Jensen successfully merged the extrovert/naturalistic and introvert/spiritual elements in his writing into a splendid mythic composition. In Jensen’s works there are many attempts at defining “the mythic.” In his 1932 article on Jensen, Aage Marcus reports Jensen as saying: “Leave out the plot, concentrate on those short flashes of the essence of things that illumine man and time, and you have the myth.” As Jensen writes in his Aarbog 1916 (Yearbook 1916), his point of departure is generally a concrete observation from which an expansion in time and space takes place, a technique that entails a revelation, “ingen lang møjsommelig Opregning men et Spring ind i et Billede” (rather a leap into an image than a long painstaking account). Crucial components of the myth are the tensions between the close and the distant, the tangible and the transcendental, and the concrete and the inexplicable, frequently establishing a perspective of time in which present, past, and future are bound together.
Kongens Fald can be read as a purely historical novel, attacking the passivity and indecisiveness that Jensen perceived as a major component of Danish mentality. These negative qualities are embodied in the Renaissance king Christian II and his companion, the mercenary Mikkel Thøgersen. However, Jensen ignores both historical accuracy and a structured psychological character delineation. Instead, by mingling dream-like passages of poetic beauty with harsh, naturalistic scenes of violence and destruction, he creates magnificent, deeply pessimistic visions of man’s inability to reach happiness. Like Buris in Danskere, the introverted outsider Mikkel completely lacks the ability to devote himself to enjoying the present: he can only act when his anguish turns into hatred and blind destructiveness. He rapes Ane Mette, the fiancée of his rival Otte Iversen, and many years later he kills Otte’s son, the carefree and spontaneous Axel, whose success with women stirs Mikkel’s feelings of alienation, envy, and hatred. He assaults the unarmed Axel, who later, without any bitterness, dies of blood poisoning, fever-stricken and dreaming that he is sailing into “den store Sommers Land, Dødens Land” (the land of full summer, the land of death). Axel’s “fall,” one of the lyrical highlights of the novel, corresponds on the historical level to the king’s “fall” during the fateful night in 1523 when Christian II, accompanied by Mikkel, irresolutely sails back and forth between two regions of Denmark wondering whether or not to take up the fight against the rebellious nobility–a dramatic highlight and at the same time a penetrating analysis of the paralysis of action by doubt. However, of crucial importance is not the outward fall as demonstrated in the king’s destiny. Jensen’s masterpiece must be read as a book about the inner fall, about man’s impermanence, and about death as his inexorable destiny in the midst of burgeoning life, illustrated in Axel’s fate. 
Kongens Fald is a book about the total absurdity of life and love, illustrated in Mikkel’s person. All of humanity is subject to the law of the fall, and only death brings the desired peace.
On his first trip around the world Jensen crossed the Pacific Ocean from Japan, stopping in the Hawaiian Islands and disembarking in San Francisco on 26 October 1902. His stay in the United States, primarily in Chicago and New York during the winter and spring of 1902–1903, provided him with the scenery for the two novels Madame D’Ora (1904) and Hjulet (1905, The Wheel), with which Jensen–only in part successfully-intended to continue the antimetaphysical trend in his authorship. Intentionally he disregards the rules of the traditional, naturalistic novel as he sets out to portray not individuals but various stages in man’s evolution within the framework of a fierce Darwinian struggle for the “survival of the fittest.” In Madame D’Ora this struggle takes place between “the missing link”–the cynical lay preacher, charlatan, and murderer Evanston–and the scientist Edmund Hall, a Faustian character fascinated with the transcendental, a neurasthenic dreamer who not only fails to reciprocate the passionate and unconditional feelings of the opera singer Leontine D’Ora but also falls an easy prey to Evanston because of his preoccupation with spiritual experiences.
In the sequel, Hjulet, the young poet Lee, Jensen’s alter ego, kills Evanston, who has now changed his name to the symbolic Cancer and has become an even clearer example of the Nietzschean vulgarization of Darwinism that Jensen had earlier attacked in Intermezzo. Through his struggle Lee overcomes his earlier passivity and turns into a man of action who condemns all aesthetics as nothing but “en Sygdom i Sansen for Virkeligheden” (a disease in the perception of reality). Passages parodying the detective novels of Sir Arthur Conan Doyle alternate with lengthy monologues (in particular by the constantly talking Cancer), congenial translations of poems by Whitman, and Lee’s sweeping visions, ranging from prehistoric, evolutionary stages in man’s development to his view of Christopher Columbus as a man of Nordic descent and a bridge builder between Europe’s past and America’s present. These visions–indeed prose poems of compelling poetic force–as well as the totally negative portrayal of the representative of pure materialism, Evanston/Cancer, prove that Jensen could not let go of the aesthetics that he lets Lee reject. The enthusiasm for the United States expressed in these two novels foreshadows later works of fiction and nonfiction, as Jensen returned to both theories of Columbus and Faustian motifs.
After Jensen returned to Copenhagen from New York City in the summer of 1903, he published a series of newspaper articles. In these articles his violent criticism of Danish superficiality and provincialism demonstrates how difficult it was for him to resign himself to staying home. He was particularly irritated by Danish literary life, partly because of his own aggressive nature, which isolated him among colleagues, and partly because he rejected both the neo-Romantics and the radical circle around the influential critic Georg Brandes, whom he had earlier admired. He was also unhappy because his books received mostly negative reviews and sold poorly.
Nevertheless, the period from 1904 to 1906 was characterized by a hectic productivity: two novels, a travelogue, a new volume of stories from Himmerland, and finally an epoch-making collection of poetry. All these publications were projects that Jensen felt he wanted to finish and move on. Digte (1906, Poems; revised and enlarged, 1917, 1921, 1943, 1948) includes almost all of his youthful poetry, except for those poems that he had published during his first year at the university, which have never been collected. Two of Jensen’s earliest prose poems from 1901, “Interferens” (Interference) and “Ved Bordet” (At the Table), published in a newspaper, were revised and incorporated in Digte. Prose poetry came to dominate this volume, especially with the inclusion of the Whitman translations, first published in the novel Hjulet. This preponderance of free verse has contributed to the popular but erroneous view that Jensen’s poetry consists mostly of prose poetry, with Whitman as the predominant model. The truth is rather that Jensen’s prose poems, modeled after the free verse of Johann Wolfgang von Goethe and Heine, belong to the period of 1901 to 1906, after which he increasingly devoted himself to poems in regular meters with either Old Norse alliterative verse or the classical rhymed stanza as models.
Digte is a milestone in the development of Danish lyric poetry. Its highly developed, bold imagery is filled with contrasts and tension both in content and in style, a style that veers from cynical statement to ecstatic exclamation to heart-rending simplicity and tenderness–the heritage of the 1890s. This metaphoric language is based on a sharp sensory perception that often takes the shape of a merciless self-analysis at the same time as it incorporates images from modern technology and everyday life.
At the center of the volume are three texts, the two prose poems “Interferens” and “Paa Memphis Station” (At Memphis Station), and the ballad-like “Christofer Columbus.” In “Interferens,” Jensen seeks, as he does in several of the poems, to reconcile the clash between extroversion and introversion, optimism and pessimism, belief in progress and wish for death–the two poles in his writing. When they intersect or rather become fused in one single state of mind, the myth emerges, as in “Christofer Columbus,” originally published in Madame D’Ora, where it was a warning to Hall to abandon his insatiable ambitions to transgress the boundaries of empirical science. It can, however, also be read as an anticipation of the novel Christofer Columbus (1921; translated as Christopher Columbus, 1924), as a portrait of Jensen’s tragic alter ego, Columbus, who is invoked in the poem “Afsked” (Departure). Here, another crucial theme from Jensen’s novels, the longing to travel, is introduced in an attack on humdrum everyday life in provincial Denmark. The poem was written in 1902 just before Jensen’s voyage around the world. Inspired by the voyage itself is “Paa Memphis Station,” a commitment to a reality that must encompass even the experience of death. This realization ignites the poet’s zest for a life that must be conquered through travel. Thus Columbus must move on, but the outcome of his travel turns out to be tragic, as Jensen points out in the succeeding poem, “Hverdagene” (Everyday Life). Its concluding request to “gaa frygtløst ind i Hverdagene” (enter into everyday life fearlessly) is preceded by lines about the merciless passage of time and unavoidable death:
Somren slaar sine Kister i.
Unge er vi saa ikke mere.
Men har Haabet ikke beskæmmet os tilstrækkeligt?
Nu kommer vor rige Dødstid, Broder.
(Summer is closing its chests.
So we are young no longer.
But has hope not shamed us sufficiently?
Now comes a plentiful dying time, brother.)
Such lines point ahead to perhaps the most difficult of all lyrical genres, the memorial poem, a genre that Jensen mastered to a degree not reached by any other Danish writer. In the second edition of Jensen’s collected poems, Digte: Anden Udgave (1917, Poems: Second Edition), there are two such poems, “Leonora Christine” and “Bjørnstjerne Bjørnson,” written in traditional iambic meter. The first is a portrait of Leonora Christine, a Danish Renaissance princess who spent twenty-two years in prison and whose tragic destiny Jensen perceives in a mythic perspective. The second poem is a glorification of the active and extroverted Norwegian writer, written on the occasion of Bjørnson’s death in 1910. The poem is pervaded with pantheism; however, Jensen concludes with lines negating that pantheism: “O Solopgang paa Bjergets Sne – / ham skal du aldrig mere se” (Oh sunrise on the mountain’s snow– / you shall never see him again). The last poem in the volume is “Envoi,” which Jensen kept revising until it received its final form in the third edition of his collected poems. This poem is noteworthy for its haiku-like simplicity, with a perfect form embracing time and space, a myth concentrated around the eternity of love placed in the cycle of nature.
Jensen’s productivity after his return from his journey around the world in 1902 and 1903 was also caused by the need to make a living for his family. On 15 April 1904 he married twenty-six-year-old Else Marie Ulrik, with whom he had three sons, Jens, Villum, and Emmerick. On 2 July 1906 Jensen began his own newspaper, Pressen (The Press), with John Martin. Modeled on contemporary American tabloids, Pressen was filled with sensational news, cartoons, and many advertisements. As Jensen did not have any political or cultural program for the newspaper, he did not find any readers, and Pressen lasted only until 31 July. This project was preceded by several trips to Himmerland and Berlin, and at the same time Jensen began translating Frank Norris’s novel The Octopus (1901), eventually published as Af Hvedens Saga: Polypen, en Bog om Kalifornien (1907, From the Wheat’s Saga: The Octopus, a Book about California). By writing the foreword to Naar Naturen kalder, a translation in 1907 of Jack London’s The Call of the Wild (1903), Jensen called attention in Denmark to another American writer; he introduced a third American writer, Ernest Hemingway, to the Danish public with an essay in Politiken (30 May 1930).
As a result of a 1905 trip to New York City, Jensen began writing a series of articles in March 1906 for Politiken about journalism, literature, and social issues in the United States. These essays were republished in the collection Den ny Verden: Til international Belysning of nordisk Bondekultur (1907, The New World: For an International Illustration of Nordic Peasant Culture). The heroes of the book are the American reporter Norris, who successfully portrayed the hectic pioneer spirit during the growth of the United States, and Theodore Roosevelt, who is seen as the epitome of American civilization because of his dynamic and extroverted nature. From the fall of 1906 to the summer of 1907 Jensen was preoccupied with polemical exchanges with various Danish writers and critics. For this reason the collection of prose Myter og Jagter (Myths and Hunts), from 1907, includes primarily texts that had previously been published. Singapore Noveller (Singapore Stories), also from 1907, was likewise based on older material. Together with the texts in Lille Ahasvérus (1909, Little Ahasuerus) and Olivia Marianne (1915), the stories in Singapore Noveller were collected in Eksotiske Noveller (1916, Exotic Stories). They form an exotic counterpart to the stories about Himmerland. The United States provides the setting for some of the stories, but most of them are set in Java and China, inspired by Jensen’s Far East trip in the summer and fall of 1902, and owe their quality mainly to the exquisitely drawn scenery and linguistic virtuosity. In their somewhat simplistic view of the life of Europeans among the natives, the stories are an example of the strong influence Kipling had on Jensen’s early writings.
In the summer of 1907 Jensen, tired of literary disputes–although he had provoked them himself– made a trip to Norway, and in the following winter he went to Sweden. These visits resulted in several realistic hunting descriptions for Politiken, later included in the volume Nye Myter (1908, New Myths), which also includes one of Jensen’s best prose texts, “Darwin og Fuglen” (Darwin and the Bird). It opens with a magnificent spring scene, then is extended into a portrait of Darwin, the man whose evolutionary theories influenced Jensen’s writing for the remaining forty-three years of his life. With this volume and Myter og Jagter, Jensen had begun to create a series of brief, somewhat uneven prose texts, so-called myter or myths, which usually were first printed in newspapers; from 1910 to 1944 nine additional volumes were published. In essays and sketches based on reading and traveling, Jensen incorporates
Øjebliksbilleder fra Gaden, indre dæmrende Erindringer omspændende alle fem Verdensdele, Historien, Urtiden og en fjern Barndom... mellem hinanden, men sandt til Hobe forsaa vidt som det hænger sammen og har Tone og Farve.
(snapshots from the street, inner dawning memories encompassing all five continents, history, the earliest times and a distant childhood... intermingled, but every bit of it true in so far as it has coherence and resonance and color.)
Jensen presents his basic ideas in a symbolic, concentrated form: a full acceptance of present reality as the source and final goal of all longing as in “Fusijama” (1907; translated as “Fujiyama,” 1958) and a belief in eternity as it is found in the cyclic reappearance of the seasons as in “Nordisk Foraar” (1912, Nordic Spring). Jensen’s myths are based on his belief in the necessity of placing oneself in a meaningful context with nature as in “Haren” (1908, The Hare) and creating links to the most distant memories from history and prehistory as in “Dansk Natur” (1910, Danish Nature). This myth gives a superb description of Denmark seen in the light of the country’s past, incarnated in ancient monuments and in visions of the life of Stone Age people. It is actually one of several myths exemplifying the impossibility of seizing and preserving the present moment in isolation and thus–characteristic of the split in Jensen between materialism and spirituality–partially contradicting the “Fusijama” myth. Directly dealing with man’s quest for the indefinite and eternal are the myths “Moderen og Barnet” (1917, Mother and Child) and “Den store Kristoffer” (1917, The Great Christopher). “Moderen og Barnet,” which in his Æstetik og Udvikling: Efterskrift til Den lange Rejse (1923, Aesthetics and Evolution: Postscript to The Long Journey) Jensen called “nok det gyldigste jeg har gjort” (probably the most valid thing I have ever written), is based on the Roman Catholic concept of the Madonna and child, “Livets skønneste Symbol, Slægten i et Afbillede, som var Slægten selv, det højeste Under, og samtidigt den højeste Moral” (life’s most beautiful symbol, the family in a single image that is the family itself, the highest miracle and at the same time the highest ethics). In his retelling of the legend of St. Christopher, Jensen makes a Northerner of him.
His staff, which changed into a palm, is interpreted mythically as the Northerner’s longing for the South, which is finally satisfied when the South comes to him in the person of the infant Jesus, whom Christopher carries across the river and into the North, an achievement that is duplicated when his namesake, Columbus, brings Christianity to the New World.
These two myths, as well as several others, were reworked and incorporated into the six books that became Den lange Rejse (1938; translated as The Long Journey, 3 volumes, 1922-1924). This multivolume novel comprises an evolutionary history or rather a fantasy of mankind. It was intended as a scientific counterpart to the biblical legends but turned out to be a collection of legends itself. Jensen sees the challenge of nature as the driving force of progress that brings about evolution. Bræen: Myter om Istiden og det første Menneske (1908, The Glacier: Myths of the Ice Age and of the First Man; translated in Fire and Ice, 1922) is the first written in the series and also its most popular. Here the Glacial Age has forced the humans to migrate southward; only Dreng (Boy) turns in defiance to the north and founds, together with the woman Moa (Mother), a large family. In a memorable scene Dreng rediscovers fire by striking sparks from flint, while Moa collects seeds and begins to till the soil.
Det tabte Land: Mennesket før Istiden (1919, The Lost Land: Man Before the Ice Age; translated in Fire and Ice) is a Darwinian myth of creation about the transition from animal to Homo sapiens in the preglacial rain forests of Jutland. The major character of the volume is Fyr (Fire), who climbs a volcano and, like Prometheus, steals the fire. Up on the mountain he sees the ocean in the distance, and the feeling of longing is stirred in him for the first time.
In Norne-Gæst (1919; translated in The Cimbrians, 1923), Jensen follows the lives of the Northerners from the Glacial Age to the Great Migration. The opening lines describe the newborn title character’s first glimpse of the blue sky between the leaves. This vision is to become crucial for his insatiable longing to travel that will drive Norne-Gæst around the world, encountering, for instance, Greek and Roman civilization, and up through the Bronze and Iron Ages. Cimbrernes Tog (1922, The Raids of the Cimbri; translated in The Cimbrians), the last volume of the series to be published, is also mainly set abroad. It opens with Norne-Gæst wandering up through Jutland in order to attend the spring festivals in Himmerland, home of the Cimbrians. Suddenly climatic deterioration sets in with floods and famine, and the Cimbrians set off under the command of Bojerik, a name modeled after a Boiorix mentioned in one of Jensen’s sources, Plutarch’s Parallel Lives. The narrative then follows the everyday life of the Cimbrians as they raid down through Europe, with intervening mythic scenes as observed and commented upon by the omnipresent Norne-Gæst. Other tribes join the Cimbrians, and together they win their first battles against the Romans. Here the Northerner and the Southerner, separated in Bræen, meet again. Eventually, the Cimbrians, having acquired the decadence of the foreign lands, are defeated in a bloody battle; yet, the volume concludes on a conciliatory note in which the spirits of antiquity and of the North merge harmoniously.
In Skibet (1912, The Ship) the Nordic longing to travel is embodied in the Vikings and their raids to the Mediterranean. In the North, Christianity is introduced through a monk, Brother Parvus. Jensen’s description of Parvus’s works of charity belongs among his best passages, forming an essential correlation to his frequently stressed anti-Christian attitude. The first church is erected by turning the Viking ship upside down. In the deepest sense the forest itself becomes a cathedral, while the longing to go abroad takes on a religious dimension.
Longing as the basic trait of the Nordic people becomes personified by the title character of the 1921 novel Christofer Columbus–whom Jensen had previously described as a reddish-blond Northerner in Hjulet– and his voyages of discovery. Jensen’s view of the defiant and struggling individual as the creator of culture, fundamental to Den lange Rejse, is paralleled by his concept of the basic trait of the Nordic people–“the Gothic race”–as being the dream about warmth and sun. This dream, which is Jensen’s explanation of the religious sentiment, is expressed through a longing for distant places, in the final account a longing for paradise that becomes embodied in the structures of the ship and the upward-reaching Gothic cathedral. The Viking migrations, the “raids of the Cimbri,” were a result of this longing, as was the voyage of the Goth, Columbus. His attempt at finding legendary lands resulted, however, in the discovery of America, of reality; and yet, at the conclusion of his life, Columbus realizes that his new discovery has brought him nothing but homelessness and loneliness. Now the initial optimism turns into tragic resignation, as Columbus subsequently chooses not a metaphysical solution but a return to his memories of the past. He does not realize that he must let his journey continue toward the eternal as it is depicted in the myth, “Ave Stella,” that concludes the volume.
Den lange Rejse is not a novel with a traditional plot centered around the adventures of a hero, nor should the work be read as a scientifically correct depiction of various cultural stages. The fact that so many of its theories are contrary to modern history, anthropology, and archeology is irrelevant and cannot weaken the work as literature. Rather, Den lange Rejse is a vision written by an artist with a formidable ability to identify with other periods and conditions. The outcome proves the impossibility of creating a meaningful coherence based on evolution alone, and Jensen’s project defies any organizing structure; however, the six volumes form a grandiose and gripping artistic work that is outstanding as a depiction of the ages of history and of mankind.
Jensen gradually turned away from the writing of fiction in favor of a growing involvement in current cultural and scientific issues; this reorientation was accompanied by a focusing on the feature article and the essay with the purpose of popularizing the theories of evolution. In Jensen’s collection of essays Nordisk Aand: Kroniker og Karakteristiker (1911, Nordic Spirit: Chronicles and Characteristics), the American society that he had glorified in Den ny Verden was analyzed further and seen as an implementation of the program in Den gotiske Renaissance, indeed as the true expression of the Nordic character; the Scandinavian prototype of this character is Bjørnson, who is portrayed with several other Danish and Norwegian writers. The volume concludes with a fierce attack on contemporary Danish literary critics for not appreciating Jensen’s work.
In April 1911 Jensen traveled with his wife to Paris and London via Berlin and Cologne. His fascination with the cathedral in Cologne found powerful expression in his next essay collection; and in a 10 May 1911 travel letter to Politiken from Normandy, where he looked for traces of the ancient Nordic population, can be seen the first impulses for the novel Skibet, which is set during the Viking Age. In 1912 Jensen also published a monograph, Rudyard Kipling, written in connection with several translations of various Kipling stories that he and Aslaug Mikkelsen had begun in 1911. Although Jensen was strongly influenced by the English writer during the composition of Singapore Noveller and the two additional volumes of short stories from 1909 and 1915, he is strongly critical of Kipling’s imperialistic attitude and deprecation of women.
In the fall of 1912 Jensen began his second great journey to Asia, from which he returned the following year. His reflections en route were later included in the philosophical travelogue Introduktion til vor Tidsalder (1915, Introduction to Our Epoch). “I det indiske Hav” (In the Indian Ocean) displays pure poetry in some passages, where Jensen describes the voyage until the vessel sails into the Ceylonese port of Colombo in December. Singapore is revisited, and the homesickness that Jensen had experienced a decade earlier overtook him again. In China he found the theme for a lyrical short story, “Darduses Myndlinge” (Darduse’s Wards), which was later adapted for the stage as the comedy Darduse, Bryllupet i Peking: Eventyrkomedie i fire akter (1937,
Darduse, Wedding in Peking: Fairy Tale Comedy in Four Acts). From Manchuria, Jensen continued on his trip, describing in the chapter “fra Østen til Evropa” (from the East toward Europe) his experiences traveling on the Trans-Siberian Railroad, seeing again with joy the European peasant culture, and describing jubilantly his return to Scandinavia. In its analysis of the Northerner’s longing, the book forms a striking link between Den gotiske Renaissance and Christofer Columbus. At the same time, Jensen’s attempt to give religious feelings a purely physiological explanation is one more bit of evidence that the question of immortality kept troubling him.
Jensen planned to make still another journey to Asia, by way of the United States, the following summer, but did not complete his trip. In March 1914 he embarked for New York City, but the joy of rediscovery was moderate. Even though Jensen still admired the American press and the hectic and progressive atmosphere in the country, he sharply attacked what he saw as the childishness and bigotry of the Americans. In mid April, Jensen decided to return to Copenhagen, and a few months later World War I broke out. Since Denmark remained neutral, Jensen could leave for Berlin in August in order to negotiate with his German publisher, Samuel Fischer. During his visit he became strongly critical of the bellicosity that was shared by all of the German political parties, and he bitterly regretted the clash between Germany and Britain, since both nations had the same Gothic origin.
During the war Jensen was mainly occupied with writing the last volumes of Den lange Rejse. He also prepared a greatly enlarged edition of his collected poems, Digte: Tredie Udgave (1921, Poems: Third Edition). It includes–in addition to several memorial poems and other portraits–nine poems from Den lange Rejse. Of these, the alliterative “Drengs Gravsang” (Dreng’s Elegy) from Bræen binds up the experience of love and boundless longing in a mythic vision. “Vor Frue” (Our Lady), structured on the prosody of the medieval hymn “Dies irae,” and the ballad-like “Den sørgeligste Vise” (The Saddest Song) from Christofer Columbus treat other recurrent themes in Jensen’s writing: the first glorifies woman as a saving force and culminates in an apotheosis of the Madonna, while the second is one of the most overwhelming interpretations in Danish poetry of the futility of life. Disillusion and resignation also characterize the majority of remaining poems in the volume, many of which are in alliterative form.
Negative reviews of Cimbrernes Tog and the Den lange Rejse project as a whole by a critic for Politiken led to another of Jensen’s many literary feuds and culminated in a break with the newspaper that lasted until 1926. Shortly after the break, in December 1922, Jensen began the publication of his own periodical, Forum: Tidsskrift for Litteratur, Biologi og Samfundsspørgsmål (Forum: Periodical of Literature, Biology, and Social Issues), which survived for only one and a half years. In the first issue of Forum he published his most succinct and best-written settlement with the misuse of Darwinism, “Den daarlige Darwinisme” (The Bad Darwinism). In addition, Forum included a few insignificant poems about birds that, together with other animal poems illustrated by Jensen’s close friend the painter Johannes Larsen, were collected in the volume Aarstiderne (1923, The Seasons). Jensen’s interest in this painter and in the visual arts in general–he was himself a painter and was also an accomplished sculptor–resulted in several art books, two of which deal with the renowned Danish sculptor Bertel Thorvaldsen. Related to these works is the collection of newspaper articles about art, Form og Sjæl: Portræter af Personligheder (1931, Form and Soul: Portraits of Personalities), one more expression of Jensen’s love for classicism and of sculpture.
Essential for an understanding of Jensen’s fiction is his 1923 work Æstetik og Udvikling. The book is a sort of afterword to Den lange Rejse, in which Jensen also launches his “gradus” theory, a conception designating the gliding steps of evolution that he wanted to stress rather than the static species–further developed in later volumes, such as Aandens Stadier (1928, Stages of the Mind). In Æstetik og Udvikling, Jensen writes: “Den lange Rejse handler ikke eksklusivt om en Race, den handler om Udviklingstrin. Den ene Race er Udviklingstrinet af den anden” (Den lange Rejse does not deal exclusively with a race, it deals with stages of evolution. One race is the evolutionary stage of another). At the same time Jensen sharply attacks novels that focus exclusively on individuals as being pre-Darwinian and thus hero-worshiping and outdated. The concept of “bad Darwinism” is discussed again in the obscure and insignificant collection of previously published articles about Darwinism, Evolution og Moral (1925, Evolution and Ethics).
At the end of 1925, Jensen once again set out on a long journey; this time the goal was the Egyptian health resort of Helwan. From Egypt and later from Palestine, he sent several travel letters home to the newspaper Social-Demokraten, revised versions of which were included in Aandens Stadier.
Jensen continued to write memorial poems, and his perfection of this genre can be seen in the undervalued collection Verdens Lys (1926, The Light of the World). Verdens Lys includes six alliterative poems from Cimbrernes Tog that–in a mythic perspective–juxtapose and celebrate the union between North and South. Distinct among Jensen’s works in its focus on artistic expression is “Graven i Sne” (The Snow-Covered Grave), a memorial poem to the Danish Romantic poet Adam Oehlenschläger. It is both an idealized portrait of Oehlenschläger, Jensen’s beloved model as both a harmonious artist and human being–“To Gange gav han Livet Form, / i Livet selv, i ædel Norm” (Twice he gave to life a form, / in life itself, in noble norm)–and a poem about the eternal value of art in spite of the inevitability of death. It becomes a glorification of Jensen’s own poetic art, as it is expressed in a structure that combines observation, vision, and reflection into a perfect artistic entity that may turn out to be the only way to overcome the absurdity of life. The poems in memory of Jensen’s father offer an affectionate portrait of a man who lived in close intimacy with the miracles of life in nature, a closeness to nature inherited by the poet although not without discord. A much more somber tone is heard in the obituary for Jensen’s mother, the concluding poem of the volume, “Ved min Moders Død” (At My Mother’s Death), as well as in the hymn-like “Kirken i Hardanger” (The Church in Hardanger), in which the reader can perceive, behind the stoic resignation that results from placing oneself in the hands of the cycle of nature, the author’s fear of annihilation, which is so powerful that it threatens to break up the poetic form.
In 1927 Dyrenes Forvandling: Til Udviklingens Plastik (The Transformation of Animals: A Contribution to the Plasticity of Evolution) was published; it was another presentation of evolutionary theories but without the fierce attacks on Christianity that characterize some of his earlier writings. Jensen attempts–and the task appears scientifically absurd–to describe the animal soul and ethics as they change through all stages of evolution. Trips to Madeira and Rome followed in 1928, and impressions from both trips were likewise included in the anthropological study of human origin and development, Aandens Stadier.
In May 1929 Jensen received an honorary doctorate from the University of Lund. At the same time Jensen began to focus again on Nordic issues and started his translation of Egil’s Saga for a planned edition of De islandske Sagaer (The Icelandic Sagas), which was published in three volumes from 1930 to 1932. At the same time he returned to his preoccupation with Nordic archaeology in a series of articles subsequently published in the volume Paa danske Veje (1931, On Danish Roads) with his own photos and drawings by Larsen. With this work Jensen got involved in a fierce public debate about the preservation of the ancient burial mounds and stirred up so much political attention that stringent conservation laws were put into effect in 1937. His introduction to De islandske Sagaer, in which he discusses the sagas as products of a genuine Nordic mentality untouched by the traditions of antiquity and Christianity, points ahead to his book about women in the Viking Age, Kvinden i Sagatiden (1942, The Woman in the Saga Period), which also includes retellings of selected saga texts. For a new edition of Snorre Sturlason’s Heimskringla, published in 1948, Jensen took upon himself the difficult task of translating all 539 stanzas.
The transcendental aspect in his earlier poetry is not found in Den jydske Blæst: Digte 1926–1930 (1931, The Jutland Wind: Poems 1926-1930), Jensen’s last important poetry collection. Whereas death in, for instance, the memorial poem “Knut Hamsun” is still accepted as a pantheistic amalgamation into nature, most of the texts are structured on the tragic contrast between active life and the corruption of death, as in the memorial poem “Otto Benzon,” and now nature brings no consolation: “I Kammerdøren peb Vindens Røst, / en ensom Jammer, ingen Trøst” (The voice of the wind whistled in the chamber door / a lonesome lamentation, no comfort). When the memorial poem over others becomes a poem about Jensen himself, as in the concluding title poem, one finds the same death motif, the portrayal of death as man’s tragic but only certain verity. Jensen’s writing has this motif in common with Baroque literature, albeit without Heaven as the final destination.
On his sixtieth birthday, 20 January 1933, Jensen had reached such an esteemed position that a torchlight procession was held in his honor in Copenhagen and a festschrift published, Unge Digteres Hyldest til Johannes V. Jensen (Young Writers’ Homage to Johannes V. Jensen). In response to virulent public criticism of his lack of political commitment, Jensen, in an article published in Politiken on his birthday, emphasized–as he had done throughout his career–his independence as a freelance writer outside the political parties. Nevertheless, in a review of Hartvig Frisch’s Pest over Europa (1933, Plague over Europe) in the same newspaper on 3 December, he once again attacked Nietzsche’s philosophy and its consequences in the Germany of the early 1930s. Here Jensen strongly dissociated himself from contemporary political developments in Europe toward dictatorship, and, in the 1938 article “Hagekorset” (The Swastika), he publicly expressed his disgust with anti-Semitism.
In 1930 Jensen published the collection of essays and articles Retninger i Tiden: Artikler 1925-30 (Trends of the Times: Articles 1925-30); however, during the first part of the 1930s Jensen turned away from the essay form, with the exception of his short history of ideas, Det Blivende: Tankens Revolutionering i det 19de Aarhundrede og Tilbagefaldet i det 20de (1934, The Permanent: The Revolution of Thought in the 19th Century and the Backslide in the 20th), written in a more concise and concentrated style than his other philosophical works. With the underrated novel Dr. Renaults Fristelser (1935, The Temptations of Dr. Renault) he again took up fiction, reworking the Faust motif into a plot that, in contrast to the version presented by Goethe, lets the title character win over Mephistopheles because he is ready to fully accept the present. As in Hjulet, aestheticism is regarded as a barrier between man and reality, and in a significant scene Dr. Renault throws a valuable statue of Aphrodite into the sea, so that nothing will stand between him and life.
In October 1936 Jensen went on a short trip to the United States in order to collect material for a sequel to Dr. Renaults Fristelser, in spite of the negative reception that the book had received. The continuation never materialized. Instead, Jensen published the novel Gudrun (1936), a realization of a much older project: a contemporary novel of the Copenhagen woman, and thus also a novel about the city of Copenhagen, but completely different from Jensen’s first two novels. The city is no longer seen through the eyes of a student from the provinces. Now a citizen of Copenhagen for many years–and a matured artist–Jensen delivers a deeply intimate tribute to this city as a swarming, animated organism. Most of Jensen’s poems written in the 1930s were collected in 1937 as Paaskebadet: Digte 1931-1937 (The Easter Bath: Poems 1931-1937).
That same year, Jensen’s play adaptation of Darduse, Bryllupet i Peking, a “Fairy Tale Comedy in Four Acts,” had its premiere on 22 January at the Royal Theater in Copenhagen and was performed sixteen times. It was a weak play, carried by excellent acting and Knudåge Riisager’s rousing music. Jensen’s relationship with the theater was, on the whole, marked by a lack of success. During his stay in Chicago in the winter of 1902–1903 he had his own dramatization of some of his Himmerland stories performed by Danish-American amateurs with the title Trods (2002, Defiance). The one and only performance was a resounding fiasco. A dramatization of Madame D’Ora, titled Sangerinden (The Singer) and published as Sangerinden (Madame d’Ora): Drama i fem Akter (1921, The Singer [Madame d’Ora]: Drama in Five Acts), was performed unsuccessfully in 1923, premiering at the Odense Teater on 16 November; and when, on 24 April 1937, Jensen’s translation of William Shakespeare’s Hamlet was performed at the Royal Theater, it provoked such fierce criticism that the production had to be canceled.
Shortly before the outbreak of World War II, Jensen managed to visit Norway, described in the travelogue Nordvejen: Indtryk af norsk Natur (1939, The Way North: Impressions of Norwegian Nature), and then the United States; in March 1939 he left Denmark for his sixth journey to the New World. His travel letters were first printed in Politiken and subsequently as a book, titled Fra Fristaterne: Rejsebreve, med et Tilbageblik (1939, From the Free States: Travel Letters with a Retrospect). This time Jensen’s encounter with the United States was marked by ambivalence. In the chapter “Fra Stillehavet til Atlanten” (From the Pacific Ocean to the Atlantic), which is among the best sections of the book, the impressions from the trip are summed up. Although his overall impressions were still positive, Jensen was disappointed by what he saw as the increasing vulgarity and materialism of American society, where technology had become an end in itself, not an expression of man’s inventiveness and ingenuity. After having arrived on the East Coast, Jensen became ill and had to return home earlier than planned. Back in Copenhagen in May 1939 he added a concluding chapter on Thomas Jefferson, meant as a counterweight to the current antidemocratic trends in Europe.
Jensen had planned a tour to France for September 1939 in order to visit the regions where traces of prehistoric man had been discovered, but this plan was thwarted by the outbreak of World War II and had to be postponed until spring 1948. When the Germans occupied Denmark in April 1940, Jensen, apparently fearing arrest, took the precaution of burning his diaries from the previous thirty years, together with all personal letters written to him. In this way much data related to the writing of his books and–most important–notes from his many travels were lost for posterity. Nevertheless, he continued to write throughout the occupation from 1940 to 1945. In 1941 came Mindets Tavle: Portrætter og Personligheder (Plaque of Commemoration: Portraits and Personalities) with portraits of Nordic and British explorers, scientists, writers–and Darwin once again–and in 1943 the collection of some unimportant ethnographic articles, Folkeslagene i Østen (The People of the Orient). Of greater quality is the study Vor Oprindelse (1941, Our Origin), describing humanity’s gradual acquisition of civilization. Again Jensen’s myth-creating fantasy bloomed in a visionary description of cultural progress. At the same time he offers a subtle analysis of his authorship, clearly drawing up a balance sheet and expressing his feeling that he was at the end of the road, a feeling that is also expressed in the essay “Tak til Sproget” (Thanks to Language), Jensen’s farewell to literature; this essay was included, together with some linguistic studies, in a small book, Om Sproget og Undervisningen (1942, About Language and Teaching).
After having been nominated several times for the Nobel Prize in Literature, Jensen finally received it on 9 November 1944. The Nobel Committee announced that the prize was awarded in recognition of “the remarkable force and richness of his poetic imagination, combined with a wide-ranging intellectualism and bold, innovative sense of style,” and Den lange Rejse appeared to be a decisive factor for the committee. In his speech of thanks at the City Hall of Stockholm on 10 December 1945, the writer once again paid tribute to Darwin and identified himself with the scientist: “To him [Darwin], evolution was not only the subject of a life’s study but the very essence of life, proof of the inexhaustible richness and wonder of nature, revealed each day and taken to heart.”
Since 1925 a large number of Danish, Norwegian, and Swedish intellectuals, literary scholars, and scientists had been proposing Jensen for the prize. Initially, the proponents pointed to his earlier works, in particular his myths, and his distinctive handling of style and language as well as his depictions of the Danish mind and Danish nature as qualifying him for the prize. Judged negatively, however, were Jensen’s anthropological essays for their lack of speculative content as well as their polemical tone.
The publication in 1938 of the two-volume edition of Den lange Rejse once again drew attention to Jensen’s major series of novels, and the positive attitude was strengthened further in 1943 with the publication of his poetry collection Digte: 1901-1941. Also, the German occupation of Denmark during World War II probably had an impact on the decision to award Jensen the prize in 1944.
In Denmark the Nobel Prize was welcomed by the entire press, which published large articles on Jensen on their front pages and had his friends and colleagues comment on the significance of the award as well as on Jensen’s authorship in general. The underground press also noticed the prize, even though Jensen was not an active member of the resistance. Because of the war, however, the event did not attract much attention abroad.
Jensen definitely appreciated having been awarded the Nobel Prize and expressed a certain amount of pride and self-esteem in his banquet speech in Stockholm; but he was not dependent on the money, and the award did not have any visible impact on his personal life or his writing. Neither did it contribute to increase his reputation abroad, which had reached its zenith in the interwar years, particularly in Germany and the Scandinavian countries; he never became a popular writer in the Anglo-American countries.
During the last years of Jensen’s life his productivity decreased significantly. After an operation in September 1948 he managed to finish a book about the great explorers, Afrika: Opdagelsesrejserne (1949, Africa: Journeys of Discovery), demonstrating that one of the favorite topics of his childhood reading was still inspiring him. Otherwise, Jensen was primarily occupied with the preparation of a combined, revised edition of the three books Dyrenes Forvandling, Aandens Stadier, and Vor Oprindelse, but he managed to complete only the first volume, published posthumously in 1951 as Tilblivelsen (Genesis). Troubled by an ear disease, he also suffered from shingles during the summer of 1949 and was hospitalized. He still managed to write a few more articles; the most valuable of these, “Adam Oehlenschläger 1779-1850,” published in the book Swift og Oehlenschläger (1950, Swift and Oehlenschläger), is a finely drawn portrait of his artistic model as a man of simple nature, without stiltedness, who calmly accepted death as Jensen had described in the poem “Graven i Sne.” On 31 January 1950 Jensen commented on his second great model, Darwin, in a short article, “Træk fra vor Oprindelse” (Traits from Our Origin). These two personalities, the poet Oehlenschläger and the scientist Darwin, perfectly symbolize the two facets of Jensen’s authorship. In his later years natural science came to dominate, but the first element stands as the most valuable, the one that will survive.
Johannes V. Jensen died on 25 November 1950. Brandes once claimed that as a thinker and preacher Jensen could not be taken seriously. Brandes had a point: the contents of Jensen’s many collections of essays and articles dealing with natural science, archaeology, and anthropology are often based on dubious scientific theories and deductions. In addition, in these volumes Jensen linguistically turns from lyrical expressiveness to a terse, matter-of-fact diction. But one must not fail to notice that in spite of the scientific topic, his stylistic mastery often breaks forth in evocative passages that can be read as sublime prose poetry. Jensen’s critics have also frequently overlooked that he was brilliant as a journalistic writer. He was unusually well-informed about current trends, and his knowledge of American society and literature was unique for a Dane of his time. Neither should his contributions as a translator be overlooked. Besides his accomplished translations from Old Norse, his outstanding rendering of Whitman’s poetry in Danish must be acknowledged.
Jensen reached perfection as a lyrical poet and a creator of myth, both in his collections of poetry and myths and when he succeeded in combining the two genres, as in Kongens Fald, Skovene, Den lange Rejse, and several of his stories of Himmerland. With his debut collection of poems, Jensen introduced modernism in Danish poetry, and he became the writer who, arguably, has had the strongest impact on twentieth-century Danish literature.
Bibliographies
Frits Johansen and Aage Marcus, Johannes V. Jensen: En Bibliografi, 2 volumes (Copenhagen: Gyldendal, 1933-1951);
Aage Jørgensen, Litteratur om Johannes V. Jensen En bibliografi (Odense: Odense University Press, 1998).
Biographies
K. K. Nicolaisen, Johannes V. Jensen: Bidrag til hans Biografi og Karakteristik (Aalborg: Viggo Madsens Boghandel, 1914);
Oluf Friis, Den unge Johannes V. Jensen 1873-1902, 2 volumes (Copenhagen: Gad, 1974);
Villum Jensen, Min fars hus: Erindringer om Johannes V. Jensen og hans miljø (Copenhagen: Gyldendal, 1976);
Leif Nedergaard, Johannes V. Jensen, third edition (Copenhagen: C. A. Reitzel, 1993);
Lars Handesten, Johannes V. Jensen: Liv og Værk (Copenhagen: Gyldendal, 2000).
References
Harry Andersen, Afhandlinger om Johannes V. Jensen (Rødovre: Rolv, 1982);
Andersen, Studier i Johannes V. Jensens Lyrik: Verseteknik, Sprog og Stil (Copenhagen: Levin & Munksgaard, 1936);
Jørgen Elbek, Johannes V. Jensen (Copenhagen: Gyldendal, 1966);
Otto Gelsted, Johannes V. Jensen: Kurven i hans Udvikling (Copenhagen: Arthur Jensen, 1938);
Alf Henriques, Johannes V. Jensen (Copenhagen: H. Hirschsprung, 1938);
Iben Holk, Jartegn: Et essay om Johannes V. Jensens myter (Copenhagen: C. A. Reitzel, 2000);
Poul Houe, Johannes V. Jensens lange rejse: En postmoderne myte (Copenhagen: Museum Tusculanum, 1996);
Niels Ingwersen, “America as Setting and Symbol in Johannes V. Jensen’s Early Works,” Americana Norvegica, 3 (1971): 272-293;
Bent Haugaard Jeppesen, Johannes V. Jensen og den hvide mands byrde: Eksotisme og imperialisme (Copenhagen: Rhodos, 1984);
Aage Jørgensen, “Johannes V. Jensen (Literature): ‘. . . a good enough poet and, nowadays, a good enough human being. . . .’” in Neighbouring Nobel: The History of Thirteen Danish Nobel Prizes, edited by Henry Nielsen and Keld Nielsen (Aarhus: Aarhus University Press, 2001), pp. 207-243;
Jørgensen and Helene Kragh-Jacobsen, eds., Columbus fra Himmerland (Farsø: Farsø Bibliotek, 1994);
Jørgensen and Anders Thyrring Andersen, eds., Et spring ind i et billede: Johannes V. Jensens mytedigtning (Odense: Odense Universitetsforlag, 2000);
Aage Marcus, “Johannes V. Jensen,” American-Scandinavian Review, 20 (1932): 339-347;
Felix Nørgaard and Aage Marcus, eds., Johannes V. Jensen: 1873 – 20. Januar – 1943 (Copenhagen: Gyldendal, 1943);
Sven Hakon Rossel, “Andersen og Jensen–Eventyret og myten,” in Hvad Fatter gjør... Boghistoriske, litterære og musikalske essays tilegnet Erik Dal, edited by Henrik Glahn and others (Herning: Poul Kristensen, 1982), pp. 392-402;
Rossel, Johannes V. Jensen (Boston: Twayne, 1984);
Aage Schiøttz-Christensen, Om sammenhængen i Johannes V. Jensens forfatterskab (Copenhagen: Borgen, 1956);
Henrik Wivel, Den titaniske eros: Drifts- og karakterfortolkning i Johannes V. Jensens forfatterskab (Copenhagen: Gyldendal, 1982).
Papers
The major collection of correspondence and manuscripts is in the Johannes V. Jensen Archives at Det kongelige Bibliotek (The Royal Library), Copenhagen. Additional material is located at Statsbiblioteket, Århus, and the Farsø Bibliotek. Forty-three letters from Jensen to various Norwegian writers and friends are located at the University Library, Oslo.
Source: Johannes V. Jensen (author page), LibraryThing, https://www.librarything.com/author/jensenjohannesv
Nobel Prize in Literature: 1901-Present
By Live Science Staff, published 11 October 2019
https://www.livescience.com/16364-nobel-prize-literature-history.html

A history of the Nobel Prize in Literature, including winners Toni Morrison, Gabriel Garcia Marquez and John Steinbeck.
The Nobel Prize in Literature is given to "the person who shall have produced in the field of literature the most outstanding work in an ideal direction," according to Alfred Nobel's will. The 18-member Swedish Academy selects the Nobel Laureates in Literature.
The winners, along with the reasons given by the Swedish Academy for the award, are:
2019: Peter Handke "for an influential work that with linguistic ingenuity has explored the periphery and the specificity of human experience," the Swedish Academy announced. The 76-year-old Austrian author is perhaps best known for his novella on his mother's suicide, "A Sorrow Beyond Dreams." Handke was a controversial choice due to his support for the Serbs during the 1990s Yugoslav war, the BBC News reported.
2018: Olga Tokarczuk "for a narrative imagination that with encyclopedic passion represents the crossing of boundaries as a form of life," the Swedish Academy announced. Her novel "Primeval and Other Times" traces the history of Poland from WWI to the 1980s, the BBC News reported.
2017: British author Kazuo Ishiguro, "who, in novels of great emotional force, has uncovered the abyss beneath our illusory sense of connection with the world," according to the Swedish Academy. His novels include: "The Remains of the Day," "Never Let Me Go," "The Buried Giant," "When We Were Orphans," "An Artist of the Floating World," "A Pale View of Hills," "Nocturnes: Five Stories of Music and Nightfall" and "The Unconsoled."
2016: Bob Dylan, "for having created new poetic expressions within the great American song tradition," the Swedish Academy announced.
2015: Belarusian author Svetlana Alexievich "for her polyphonic writings, a monument to suffering and courage in our time," according to the Swedish Academy. She is known for her works about the women involved in World War II; the consequences of the Chernobyl nuclear disaster in 1986; a portrayal of the Soviet Union's war in Afghanistan; and other books depicting life in the Soviet Union.
2014: Patrick Modiano "for the art of memory with which he has evoked the most ungraspable human destinies and uncovered the life-world of the occupation," according to the Swedish Academy.
2013: Alice Munro, for "her finely tuned storytelling."
2012: Mo Yan, for his "mixture of fantasy and reality, historical and social perspectives."
2011: Tomas Tranströmer, "because, through his condensed, translucent images, he gives us fresh access to reality."
2010: Mario Vargas Llosa, "for his cartography of structures of power and his trenchant images of the individual's resistance, revolt and defeat."
2009: Herta Müller,"who, with the concentration of poetry and the frankness of prose, depicts the landscape of the dispossessed."
2008: Jean-Marie Gustave Le Clézio, "author of new departures, poetic adventure and sensual ecstasy, explorer of a humanity beyond and below the reigning civilization."
2007: Doris Lessing, "that epicist of the female experience, who with skepticism, fire and visionary power has subjected a divided civilization to scrutiny."
2006: Orhan Pamuk, "who in the quest for the melancholic soul of his native city has discovered new symbols for the clash and interlacing of cultures."
2005: Harold Pinter, "who in his plays uncovers the precipice under everyday prattle and forces entry into oppression's closed rooms."
2004: Elfriede Jelinek, "for her musical flow of voices and counter-voices in novels and plays that with extraordinary linguistic zeal reveal the absurdity of society's clichés and their subjugating power."
2003: John M. Coetzee, "who in innumerable guises portrays the surprising involvement of the outsider."
2002: Imre Kertész, "for writing that upholds the fragile experience of the individual against the barbaric arbitrariness of history."
2001: Sir Vidiadhar Surajprasad Naipaul, "for having united perceptive narrative and incorruptible scrutiny in works that compel us to see the presence of suppressed histories."
2000: Gao Xingjian, "for an œuvre of universal validity, bitter insights and linguistic ingenuity, which has opened new paths for the Chinese novel and drama."
1999: Günter Grass, "whose frolicsome black fables portray the forgotten face of history."
1998: José Saramago, "who with parables sustained by imagination, compassion and irony continually enables us once again to apprehend an elusory reality."
1997: Dario Fo, "who emulates the jesters of the Middle Ages in scourging authority and upholding the dignity of the downtrodden."
1996: Wislawa Szymborska, "for poetry that with ironic precision allows the historical and biological context to come to light in fragments of human reality."
1995: Seamus Heaney, "for works of lyrical beauty and ethical depth, which exalt everyday miracles and the living past."
1994: Kenzaburo Oe, "who with poetic force creates an imagined world, where life and myth condense to form a disconcerting picture of the human predicament today."
1993: Toni Morrison, "who in novels characterized by visionary force and poetic import, gives life to an essential aspect of American reality."
1992: Derek Walcott, "for a poetic oeuvre of great luminosity, sustained by a historical vision, the outcome of a multicultural commitment."
1991: Nadine Gordimer, "who through her magnificent epic writing has - in the words of Alfred Nobel - been of very great benefit to humanity."
1990: Octavio Paz, "for impassioned writing with wide horizons, characterized by sensuous intelligence and humanistic integrity."
1989: Camilo José Cela, "for a rich and intensive prose, which with restrained compassion forms a challenging vision of man's vulnerability."
1988: Naguib Mahfouz, "who, through works rich in nuance - now clear-sightedly realistic, now evocatively ambiguous - has formed an Arabian narrative art that applies to all mankind."
1987: Joseph Brodsky, "for an all-embracing authorship, imbued with clarity of thought and poetic intensity."
1986: Wole Soyinka, "who in a wide cultural perspective and with poetic overtones fashions the drama of existence."
1985: Claude Simon, "who in his novel combines the poet's and the painter's creativeness with a deepened awareness of time in the depiction of the human condition."
1984: Jaroslav Seifert, "for his poetry which endowed with freshness, sensuality and rich inventiveness provides a liberating image of the indomitable spirit and versatility of man."
1983: William Golding, "for his novels which, with the perspicuity of realistic narrative art and the diversity and universality of myth, illuminate the human condition in the world of today."
1982: Gabriel García Márquez, "for his novels and short stories, in which the fantastic and the realistic are combined in a richly composed world of imagination, reflecting a continent's life and conflicts."
1981: Elias Canetti, "for writings marked by a broad outlook, a wealth of ideas and artistic power."
1980: Czeslaw Milosz, "who with uncompromising clear-sightedness voices man's exposed condition in a world of severe conflicts."
1979: Odysseus Elytis, "for his poetry, which, against the background of Greek tradition, depicts with sensuous strength and intellectual clear-sightedness modern man's struggle for freedom and creativeness."
1978: Isaac Bashevis Singer, "for his impassioned narrative art which, with roots in a Polish-Jewish cultural tradition, brings universal human conditions to life."
1977: Vicente Aleixandre, "for a creative poetic writing which illuminates man's condition in the cosmos and in present-day society, at the same time representing the great renewal of the traditions of Spanish poetry between the wars."
1976: Saul Bellow, "for the human understanding and subtle analysis of contemporary culture that are combined in his work."
1975: Eugenio Montale, "for his distinctive poetry which, with great artistic sensitivity, has interpreted human values under the sign of an outlook on life with no illusions."
1974: Eyvind Johnson, "for a narrative art, far-seeing in lands and ages, in the service of freedom," and Harry Martinson, "for writings that catch the dewdrop and reflect the cosmos."
1973: Patrick White, "for an epic and psychological narrative art which has introduced a new continent into literature."
1972: Heinrich Böll, "for his writing which through its combination of a broad perspective on his time and a sensitive skill in characterization has contributed to a renewal of German literature."
1971: Pablo Neruda, "for a poetry that with the action of an elemental force brings alive a continent's destiny and dreams."
1970: Aleksandr Isayevich Solzhenitsyn, "for the ethical force with which he has pursued the indispensable traditions of Russian literature."
1969: Samuel Beckett, "for his writing, which - in new forms for the novel and drama - in the destitution of modern man acquires its elevation."
1968: Yasunari Kawabata, "for his narrative mastery, which with great sensibility expresses the essence of the Japanese mind."
1967: Miguel Angel Asturias, "for his vivid literary achievement, deep-rooted in the national traits and traditions of Indian peoples of Latin America."
1966: Shmuel Yosef Agnon, "for his profoundly characteristic narrative art with motifs from the life of the Jewish people," and Nelly Sachs, "for her outstanding lyrical and dramatic writing, which interprets Israel's destiny with touching strength."
1965: Mikhail Aleksandrovich Sholokhov, "for the artistic power and integrity with which, in his epic of the Don, he has given expression to a historic phase in the life of the Russian people."
1964: Jean-Paul Sartre, "for his work which, rich in ideas and filled with the spirit of freedom and the quest for truth, has exerted a far-reaching influence on our age."
1963: Giorgos Seferis, "for his eminent lyrical writing, inspired by a deep feeling for the Hellenic world of culture."
1962: John Steinbeck, "for his realistic and imaginative writings, combining as they do sympathetic humour and keen social perception."
1961: Ivo Andric, "for the epic force with which he has traced themes and depicted human destinies drawn from the history of his country."
1960: Saint-John Perse, "for the soaring flight and the evocative imagery of his poetry which in a visionary fashion reflects the conditions of our time."
1959: Salvatore Quasimodo, "for his lyrical poetry, which with classical fire expresses the tragic experience of life in our own times."
1958: Boris Leonidovich Pasternak, "for his important achievement both in contemporary lyrical poetry and in the field of the great Russian epic tradition."
1957: Albert Camus, "for his important literary production, which with clear-sighted earnestness illuminates the problems of the human conscience in our times."
1956: Juan Ramón Jiménez, "for his lyrical poetry, which in Spanish language constitutes an example of high spirit and artistical purity."
1955: Halldór Kiljan Laxness, "for his vivid epic power which has renewed the great narrative art of Iceland."
1954: Ernest Miller Hemingway, "for his mastery of the art of narrative, most recently demonstrated in 'The Old Man and the Sea,' and for the influence that he has exerted on contemporary style."
1953: Sir Winston Leonard Spencer Churchill, "for his mastery of historical and biographical description as well as for brilliant oratory in defending exalted human values."
1952: François Mauriac, "for the deep spiritual insight and the artistic intensity with which he has in his novels penetrated the drama of human life."
1951: Pär Fabian Lagerkvist, "for the artistic vigor and true independence of mind with which he endeavors in his poetry to find answers to the eternal questions confronting mankind."
1950: Earl (Bertrand Arthur William) Russell, "in recognition of his varied and significant writings in which he champions humanitarian ideals and freedom of thought."
1949: William Faulkner, "for his powerful and artistically unique contribution to the modern American novel."
1948: Thomas Stearns Eliot, "for his outstanding, pioneer contribution to present-day poetry."
1947: André Paul Guillaume Gide, "for his comprehensive and artistically significant writings, in which human problems and conditions have been presented with a fearless love of truth and keen psychological insight."
1946: Hermann Hesse, "for his inspired writings which, while growing in boldness and penetration, exemplify the classical humanitarian ideals and high qualities of style."
1945: Gabriela Mistral, "for her lyric poetry which, inspired by powerful emotions, has made her name a symbol of the idealistic aspirations of the entire Latin American world."
1944: Johannes Vilhelm Jensen, "for the rare strength and fertility of his poetic imagination with which is combined an intellectual curiosity of wide scope and a bold, freshly creative style."
1940-1943: No Nobel Prize awarded
1939: Frans Eemil Sillanpää, "for his deep understanding of his country's peasantry and the exquisite art with which he has portrayed their way of life and their relationship with Nature."
1938: Pearl Buck, "for her rich and truly epic descriptions of peasant life in China and for her biographical masterpieces."
1937: Roger Martin du Gard, "for the artistic power and truth with which he has depicted human conflict as well as some fundamental aspects of contemporary life in his novel-cycle Les Thibault."
1936: Eugene Gladstone O'Neill, "for the power, honesty and deep-felt emotions of his dramatic works, which embody an original concept of tragedy."
1935: No Prize awarded.
1934: Luigi Pirandello, "for his bold and ingenious revival of dramatic and scenic art."
1933: Ivan Alekseyevich Bunin, "for the strict artistry with which he has carried on the classical Russian traditions in prose writing."
1932: John Galsworthy, "for his distinguished art of narration which takes its highest form in The Forsyte Saga."
1931: Erik Axel Karlfeldt, for his poetry.
1930: Sinclair Lewis, "for his vigorous and graphic art of description and his ability to create, with wit and humor, new types of characters."
1929: Thomas Mann, "principally for his great novel, Buddenbrooks, which has won steadily increased recognition as one of the classic works of contemporary literature."
1928: Sigrid Undset, "principally for her powerful descriptions of Northern life during the Middle Ages."
1927: Henri Bergson, "in recognition of his rich and vitalizing ideas and the brilliant skill with which they have been presented."
1926: Grazia Deledda, "for her idealistically inspired writings which with plastic clarity picture the life on her native island and with depth and sympathy deal with human problems in general."
1925: George Bernard Shaw, "for his work which is marked by both idealism and humanity, its stimulating satire often being infused with a singular poetic beauty."
1924: Wladyslaw Stanislaw Reymont, "for his great national epic, The Peasants."
1923: William Butler Yeats, "for his always inspired poetry, which in a highly artistic form gives expression to the spirit of a whole nation."
1922: Jacinto Benavente, "for the happy manner in which he has continued the illustrious traditions of the Spanish drama."
1921: Anatole France, "in recognition of his brilliant literary achievements, characterized as they are by a nobility of style, a profound human sympathy, grace, and a true Gallic temperament."
1920: Knut Pedersen Hamsun, "for his monumental work, Growth of the Soil."
1919: Carl Friedrich Georg Spitteler, "in special appreciation of his epic, Olympian Spring."
1918: No Prize awarded
1917: Karl Adolph Gjellerup, "for his varied and rich poetry, which is inspired by lofty ideals," and Henrik Pontoppidan, "for his authentic descriptions of present-day life in Denmark."
1916: Carl Gustaf Verner von Heidenstam, "in recognition of his significance as the leading representative of a new era in our literature."
1915: Romain Rolland, "as a tribute to the lofty idealism of his literary production and to the sympathy and love of truth with which he has described different types of human beings."
1914: No Prize awarded
1913: Rabindranath Tagore, "because of his profoundly sensitive, fresh and beautiful verse, by which, with consummate skill, he has made his poetic thought, expressed in his own English words, a part of the literature of the West."
1912: Gerhart Johann Robert Hauptmann, "primarily in recognition of his fruitful, varied and outstanding production in the realm of dramatic art."
1911: Count Maurice (Mooris) Polidore Marie Bernhard Maeterlinck, "in appreciation of his many-sided literary activities, and especially of his dramatic works, which are distinguished by a wealth of imagination and by a poetic fancy, which reveals, sometimes in the guise of a fairy tale, a deep inspiration, while in a mysterious way they appeal to the readers' own feelings and stimulate their imaginations."
1910: Paul Johann Ludwig Heyse, "as a tribute to the consummate artistry, permeated with idealism, which he has demonstrated during his long productive career as a lyric poet, dramatist, novelist and writer of world-renowned short stories."
1909: Selma Ottilia Lovisa Lagerlöf, "in appreciation of the lofty idealism, vivid imagination and spiritual perception that characterize her writings."
1908: Rudolf Christoph Eucken, "in recognition of his earnest search for truth, his penetrating power of thought, his wide range of vision, and the warmth and strength in presentation with which in his numerous works he has vindicated and developed an idealistic philosophy of life."
1907: Rudyard Kipling, "in consideration of the power of observation, originality of imagination, virility of ideas and remarkable talent for narration which characterize the creations of this world-famous author."
1906: Giosuè Carducci, "not only in consideration of his deep learning and critical research, but above all as a tribute to the creative energy, freshness of style, and lyrical force which characterize his poetic masterpieces."
1905: Henryk Sienkiewicz, "because of his outstanding merits as an epic writer."
1904: Frédéric Mistral, "in recognition of the fresh originality and true inspiration of his poetic production, which faithfully reflects the natural scenery and native spirit of his people, and, in addition, his significant work as a Provençal philologist," and José Echegaray y Eizaguirre, "in recognition of the numerous and brilliant compositions which, in an individual and original manner, have revived the great traditions of the Spanish drama."
1903: Bjørnstjerne Martinus Bjørnson, "as a tribute to his noble, magnificent and versatile poetry, which has always been distinguished by both the freshness of its inspiration and the rare purity of its spirit."
1902: Christian Matthias Theodor Mommsen, "the greatest living master of the art of historical writing, with special reference to his monumental work, A History of Rome."
Bookish in Copenhagen — BOOKSIEN, 2017-08-14
https://booksien.wordpress.com/2017/08/14/bookish-in-copenhagen/
The city of Hans Christian Andersen honours its literary background with beautiful bookstores. Get to know all the best literary places in Copenhagen!
Politikens Boghal – Rådhuspladsen 37
This bookstore was my favourite store for new books. They have the best collection of translated Danish literature I could find in Copenhagen, and a very good English collection as well. They organise a lot of events, so check their calendar when planning a visit! I also like their collection of literary merchandise: mugs, notebooks, that kind of stuff. It’s a great place to find a gift for a bookish friend!
Secondhand bookstores
Copenhagen has many cool secondhand bookstores, all with cute nooks, corners, dark wood and old books. Most of them have a good collection of English books, so they're worth checking out. These are my favourites:
The Booktrader – Skindergade 23
Paludan Bog & Café – Fiolstræde 10
Books & Curiosities – Læderstræde 9
The Royal Library – Søren Kierkegaards Pl. 1
This building is a beautiful combination of modern and older architecture. Enjoy the view over the water from the staircases in the lobby, or take a peek at the old reading room in the older part of the building. This library is the biggest in Northern Europe and definitely worth the visit.
The Little Mermaid – Langelinie
One of the biggest highlights of Copenhagen is a literary one: the statue of the Little Mermaid from the story by Hans Christian Andersen. Prepare for a lot of tourists, but also for a lovely view over the water.
Bispebjerg Kirkegård – Frederiksborgvej 125
At this beautiful graveyard, you can find the grave of Johannes Vilhelm Jensen, who won the Nobel Prize in Literature in 1944. His book The Fall of the King was voted the best Danish book of the 20th century by the Danish people. You can't find his name on the map the cemetery provides, but if you show them a picture of the grave they'll recognize it and point you in the right direction.
Bakkehus Museum – Rahbeks Alle 23
This house used to belong to Kamma and Knud Lyne Rahbek, who played an important role in the Romantic period in Denmark. Not only did they write themselves, but they helped other great Danish writers financially. They were visited by writers like Hans Christian Andersen and Adam Oehlenschläger, who wrote the Danish national anthem. Rahbek’s old study room has a beautiful collection of old books!
Nobel Prize in Physics: 1901-Present — Live Science Staff, livescience.com, 2022-10-04
https://www.livescience.com/16362-nobel-prize-physics-list.html
The history of the winners of the Nobel Prize in physics, including Steven Chu, Aage Niels Bohr and Enrico Fermi.
According to Alfred Nobel's will, the Nobel Prize in Physics was to go to "the person who shall have made the most important discovery or invention within the field of physics." The prize has been awarded every year except for 1916, 1931, 1934, 1940, 1941 and 1942.
Here is the full list of winners:
2023: Pierre Agostini, Ferenc Krausz, and Anne L’Huillier won the 2023 prize for devising a way to generate pulses of light measured in attoseconds — one quintillionth of a second. An attosecond is to a second what a second is to the age of the universe, a minuscule slice of time so short that it can be used to peer at the movements of electrons and molecules.
2022: American physicist John Clauser, French physicist Alain Aspect and Austrian physicist Anton Zeilinger each shared the 2022 prize "for experiments with entangled photons, establishing the violation of Bell inequalities and pioneering quantum information science,” according to the Nobel Prize organization. Their work demonstrated that what Einstein so famously dubbed "spooky action at a distance" is real and laid the groundwork for early quantum computers.
2021: The 2021 Nobel prize went to three scientists whose work alerted the world to the dangers of climate change. The prize was awarded "for groundbreaking contributions to our understanding of complex physical systems." Syukuro Manabe and Klaus Hasselmann shared one-half of the prize "for the physical modeling of Earth’s climate, quantifying variability and reliably predicting global warming" while Giorgio Parisi won the other half "for the discovery of the interplay of disorder and fluctuations in physical systems from atomic to planetary scales."
2020: The Nobel Prize in Physics 2020 was divided amongst a trio of black hole researchers. One half of the award went to Roger Penrose, "for the discovery that black hole formation is a robust prediction of the general theory of relativity," while Reinhard Genzel and Andrea Ghez jointly shared the other half "for the discovery of a supermassive compact object at the centre of our galaxy."
2019: Canadian-American James Peebles of Princeton University received one-half of the Nobel "for theoretical discoveries in physical cosmology," the Royal Swedish Academy of Sciences said. The other half of the prize was awarded jointly to Michel Mayor and Didier Queloz, "for the discovery of an exoplanet orbiting a solar-type star," the Academy said. Mayor is a professor at the University of Geneva in Switzerland, and Queloz is at both the University of Geneva and the University of Cambridge in the U.K.
Together, the trio won the Nobel "for contributions to our understanding of the evolution of the universe and Earth’s place in the cosmos," the Academy said.
2018: Arthur Ashkin was awarded one half of the prize, and the other half was awarded jointly to Donna Strickland and Gérard Mourou, "for groundbreaking inventions in the field of laser physics." This was the first time in 55 years that a woman was part of the Nobel Prize in physics. [Read more about the 2018 prize and Nobel Laureates]
2017: Half of the 9 million Swedish krona ($1.1 million) award went to Rainer Weiss of MIT. The other half was shared jointly to Barry Barish and Kip Thorne of Caltech. The prize honored the trio's "decisive contributions to the LIGO detector and the observation of gravitational waves," according to Nobelprize.org. The three scientists were integral in the first detection of the ripples in space-time called gravitational waves. The waves in this case came from the collision of two black holes 1.3 billion years ago.
2016: One half was awarded to David J. Thouless, of the University of Washington, Seattle, and the other half to F. Duncan M. Haldane, Princeton University, and J. Michael Kosterlitz, Brown University, Providence. Their theoretical discoveries opened the door to a weird world where matter can take on strange states. According to the Nobel Foundation: "Thanks to their pioneering work, the hunt is now on for new and exotic phases of matter. Many people are hopeful of future applications in both materials science and electronics."
2015: Takaaki Kajita and Arthur B. McDonald for showing the metamorphosis of neutrinos, which revealed that the subatomic particles have mass and opened up a new realm in particle physics.
2014: Isamu Akasaki, Hiroshi Amano and Shuji Nakamura for their invention of an energy-efficient light source: blue light-emitting diodes (LEDs).
2013: Peter Higgs of the United Kingdom and François Englert of Belgium, two of the scientists who predicted the existence of the Higgs boson nearly 50 years ago. [Related: Higgs Boson Physicists Snag Nobel Prize]
2012: French physicist Serge Haroche and American physicist David Wineland, for their pioneering research in quantum optics.
2011: One half awarded to Saul Perlmutter, the other half jointly to Brian P. Schmidt and Adam G. Riess, "for the discovery of the accelerating expansion of the Universe through observations of distant supernovae."
2010: Andre Geim and Konstantin Novoselov, "for groundbreaking experiments regarding the two-dimensional material graphene."
2009: Charles K. Kao, "for groundbreaking achievements concerning the transmission of light in fibers for optical communication," and Willard S. Boyle and George E. Smith, "for the invention of an imaging semiconductor circuit – the CCD sensor."
2008: Yoichiro Nambu, "for the discovery of the mechanism of spontaneous broken symmetry in subatomic physics," and Makoto Kobayashi and Toshihide Maskawa, "for the discovery of the origin of the broken symmetry which predicts the existence of at least three families of quarks in nature."
2007: Albert Fert and Peter Grünberg, "for the discovery of Giant Magnetoresistance"
2006: John C. Mather and George F. Smoot, "for their discovery of the blackbody form and anisotropy of the cosmic microwave background radiation."
2005: Roy J. Glauber, "for his contribution to the quantum theory of optical coherence," and John L. Hall and Theodor W. Hänsch, "for their contributions to the development of laser-based precision spectroscopy, including the optical frequency comb technique."
2004: David J. Gross, H. David Politzer and Frank Wilczek, "for the discovery of asymptotic freedom in the theory of the strong interaction."
2003: Alexei A. Abrikosov, Vitaly L. Ginzburg and Anthony J. Leggett, "for pioneering contributions to the theory of superconductors and superfluids."
2002: Raymond Davis Jr. and Masatoshi Koshiba, "for pioneering contributions to astrophysics, in particular for the detection of cosmic neutrinos," and Riccardo Giacconi, "for pioneering contributions to astrophysics, which have led to the discovery of cosmic X-ray sources."
2001: Eric A. Cornell, Wolfgang Ketterle and Carl E. Wieman, "for the achievement of Bose-Einstein condensation in dilute gases of alkali atoms, and for early fundamental studies of the properties of the condensates."
2000: Zhores I. Alferov and Herbert Kroemer, "for developing semiconductor heterostructures used in high-speed- and opto-electronics," and Jack S. Kilby, "for his part in the invention of the integrated circuit."
1999: Gerardus 't Hooft and Martinus J.G. Veltman, "for elucidating the quantum structure of electroweak interactions in physics."
1998: Robert B. Laughlin, Horst L. Störmer and Daniel C. Tsui, "for their discovery of a new form of quantum fluid with fractionally charged excitations."
1997: Steven Chu, Claude Cohen-Tannoudji and William D. Phillips, "for development of methods to cool and trap atoms with laser light."
1996: David M. Lee, Douglas D. Osheroff and Robert C. Richardson, "for their discovery of superfluidity in helium-3."
1995: Martin L. Perl, "for the discovery of the tau lepton," and Frederick Reines, "for the detection of the neutrino."
1994: Bertram N. Brockhouse, "for the development of neutron spectroscopy," and Clifford G. Shull, "for the development of the neutron diffraction technique."
1993: Russell A. Hulse and Joseph H. Taylor Jr., "for the discovery of a new type of pulsar, a discovery that has opened up new possibilities for the study of gravitation."
1992: Georges Charpak, "for his invention and development of particle detectors, in particular the multiwire proportional chamber."
1991: Pierre-Gilles de Gennes, "for discovering that methods developed for studying order phenomena in simple systems can be generalized to more complex forms of matter, in particular to liquid crystals and polymers."
1990: Jerome I. Friedman, Henry W. Kendall and Richard E. Taylor, "for their pioneering investigations concerning deep inelastic scattering of electrons on protons and bound neutrons, which have been of essential importance for the development of the quark model in particle physics."
1989: Norman F. Ramsey, "for the invention of the separated oscillatory fields method and its use in the hydrogen maser and other atomic clocks," and Hans G. Dehmelt and Wolfgang Paul, "for the development of the ion trap technique."
1988: Leon M. Lederman, Melvin Schwartz and Jack Steinberger, "for the neutrino beam method and the demonstration of the doublet structure of the leptons through the discovery of the muon neutrino."
1987: J. Georg Bednorz and K. Alexander Müller, "for their important break-through in the discovery of superconductivity in ceramic materials."
1986: Ernst Ruska, "for his fundamental work in electron optics, and for the design of the first electron microscope," and Gerd Binnig and Heinrich Rohrer, "for their design of the scanning tunneling microscope."
1985: Klaus von Klitzing, "for the discovery of the quantized Hall effect."
1984: Carlo Rubbia and Simon van der Meer, "for their decisive contributions to the large project, which led to the discovery of the field particles W and Z, communicators of weak interaction."
1983: Subramanyan Chandrasekhar, "for his theoretical studies of the physical processes of importance to the structure and evolution of the stars," and William Alfred Fowler, "for his theoretical and experimental studies of the nuclear reactions of importance in the formation of the chemical elements in the universe."
1982: Kenneth G. Wilson, "for his theory for critical phenomena in connection with phase transitions."
1981: Nicolaas Bloembergen and Arthur Leonard Schawlow, "for their contribution to the development of laser spectroscopy," and Kai M. Siegbahn, "for his contribution to the development of high-resolution electron spectroscopy."
1980: James Watson Cronin and Val Logsdon Fitch, "for the discovery of violations of fundamental symmetry principles in the decay of neutral K-mesons."
1979: Sheldon Lee Glashow, Abdus Salam and Steven Weinberg, "for their contributions to the theory of the unified weak and electromagnetic interaction between elementary particles, including, inter alia, the prediction of the weak neutral current."
1978: Pyotr Leonidovich Kapitsa, "for his basic inventions and discoveries in the area of low-temperature physics," and Arno Allan Penzias and Robert Woodrow Wilson, "for their discovery of cosmic microwave background radiation."
1977: Philip Warren Anderson, Sir Nevill Francis Mott and John Hasbrouck van Vleck, "for their fundamental theoretical investigations of the electronic structure of magnetic and disordered systems."
1976: Burton Richter and Samuel Chao Chung Ting, "for their pioneering work in the discovery of a heavy elementary particle of a new kind."
1975: Aage Niels Bohr, Ben Roy Mottelson and Leo James Rainwater, "for the discovery of the connection between collective motion and particle motion in atomic nuclei and the development of the theory of the structure of the atomic nucleus based on this connection."
1974: Sir Martin Ryle and Antony Hewish, "for their pioneering research in radio astrophysics: Ryle for his observations and inventions, in particular of the aperture synthesis technique, and Hewish for his decisive role in the discovery of pulsars."
1973: Leo Esaki and Ivar Giaever, "for their experimental discoveries regarding tunneling phenomena in semiconductors and superconductors, respectively," and Brian David Josephson, "for his theoretical predictions of the properties of a supercurrent through a tunnel barrier, in particular those phenomena which are generally known as the Josephson effects."
1972: John Bardeen, Leon Neil Cooper, John Robert Schrieffer, "for their jointly developed theory of superconductivity, usually called the BCS-theory."
1971: Dennis Gabor, "for his invention and development of the holographic method."
1970: Hannes Olof Gösta Alfvén, "for fundamental work and discoveries in magnetohydrodynamics with fruitful applications in different parts of plasma physics," and Louis Eugène Félix Néel, "for fundamental work and discoveries concerning antiferromagnetism and ferrimagnetism which have led to important applications in solid state physics."
1969: Murray Gell-Mann, "for his contributions and discoveries concerning the classification of elementary particles and their interactions."
1968: Luis Walter Alvarez, "for his decisive contributions to elementary particle physics, in particular the discovery of a large number of resonance states, made possible through his development of the technique of using hydrogen bubble chamber and data analysis."
1967: Hans Albrecht Bethe, "for his contributions to the theory of nuclear reactions, especially his discoveries concerning the energy production in stars."
1966: Alfred Kastler, "for the discovery and development of optical methods for studying Hertzian resonances in atoms."
1965: Sin-Itiro Tomonaga, Julian Schwinger and Richard P. Feynman, "for their fundamental work in quantum electrodynamics, with deep-ploughing consequences for the physics of elementary particles."
1964: Charles Hard Townes, "for fundamental work in the field of quantum electronics, which has led to the construction of oscillators and amplifiers based on the maser-laser principle," and Nicolay Gennadiyevich Basov and Aleksandr Mikhailovich Prokhorov, "for fundamental work in the field of quantum electronics, which has led to the construction of oscillators and amplifiers based on the maser-laser principle."
1963: Eugene Paul Wigner, "for his contributions to the theory of the atomic nucleus and the elementary particles, particularly through the discovery and application of fundamental symmetry principles," and Maria Goeppert-Mayer and J. Hans D. Jensen, "for their discoveries concerning nuclear shell structure."
1962: Lev Davidovich Landau, "for his pioneering theories for condensed matter, especially liquid helium."
1961: Robert Hofstadter, "for his pioneering studies of electron scattering in atomic nuclei and for his thereby achieved discoveries concerning the structure of the nucleons," and Rudolf Ludwig Mössbauer, "for his researches concerning the resonance absorption of gamma radiation and his discovery in this connection of the effect which bears his name."
1960: Donald Arthur Glaser, "for the invention of the bubble chamber."
1959: Emilio Gino Segrè and Owen Chamberlain, "for their discovery of the antiproton."
1958: Pavel Alekseyevich Cherenkov, Il'ja Mikhailovich Frank and Igor Yevgenyevich Tamm, "for the discovery and the interpretation of the Cherenkov effect."
1957: Chen Ning Yang and Tsung-Dao (T.D.) Lee, "for their penetrating investigation of the so-called parity laws which has led to important discoveries regarding the elementary particles."
1956: William Bradford Shockley, John Bardeen and Walter Houser Brattain, "for their researches on semiconductors and their discovery of the transistor effect."
1955: Willis Eugene Lamb, "for his discoveries concerning the fine structure of the hydrogen spectrum," and Polykarp Kusch, "for his precision determination of the magnetic moment of the electron."
1954: Max Born, "for his fundamental research in quantum mechanics, especially for his statistical interpretation of the wavefunction," and Walther Bothe, "for the coincidence method and his discoveries made therewith."
1953: Frits (Frederik) Zernike, "for his demonstration of the phase contrast method, especially for his invention of the phase contrast microscope."
1952: Felix Bloch and Edward Mills Purcell, "for their development of new methods for nuclear magnetic precision measurements and discoveries in connection therewith."
1951: Sir John Douglas Cockcroft and Ernest Thomas Sinton Walton, "for their pioneer work on the transmutation of atomic nuclei by artificially accelerated atomic particles."
1950: Cecil Frank Powell, "for his development of the photographic method of studying nuclear processes and his discoveries regarding mesons made with this method."
1949: Hideki Yukawa, "for his prediction of the existence of mesons on the basis of theoretical work on nuclear forces."
1948: Patrick Maynard Stuart Blackett, "for his development of the Wilson cloud chamber method, and his discoveries therewith in the fields of nuclear physics and cosmic radiation."
1947: Sir Edward Victor Appleton, "for his investigations of the physics of the upper atmosphere especially for the discovery of the so-called Appleton layer."
1946: Percy Williams Bridgman, "for the invention of an apparatus to produce extremely high pressures, and for the discoveries he made therewith in the field of high pressure physics."
1945: Wolfgang Pauli, "for the discovery of the Exclusion Principle, also called the Pauli Principle."
1944: Isidor Isaac Rabi, "for his resonance method for recording the magnetic properties of atomic nuclei."
1943: Otto Stern, "for his contribution to the development of the molecular ray method and his discovery of the magnetic moment of the proton."
1940-1942: No Prizes awarded.
1939: Ernest Orlando Lawrence, "for the invention and development of the cyclotron and for results obtained with it, especially with regard to artificial radioactive elements."
1938: Enrico Fermi, "for his demonstrations of the existence of new radioactive elements produced by neutron irradiation, and for his related discovery of nuclear reactions brought about by slow neutrons."
1937: Clinton Joseph Davisson and George Paget Thomson, "for their experimental discovery of the diffraction of electrons by crystals."
1936: Victor Franz Hess, "for his discovery of cosmic radiation," and Carl David Anderson, "for his discovery of the positron."
1935: James Chadwick, "for the discovery of the neutron."
1934: No Prize awarded
1933: Erwin Schrödinger and Paul Adrien Maurice Dirac, "for the discovery of new productive forms of atomic theory."
1932: Werner Karl Heisenberg, "for the creation of quantum mechanics, the application of which has, inter alia, led to the discovery of the allotropic forms of hydrogen."
1931: No Prize awarded
1930: Sir Chandrasekhara Venkata Raman, "for his work on the scattering of light and for the discovery of the effect named after him."
1929: Prince Louis-Victor Pierre Raymond de Broglie, "for his discovery of the wave nature of electrons."
1928: Owen Willans Richardson, "for his work on the thermionic phenomenon and especially for the discovery of the law named after him."
1927: Arthur Holly Compton, "for his discovery of the effect named after him," and Charles Thomson Rees Wilson, "for his method of making the paths of electrically charged particles visible by condensation of vapor."
1926: Jean Baptiste Perrin, "for his work on the discontinuous structure of matter, and especially for his discovery of sedimentation equilibrium."
1925: James Franck and Gustav Ludwig Hertz, "for their discovery of the laws governing the impact of an electron upon an atom."
1924: Karl Manne Georg Siegbahn, "for his discoveries and research in the field of X-ray spectroscopy."
1923: Robert Andrews Millikan, "for his work on the elementary charge of electricity and on the photoelectric effect."
1922: Niels Henrik David Bohr, "for his services in the investigation of the structure of atoms and of the radiation emanating from them."
1921: Albert Einstein, "for his services to Theoretical Physics, and especially for his discovery of the law of the photoelectric effect."
1920: Charles Edouard Guillaume, "in recognition of the service he has rendered to precision measurements in Physics by his discovery of anomalies in nickel steel alloys."
1919: Johannes Stark, "for his discovery of the Doppler effect in canal rays and the splitting of spectral lines in electric fields."
1918: Max Karl Ernst Ludwig Planck, "in recognition of the services he rendered to the advancement of Physics by his discovery of energy quanta."
1917: Charles Glover Barkla, "for his discovery of the characteristic Röntgen radiation of the elements."
1916: No Prize awarded.
1915: Sir William Henry Bragg and William Lawrence Bragg, "for their services in the analysis of crystal structure by means of X-rays."
1914: Max von Laue, "for his discovery of the diffraction of X-rays by crystals."
1913: Heike Kamerlingh Onnes, "for his investigations on the properties of matter at low temperatures which led, inter alia, to the production of liquid helium."
1912: Nils Gustaf Dalén, "for his invention of automatic regulators for use in conjunction with gas accumulators for illuminating lighthouses and buoys."
1911: Wilhelm Wien, "for his discoveries regarding the laws governing the radiation of heat."
1910: Johannes Diderik van der Waals, "for his work on the equation of state for gases and liquids."
1909: Guglielmo Marconi and Karl Ferdinand Braun, "in recognition of their contributions to the development of wireless telegraphy."
1908: Gabriel Lippmann, "for his method of reproducing colors photographically based on the phenomenon of interference."
1907: Albert Abraham Michelson, "for his optical precision instruments and the spectroscopic and metrological investigations carried out with their aid."
1906: Joseph John Thomson, "in recognition of the great merits of his theoretical and experimental investigations on the conduction of electricity by gases."
1905: Philipp Eduard Anton von Lenard, "for his work on cathode rays."
1904: Lord Rayleigh (John William Strutt), "for his investigations of the densities of the most important gases and for his discovery of argon in connection with these studies."
1903: Antoine Henri Becquerel, "in recognition of the extraordinary services he has rendered by his discovery of spontaneous radioactivity," and Pierre Curie and Marie Curie, née Sklodowska, "in recognition of the extraordinary services they have rendered by their joint researches on the radiation phenomena discovered by Professor Henri Becquerel."
1902: Hendrik Antoon Lorentz and Pieter Zeeman, "in recognition of the extraordinary service they rendered by their researches into the influence of magnetism upon radiation phenomena."
|
|||||
correct_award_00093
|
FactBench
|
1
| 39
|
https://www.theglobeandmail.com/arts/books-and-media/nobel-prize-misfits-are-a-literary-whos-who/article15986213/
|
en
|
Nobel Prize misfits are a literary who’s who
|
https://www.theglobeandmail.com/resizer/v2/NFCK72ZGUBGDBBNBDPVOVIELAQ?auth=f9228e21fe5cbc3e3f2d0fac0e4953e8414cf749aab624fb91cc6a981831bb98&width=1200&height=785&quality=80
|
https://www.theglobeandmail.com/resizer/v2/NFCK72ZGUBGDBBNBDPVOVIELAQ?auth=f9228e21fe5cbc3e3f2d0fac0e4953e8414cf749aab624fb91cc6a981831bb98&width=1200&height=785&quality=80
|
[
"https://www.theglobeandmail.com/resizer/v2/NFCK72ZGUBGDBBNBDPVOVIELAQ?auth=f9228e21fe5cbc3e3f2d0fac0e4953e8414cf749aab624fb91cc6a981831bb98&width=600&quality=80",
"https://www.theglobeandmail.com/resizer/v2/5FFHF5OJZNDXDE2OXVDA2MFHKA.png?auth=60b191d3022bbca3f16ac852c79f8a272fe58413787ab273d3b2863f0bd4445f&width=300&height=200&quality=80",
"https://www.theglobeandmail.com/files/dev/www/cache-long/images/puzzles/puzz-cryptic-sm.png",
"https://www.theglobeandmail.com/files/dev/www/cache-long/images/puzzles/puzz-sudoku-sm.png",
"https://www.theglobeandmail.com/files/dev/www/cache-long/images/puzzles/puzz-universal-sm.png"
] |
[] |
[] |
[
"August Strindberg",
"Liu Xiaobo",
"Alice Munro",
"Nobel literature committee",
"academy",
"prize",
"literature",
"literary",
"winner",
"nobel",
"ideal direction",
"Swedish literary historian",
"literature committee",
"chemistry laureate",
"committee favourite",
"Academy member",
"city hall",
"Home field advantage",
"Stockholm",
"China",
"Sweden"
] | null |
[
"Robert Everett-Green"
] |
2013-12-16T20:30:34+00:00
|
How to qualify for ‘outstanding work in an ideal direction’ has depended on which age you were in
|
en
|
The Globe and Mail
|
https://www.theglobeandmail.com/arts/books-and-media/nobel-prize-misfits-are-a-literary-whos-who/article15986213/
|
In Sweden, the Nobel Prize ceremonies are huge, like the Academy Awards for smart people. The banquet at Stockholm's city hall is "one of the most-watched television programs in Sweden," writes a columnist for Svenska Dagbladet, the daily newspaper that on Tuesday gave blanket coverage to the clothes, the menu, the after-party and the four naked people who rushed the doors to protest the continued imprisonment in China of dissident writer Liu Xiaobo. Live bloggers recorded the fact that the British-Israeli chemistry laureate Michael Levitt gave his acceptance speech in Swedish, and that, when Alice Munro's daughter Jenny dropped her purse on the steps of the banquet hall, gallant Prince Carl Philip quickly scooped it up for her.
Close attention to the event over the past century has been matched, on the literary side, by intense scrutiny not just of the winners, but of exactly what Alfred Nobel had in mind. "The history of the Literature Prize appears as a series of attempts to interpret an imprecisely worded will," writes Swedish literary historian Kjell Espmark on the Nobel website. The literary award has stood for different things at different times, depending on who was in charge and who won it.
In his will, Nobel calls for a prize for "the most outstanding work in an ideal direction," and that, like the other prizes, the winner should be someone "who, during the preceding year, shall have conferred the greatest benefit on mankind." He threw this task to "the Academy in Stockholm," by which everyone assumed he meant the Swedish Academy, a sleepy institution set up in 1786 by King Gustav III to protect "the purity, vigour and majesty" of the Swedish language. This parochial panel of 18 writers, linguists and other academics had no obvious credentials for running a major international literary prize. "It was simply not fit for the task," writes Espmark, an Academy member since 1981 and one of five on its current Nobel committee.
Some members wanted to reject the commission, but the then-permanent secretary Carl David af Wirsén shrewdly foresaw "the enormous power and prestige that the Nobel will bequeaths to the Eighteen" – a local term for the Academy. They were convinced not by what they could do for the prize, but by what it could do for them.
The Academy gave itself a (temporary) monopoly on nominations, ignored the "preceding year" part of Nobel's direction, and decided that "ideal direction" meant literature of a romantic idealistic bent. That suited the Academy's conservative mentality, and slammed the door on alleged pessimists, such as Leo Tolstoy, August Strindberg and Émile Zola. It became a cliché of prize citations for the first few decades to praise the "lofty idealism" of the winner's prose, or to say that it captured "the spirit of a nation." Gabriela Mistral's citation in 1945 linked the two conceits into one, praising her for voicing "the idealist aspirations of the entire Latin American world."
There has been plenty of retrospective grief about the victims of that policy, who included Marcel Proust, James Joyce and Virginia Woolf, and especially about Strindberg, the native Swede who was banished so completely that he was never even nominated. To get the measure of that affront, you have to know that a candidate can be named multiple times, "until the nominee either wins the prize or dies or the sponsors give up," as the Academy's website says. Strindberg died in 1912, meaning he was snubbed a dozen times. The Danish novelist Johannes Vilhelm Jensen was nominated a record 18 times before he prevailed in 1944. Home field advantage may be gauged by the fact that 15 Scandinavians have won the prize; five of the seven Swedish winners were in the Academy at the time.
In the 1930s, the Academy reinterpreted Nobel's "benefit of mankind" phrase to mean that the winner's work should be broadly accessible. That eliminated T.S. Eliot and Ezra Pound, and let in Sinclair Lewis (1930) and Pearl Buck (1938). The same thinking may have helped John Steinbeck win in 1962, although papers released earlier this year show that the endorsement was tepid (the Academy keeps its deliberations secret for 50 years). Eugene O'Neill also got a rough ride from dissenting jurors, who complained that his work was "not finished" and that he had "no culture." But his 1936 citation praised his "great poetic force."
After the Second World War, the Academy swung around to the idea that an "ideal direction" could be ground-breaking. The apex of this "pioneers" period, as Espmark calls it, was the award to Samuel Beckett in 1969.
Through this whole period, the Academy was almost entirely male. It has only ever had seven women members in its 227-year existence, including the five who are in it now. "The female element has been limited," says the Academy's website. That may have been a factor in the small number of women who have won the literary Nobel – just 13 of 110 winners, including the four who claimed it in the past decade.
Politics burst upon the prize in a big way in 1958, when winner Boris Pasternak was bullied by the USSR government into declining such a "bourgeois" honour. The Soviets weren't happy with Alexander Solzhenitsyn's win in 1970 either.
But no country has bent itself more out of shape over the literature prize than China, where in the past few decades, the award became a focus of a "near-pathological yearning for international prizes and 'face,'" writes Julia Lovell in her 2006 book, The Politics of Cultural Capital: China's Quest for a Nobel Prize in Literature. National competitions, conferences and delegations to Stockholm were bitterly rewarded in 2000, when the first prize to a Chinese writer went to Gao Xingjian, who lived in France and was no friend of the Chinese regime. The naked protesters at this year's award ceremony knew that Beijing would be watching.
The prize is now so international that people often ask – as Svenska Dagbladet readers did in a Q&A with Nobel literature committee chair Per Wästberg – whether the determining factor is literary quality or home address, and also how the Academy can possibly judge works it can often read only in translation. Neither worry seems to be an issue this year: Munro is said to have been a committee favourite, and not a compromise winner. But we'll have to wait 50 years to know for sure.
|
|||
correct_award_00093
|
FactBench
|
3
| 2
|
https://english.elpais.com/culture/2023-10-04/the-nobel-prize-in-literatures-history-of-snubs-and-long-forgotten-winners.html
|
en
|
The Nobel Prize in Literature’s history of snubs and long-forgotten winners
|
[
    "https://imagenes.elpais.com/resizer/v2/345TDPMZSVCWPPFHHUPL5KX75U.jpg?auth=5225c9778d3aa9d17f5cf858a0b3c8ffd1b5d17e8979880c164bf68dd0673759&width=1200"
] |
[] |
[] |
[
""
] | null |
[
"Sergio C. Fanjul"
] |
2023-10-04T00:00:00
|
Shaped by the political and literary climate of the time, the award does not assure a spot in the literary canon and often overlooks exceptional authors
|
en
|
https://static.elpais.com/dist/resources/images/favicon.ico
|
EL PAÍS English
|
https://english.elpais.com/culture/2023-10-04/the-nobel-prize-in-literatures-history-of-snubs-and-long-forgotten-winners.html
|
Most people would say that winning the Nobel Prize in Literature is a great honor. But a closer look at the history of the award might change their minds. Several publishers offer collections of works by Nobel Prize winners. Some people like to decorate their living rooms with unread books with gold-embossed leather covers. Open one up, and you may find something by Albert Camus, Gabriel García Márquez, William Faulkner, Samuel Beckett or Yasunari Kawabata — the Mount Olympus of the literary gods.
But in those elegant tomes of Nobel laureates, you may find some unfamiliar names, even if you are a well-read bibliophile. How about Sully Prudhomme, the French poet and essayist who won the first Nobel Prize for Literature in 1901? Or Verner von Heidenstam (1916), Frans Eemil Sillanpää (1939) and Johannes Vilhelm Jensen (1944)? Do you know about Giosuè Carducci (1906), Henrik Pontoppidan (1917) or Carl Spitteler (1919)?
Some deserving writers have followers and readers in their home countries but never achieve international recognition or a Nobel Prize. Some of their works remain untranslated, while others have not seen updated editions in years. Winning the Nobel Prize, the highest international literary honor, doesn’t ensure future recognition or the immortality sought by aspiring writers.
The Nobel Prize eras
One possible reason for all the forgotten Nobel laureates is the evolution of the Nobel Prize over time, says professor Juan Bravo, who wrote a book about Nobel Prize winners in literature. Until a few years after World War I, he says, the prize was very regional. “At times, the Nobel Prize academy is influenced by diplomatic criteria, shall we say. A glance at the first 20 Nobel laureates reveals mostly low-profile names, with a few exceptions like Kipling (1907), Tagore (1913) and possibly Selma Lagerlöf (1909).”
Moreover, the further back we go in the list of winners, the more we find unfamiliar names. Time has sifted and settled the layer of dust, slowly erasing their memory. “Each era has its own particular interests. Many books that used to be bestsellers have been relegated to the back shelf,” said Cristina Oñoro, a literature professor at the Complutense University of Madrid.
A gradual period of openness started in 1920, but after World War II, Bravo says the Nobel Prize for Literature became more introverted, especially after Jean-Paul Sartre’s abrupt rejection of his award in 1964. In 1990, a period of “globalization” began with Mexican author Octavio Paz’s win. “This doesn’t mean that there haven’t been periods of regression, like the one we’re currently experiencing,” said Bravo.
During this globalization, we first encountered authors from different corners of the world. These writers are often unknown because they hail from distant lands, not because of their lack of skill or time-tested relevance. The awards to Wole Soyinka of Nigeria (1986), Gao Xingjian (2000) and Mo Yan (2012) of China, and Abdulrazak Gurnah (2021) of Tanzania perplexed the literary community and made cultural journalists scramble to find out more. In the 20th century, 80% of the Nobel laureates in literature were citizens of the U.S., Canada or Europe, while Africa, Asia, and the rest of the Americas accounted for the remaining 20%. Interestingly, no individual born in Oceania has ever received the Nobel Prize for Literature. However, a noticeable shift is now taking place.
The Nobel Prizes introduced new names and expanded the literary horizons of Western societies, challenging their ethnocentrism. In recent decades, female winners have become more common, as 93% of the winners were male until 1990 — typically older, white European men. While Europeans and Americans still dominate, they have only won 66% of the awards in the last 15 years. However, there is no guarantee that these diverse winners won’t also fade into literary obscurity, like many of their predecessors. “I think some of the recent Nobel Prize winners might end up being completely forgotten in the future. Who even reads Soyinka, Louise Glück, or Szymborska nowadays, apart from some book clubs?” wonders Javier Aparicio Maydeu, a professor of Spanish and comparative literature at Pompeu Fabra University in Spain.
Once the annual awards are revealed, the inevitable debates ignite, fueling heated discussions and opinions. “I think they should validate the career of an author who has already proven their worth, like Mario Vargas Llosa,” said Aparicio. “The prizes being given now, you know, to discover new talents, I don’t think they align with Alfred Nobel’s original intentions.” Others say the presence of lesser-known names adds depth and variety. “Literature is a vehicle for sharing voices and cultures, so I think it’s positive that the Nobel Prize can serve to broaden our horizons,” said Oñoro.
Spanish winners
The Spanish winners of the Nobel Prize for Literature tell the same story. While Juan Ramón Jiménez (1956) is highly regarded, Vicente Aleixandre (1977) is an excellent representative of the Generation of ‘27, and Camilo José Cela (1989) still casts a long shadow, others like José Echegaray (1904) and Jacinto Benavente (1922) are not widely known or read anymore. Surprisingly, the prestigious award was never bestowed upon their contemporaries, Benito Pérez Galdós and Ramón Valle-Inclán.
While some award-winning authors have been forgotten by readers, other deserving writers have simply been overlooked by the Swedish Academy. Among them are living authors like Haruki Murakami (who won Spain’s Princess of Asturias award this year) and António Lobo Antunes. There was no Nobel for revered authors like James Joyce, Vladimir Nabokov, Philip Roth and Javier Marías. “Marías had all the qualities to win a Nobel Prize — a great body of work translated into multiple languages, and you can even find pocket editions of his books in bookstores when you travel abroad,” said Aparicio.
Many authors have left a lasting legacy without winning a Nobel, which proves that it is far from being the ultimate literary authority. “In no way,” said Bravo. “I’ll give you just one example. The canon of the 20th century novel has five names: Proust, Joyce, Kafka, Faulkner, and Virginia Woolf. Well, the only one who received the award was Faulkner, in 1949.”
Studying the impact of the Nobel Prize for Literature enables us to understand the mechanisms of literary fame and other types of celebrity. It does not ensure transcendence and lasting recognition. Authors may rise to prominence in one era, only to be forgotten in another. Conversely, forgotten authors can be rediscovered, while canonical figures in one country may be insignificant elsewhere. Success is fleeting, memento mori.
|
||||
correct_award_00093
|
FactBench
|
0
| 34
|
https://www.pinterest.com/pin/johannes-vilhelm-jensen--315111305162129740/
|
en
|
[] |
[] |
[] |
[
""
] | null |
[] |
2016-11-02T22:09:09+00:00
|
The Nobel Prize in Literature 1944 was awarded to Johannes Vilhelm Jensen "for the rare strength and fertility of his poetic imagination with which is combined an intellectual curiosity of wide scope and a bold, freshly creative style"
|
en
|
Pinterest
|
https://www.pinterest.com/pin/the-nobel-prize-in-literature-1944--381609768441497123/
| |||||||
correct_award_00093
|
FactBench
|
0
| 63
|
http://www.genordell.com/stores/spirit/NobelPrizeLit.htm
|
en
|
[] |
[] |
[] |
[
""
] | null |
[] | null | null | ||||||||||
correct_award_00093
|
FactBench
|
3
| 36
|
https://www.britannica.com/sitemap/j/20
|
en
|
Alphabetical Browse
|
[
"https://cdn.britannica.com/mendel/eb-logo/MendelNewThistleLogo.png",
"https://cdn.britannica.com/mendel/eb-logo/MendelNewThistleLogo.png"
] |
[] |
[] |
[
"britannica",
"reference",
"online",
"encyclopedia",
"encyclopaedia",
"store",
"dictionary",
"thesaurus"
] | null |
[] | null |
Explore the fact-checked online encyclopedia from Encyclopaedia Britannica with hundreds of thousands of objective articles, biographies, videos, and images from experts.
|
en
|
/favicon.png
|
Encyclopedia Britannica
|
https://www.britannica.com/sitemap/j/20
|
Jenny (airplane)
Glenn Hammond Curtiss: The Curtiss JN-4 (“Jenny”) was the standard training and general-purpose aircraft in American military service during the years prior to the U.S. entry into World War I. The NC-4, a multiengine Curtiss flying boat, made the first flight across the Atlantic Ocean in 1919, opening the…
Jenny (work by Lewald)
Fanny Lewald: The novels Clementine (1842) and Jenny (1843) describe circumscribed lives built around family virtues. Die Familie Darner, 3 vol. (1888; “The Darner Family”), and Von Geschlecht zu Geschlecht, 8 vol. (1863–65; “From Generation to Generation”), are realistic novels about the lives of family members over several generations. Diogena (1847) is…
Jenny Jones (American television show)
Television in the United States: Tabloid TV: Jenny Jones (syndicated, 1991–2003) specialized in guests with salacious and unconventional stories, usually of a sexual nature, and Ricki Lake (syndicated, 1993–2004) was designed especially for younger female audiences. Jerry Springer (syndicated, begun 1991) was the most extreme and notorious of the shows, presenting shocking…
Jenny Lind (carriage)
buggy: …top it was called a Jenny Lind.
Jenny Lind (furniture)
cottage furniture: …type was also called “Jenny Lind,” in honour of the famous Swedish soprano Jenny Lind, whose American concert tour with the American showman P.T. Barnum during the period of this style’s introduction made her name widely familiar.
Jenolan Caves (caves, New South Wales, Australia)
Jenolan Caves, series of caves constituting one of Australia’s best known tourist attractions, in east central New South Wales, 70 mi (113 km) west of Sydney. They comprise a series of tunnels and caverns formed by two converging streams in a thick bed of limestone at an elevation of 2,600 ft (800
Jenseits von Gut und Böse (work by Nietzsche)
ethics: Nietzsche: In Beyond Good and Evil (1886), he wrote with approval of “the distinguished type of morality,” according to which “one has duties only toward one’s equals; toward beings of a lower rank, toward everything foreign to one, one may act as one sees fit, ‘as one’s…
Jensen, Adolph E. (Danish anthropologist)
dema deity: Jensen. According to this myth, a dema man named Amenta found a coconut speared on a boar’s tusk and in a dream was instructed to plant it. In six days a palm had sprung from the nut and flowered. Amenta cut his finger, and his…
Jensen, Anina Margarete Kirstina Petra (British dancer)
Dame Adeline Genée was a dancer, choreographer, and teacher who was founder-president of the Royal Academy of Dancing. The daughter of a farmer, Anina Jensen was adopted at age eight by her uncle, Alexander Genée, director of a modest touring ballet company. Trained by her uncle and his wife,
Jensen, Bodil Louise (Danish actress)
Bodil Ipsen was a Danish actress who, with her frequent stage partner, the character actor Poul Reumert, reilluminated the dramas of Henrik Ibsen and August Strindberg. Ipsen first appeared on the stage at the Royal Theatre, Copenhagen, in 1909. She showed wide versatility, achieving success in
Jensen, Georg (Danish silversmith)
Georg Jensen was a Danish silversmith and designer who achieved international prominence for his commercial application of modern metal design. The simple elegance of his works and their emphasis on fine craftsmanship, hallmarks of Jensen’s products, are recognized around the world. Jensen was
Jensen, Gerrit (British artist)
Gerrit Jensen was a royal cabinetmaker of Louis XIV-style furniture, who became one of the most fashionable and foremost designers and craftsmen of his time. Apparently the first cabinetmaker to earn individual distinction in England, he became famous for his technique of metal- inlaid furniture
Jensen, J. Hans D. (German physicist)
J. Hans D. Jensen was a German physicist who shared half of the 1963 Nobel Prize for Physics with Maria Goeppert Mayer for their proposal of the shell nuclear model. (The other half of the prize was awarded to Eugene P. Wigner for unrelated work.) After obtaining his Ph.D. at the University of
Jensen, Jens (American landscape architect)
Jens Jensen was a highly original landscape architect whose public and private works, mostly in the U.S. Midwest, are marked by harmonious use of natural terrain and native flora. Jensen went to the U.S. in 1884 and settled in Chicago, where he was employed by the municipal West Side Park System
Jensen, Johannes Hans Daniel (German physicist)
J. Hans D. Jensen was a German physicist who shared half of the 1963 Nobel Prize for Physics with Maria Goeppert Mayer for their proposal of the shell nuclear model. (The other half of the prize was awarded to Eugene P. Wigner for unrelated work.) After obtaining his Ph.D. at the University of
Jensen, Johannes V. (Danish author)
Johannes V. Jensen was a Danish novelist, poet, essayist, and writer of many myths, whose attempt, in his later years, to depict man’s development in the light of an idealized Darwinian theory caused his work to be much debated. He received the Nobel Prize for Literature in 1944. Of old peasant
Jensen, Johannes Vilhelm (Danish author)
Johannes V. Jensen was a Danish novelist, poet, essayist, and writer of many myths, whose attempt, in his later years, to depict man’s development in the light of an idealized Darwinian theory caused his work to be much debated. He received the Nobel Prize for Literature in 1944. Of old peasant
Jensen, Michael C. (American economist)
financial agency theory: Theoretical development: …Financial Economics by financial economist Michael C. Jensen and management theorist William H. Meckling. Building on earlier work by the American economists Ronald Coase, Armen Alchian, and Harold Demsetz, Jensen and Meckling developed an economic model specifically designed to capture the essence of the principal-agent relationship.
Jenson, Nicolas (French printer)
Nicolas Jenson was a publisher and printer who developed the roman-style typeface. Apprenticed as a cutter of dies for coinage, Jenson later became master of the royal mint at Tours. In 1458 he went to Mainz to study printing under Johannes Gutenberg. In 1470 he opened a printing shop in Venice,
Jenson, Vicky (American animator and film director)
Jentinkia sumichrasti (mammal)
cacomistle: (formerly Jentinkia) sumichrasti ranges in forests from Central America to Peru. Larger, darker-furred, and more arboreal than the ringtail, it has pointed ears and nonretractile claws.
Jenyns, Soame (British writer)
English literature: Johnson’s poetry and prose: …in his relentless review of Soame Jenyns’s Free Inquiry into the Nature and Origin of Evil (1757), which caustically dissects the latter’s complacent attitude to human suffering, and his analytic capacities are evidenced at their height in the successful completion of two major projects, his innovative Dictionary of the English…
Jeollabuk-do (province, South Korea)
North Jeolla, do (province), southwestern South Korea. It is bounded by the provinces of South and North Chungcheong (north), North and South Gyeongsang (east), and South Jeolla (south), and by the Yellow Sea (west). The province is divided by the Noryong Mountains, a spur of the Taebaek Mountains.
Jeollanam-do (province, South Korea)
South Jeolla, do (province), extreme southwestern South Korea. It is bounded by North Jeolla province (north), South Gyeongsang province (east), Jeju Strait (south), and the Yellow Sea (west). Its coastline, including nearly 2,000 islands, of which three-fourths are uninhabited, is about 3,800
Jeong Seung-Hwa (South Korean general)
Chung Sŭng-Hwa was a Korean general and army chief of staff who was implicated in the October 1979 assassination of South Korean Pres. Park Chung-Hee. During the Korean War (1950–53), Chung helped defend Taegu (Daegu) against a North Korean assault. In 1961 he was made a brigadier general, and he built a
Jeonju (South Korea)
Jeonju, city and capital of North Jeolla do (province), southwestern South Korea. It is 21 miles (34 km) east of the Yellow Sea and is surrounded by steep hills with fortified castles. One of the oldest cities in Korea, Jeonju had its origins in the Three Kingdoms period (c. 57 bce–668 ce), when it
Jeopardy (film by Sturges [1953])
John Sturges: Bad, Magnificent, and Great: …was on more-familiar ground with Jeopardy (1953), a thriller that featured Barbara Stanwyck as a wife and mother who is menaced by a killer (Ralph Meeker) while on vacation in Mexico. Fast Company (1953), a musical comedy about horse racing, was a mismatch of director and material. Escape from Fort…
Jeopardy! (American television game show)
Jeopardy!, daily syndicated American television quiz show in which three contestants try to win money as they are quizzed on all types of trivia—history, science, math, geography, language, popular culture, and more. The long-running program, which has won dozens of Daytime Emmy Awards, is a
Jephté (opera by Montéclair)
Michel de Montéclair: …best known opera, or tragédie-lyrique, Jephté (1732), was banned by the Archbishop of Paris because of its biblical subject. It has a grandeur reminiscent of Lully and is known to have influenced Rameau. Other works include 20 French and 4 Italian cantatas (four books, 1709–28), a requiem, chamber music, and…
Jephtha (oratorio by Handel)
George Frideric Handel: Life: …the last of his oratorios, Jephtha, which was performed at Covent Garden Theatre, London, in 1752. He kept his interest in musical activities alive until the end. After his death on April 14, 1759, he was buried in Poets’ Corner in Westminster Abbey.
Jephtha (oratorio by Carissimi)
Giacomo Carissimi: …well displayed in his oratorio Jephtha, lasting about 20 minutes, where both solo narrator and chorus act as commentators and the latter also take the roles of opposing groups in the story. George Frideric Handel expanded this basic scheme in his oratorios. Carissimi greatly influenced later music not only through…
Jephthah (Hebrew leader)
Jephthah, a judge or regent (often a hero figure) of Israel who dominates a narrative in the Book of Judges, where he is presented as an exemplar of faith for Israel in its monotheistic commitment to Yahweh. Of the Israelite tribe in Gilead (present northwest Jordan), he was banished from his home
Jepsen, Carly Rae (Canadian singer, songwriter, and musician)
Carly Rae Jepsen is a Canadian singer, songwriter, and guitarist best known for the global pop phenomenon “Call Me Maybe,” which became the biggest-selling song in the world in 2012 and the best-selling domestic Canadian single in history. A self-professed “musical-theatre nerd,” Jepsen starred in
Jeqe, the Bodyservant of King Shaka (novel by Dube)
John Langalibalele Dube: …of Insila ka Shaka (1930; Jeqe, the Bodyservant of King Shaka), the first novel published by a Zulu in his native language.
Jequié (Brazil)
Jequié, city, southeastern Bahia estado (state), northeastern Brazil, on the Contas River, at 653 feet (199 metres) above sea level. It was elevated to city status in 1910. Jequié is the trade centre for a zone yielding mainly livestock and other agricultural products, as well as some manufactured
jequirity bean (plant)
jequirity bean, (Abrus precatorius), plant of the pea family (Fabaceae), found in tropical regions. The plant is sometimes grown as an ornamental and is considered an invasive species in some areas outside its native range. Although highly poisonous, the hard red and black seeds are attractive and
Jequitinhonha River (river, Brazil)
Jequitinhonha River, river, eastern Brazil, rising in the Serra do Espinhaço, south of Diamantina, Minas Gerais estado (state), and flowing northward and then east-northeastward across the uplands. At Salto da Divisa, it is interrupted by the Cachoeira (falls) do Salto Grande (140 ft [43 m] high).
Jerahmeel, Chronicles of (Jewish work)
Judaism: Medieval legendary histories and Haggadic compendiums: There is also the voluminous Chronicles of Jerahmeel, written in the Rhineland in the 14th century, which draws largely on Pseudo-Philo’s earlier compilation and includes Hebrew and Aramaic versions of certain books of the Apocrypha.
Jerba (island, Tunisia)
Jerba, island situated in the Gulf of Gabes on the Mediterranean Sea, located off the Tunisian mainland, to which it is connected by a causeway almost 4 miles (6 km) long. Jerba island is about 17 miles (27 km) long by 16 miles (26 km) wide and has an area of 197 square miles (510 square km). The
jerboa (rodent)
jerboa, any of 33 species of long-tailed leaping rodents well adapted to the deserts and steppes of eastern Europe, Asia, and northern Africa. Jerboas are mouselike, with bodies ranging from 5 to 15 cm (2 to 5.9 inches) in length and long tails of 7 to 25 cm. Certain traits are highly variable
Jere (people)
Zwangendaba: …African king who led his Jere people on a monumental migration of more than 1,000 miles (1,600 km) that lasted more than 20 years. A leader of incomparable stature, he took his initially small group (later called the Ngoni) from its original home near modern Swaziland to the western part…
Jeremiah (Hebrew prophet)
Jeremiah was a Hebrew prophet, reformer, and author of a biblical book that bears his name. He was closely involved in the political and religious events of a crucial era in the history of the ancient Near East; his spiritual leadership helped his fellow countrymen survive disasters that included
Jeremiah (work by Donatello)
Donatello: Early career: …bald head); and the so-called Jeremiah (actually Habakkuk) for the western niches. The Zuccone is deservedly famous as the finest of the campanile statues and one of the artist’s masterpieces. In both the Zuccone and the Jeremiah (1427–35), their whole appearance, especially highly individual features inspired by ancient Roman portrait…
Jeremiah Johnson (film by Pollack [1972])
Sydney Pollack: Film directing: …ode with the metaphysical western Jeremiah Johnson (1972), a beautifully photographed account of the life of a former soldier who escapes civilization by living in the Rocky Mountains, where he struggles to survive the perils of nature, Native Americans, and even the U.S. cavalry. Redford successfully sacrificed his golden-boy persona…
Jeremiah, The Book of (Old Testament)
The Book of Jeremiah, one of the major prophetical writings of the Old Testament. Jeremiah, a Judaean prophet whose activity spanned four of the most tumultuous decades in his country’s history, appears to have received his call to be a prophet in the 13th year of the reign of King Josiah (627/626
Jeremiah, The Lamentations of (Bible)
The Lamentations of Jeremiah, Old Testament book belonging to the third section of the biblical canon, known as the Ketuvim, or Writings. In the Hebrew Bible, Lamentations stands with Ruth, the Song of Solomon, Ecclesiastes, and Esther and with them makes up the Megillot, five scrolls that are read
Jeremiah, The Letter of (Old Testament)
The Letter of Jeremiah, apocryphal book of the Old Testament, in the Roman canon appended as a sixth chapter to the book of Baruch (itself apocryphal in the Jewish and Protestant canons). The work is supposedly a letter sent by Jeremiah to Jews exiled to Babylon by King Nebuchadrezzar in 597 bc,
Jeremias (Hebrew prophet)
Jeremiah was a Hebrew prophet, reformer, and author of a biblical book that bears his name. He was closely involved in the political and religious events of a crucial era in the history of the ancient Near East; his spiritual leadership helped his fellow countrymen survive disasters that included
Jeremias II (patriarch of Constantinople)
Jeremias II was the patriarch of Constantinople and one of the most capable leaders of the Greek Orthodox church. Elected patriarch in 1572 by popular acclaim, Jeremias immediately instituted a reform by disciplining the clergy and prosecuting simony (the sale and purchase of ecclesiastical
Jeremias, The Lamentations of (Bible)
The Lamentations of Jeremiah, Old Testament book belonging to the third section of the biblical canon, known as the Ketuvim, or Writings. In the Hebrew Bible, Lamentations stands with Ruth, the Song of Solomon, Ecclesiastes, and Esther and with them makes up the Megillot, five scrolls that are read
Jérémie (Haiti)
Jérémie, town, southwestern Haiti, on the northern shore of Pointe de (Cape) Tiburon, on the Gulf of Gonâve. It was founded in 1756, and the port was opened in 1807. It developed as a market and port for the produce (cacao, coffee, sugarcane, bananas, mangoes, logwood, and hides) of the fertile
Jerevan (national capital, Armenia)
Yerevan, capital of Armenia. It is situated on the Hrazdan River, 14 miles (23 km) from the Turkish frontier. Though first historically recorded in 607 ce, Yerevan dates by archaeological evidence to a settlement on the site in the 6th–3rd millennia bce and subsequently to the fortress of Erebuni
Jerez (alcoholic beverage)
sherry, fortified wine of Spanish origin that typically has a distinctive nutty flavour. It takes its name from the province of Jerez de la Frontera in Andalusia, Spain, sherry being an Anglicization of Jerez. The substance is also produced elsewhere—notably in Cyprus, South Africa, Australia, and
Jerez de García Salinas (Mexico)
Jerez de García Salinas, city, south-central Zacatecas estado (state), north-central Mexico. Formerly known simply as Jerez, the city is on the Jerez River, 6,650 feet (2,027 metres) above sea level and southwest of Zacatecas, the state capital. It is the commercial and manufacturing centre for an
Jerez de la Frontera (Spain)
Jerez de la Frontera, city, Cádiz provincia (province), in the comunidad autónoma (autonomous community) of Andalusia, southwestern Spain. It lies northeast of Cádiz city and near the north bank of the Guadalete River. Of obscure origin but probably identical with the Roman Asido Caesariana, the
Jergović, Miljenko (Croatian author)
Croatian literature: …writer of the younger generation, Miljenko Jergović, and, at the turn of the 21st century, by Zoran Ferić, Ante Tomić, and Julijana Matanović.
Jericho (town, West Bank)
Jericho, town located in the West Bank. Jericho is one of the earliest continuous settlements in the world, dating perhaps from about 9000 bce. Archaeological excavations have demonstrated Jericho’s lengthy history. The city’s site is of great archaeological importance; it provides evidence of the
Jericho Mile, The (film by Mann [1979])
Michael Mann: …his first directing credit for The Jericho Mile (1979), a made-for-TV movie about a prisoner training to run in the Olympics. Mann cowrote the teleplay with Patrick J. Nolan, and the pair won an Emmy Award for outstanding writing in a limited series or special.
Jericho, rose of (plant)
rose of Jericho, either of two species of unrelated plants known for their ability to survive dessication. The true rose of Jericho (Anastatica hierochuntica) is native to western Asia and is the only species of the genus Anastatica of the mustard family (Brassicaceae). The small gray plant curls
Jerimoth Hill (hill, Rhode Island, United States)
Jerimoth Hill, highest point (812 feet [247 metres]) in Rhode Island, U.S. It is near North Foster, 20 miles (32 km) west of Providence, near the Connecticut border. The hill is on land owned by Brown
jerk chicken (food)
jerk chicken, a spicy grilled-meat dish mostly associated with Jamaica but common throughout the Caribbean. Jerk refers to a style of cooking in which the main ingredient—which most often is chicken but may also be beef, pork, goat, boar, seafood, or vegetables—is coated in spices and slow-cooked
jerk nystagmus (physiology)
nystagmus: …the type referred to as jerk nystagmus the movements are sharper and quicker in one direction than in the other. Jerk nystagmus can occur normally, such as when one is dizzy (e.g., from spinning around in circles) or is watching objects pass by quickly from the window of a moving…
Jerk, The (film by Reiner [1979])
Carl Reiner: Film directing: Reiner then made The Jerk, one of the biggest hits of 1979 and the film that launched comedian Steve Martin on the path to screen stardom. Martin starred as a dim-witted man who, after discovering that—unlike his adoptive parents—he is not Black, moves to St. Louis, where he…
Jermyn, Henry, Earl of Saint Albans (English courtier)
Henry Jermyn, Earl of Saint Albans was a courtier, favourite of Henrietta Maria, queen of Charles I of England. It was rumoured, falsely, that he became her husband after the king’s execution (1649). He entered Parliament in 1625. In Henrietta Maria’s household he was made vice chamberlain (1628),
Jernberg, Sixten (Swedish skier)
Sixten Jernberg was a Swedish skier who was one of the most successful cross-country skiers of his era, amassing nine Olympic medals. Jernberg was originally a lumberjack by trade and first came to prominence as a skier in the 1954 world championships, where he finished fourth in the 30 km and
Jerne, Niels K. (Danish immunologist)
Niels K. Jerne was a Danish immunologist who shared the 1984 Nobel Prize for Physiology or Medicine with César Milstein and Georges Köhler for his theoretical contributions to the understanding of the immune system. Jerne was born of Danish parents and grew up in the Netherlands. After studying
Jerne, Niels Kaj (Danish immunologist)
Niels K. Jerne was a Danish immunologist who shared the 1984 Nobel Prize for Physiology or Medicine with César Milstein and Georges Köhler for his theoretical contributions to the understanding of the immune system. Jerne was born of Danish parents and grew up in the Netherlands. After studying
Jernigan (Florida, United States)
Orlando, city, seat (1856) of Orange county, central Florida, U.S. It is situated in a region dotted by lakes, about 60 miles (95 km) northwest of Melbourne and 85 miles (135 km) northeast of Tampa. The city is the focus for one of the state’s most populous metropolitan areas. The region was
Jernigan, Kenneth (American activist)
Kenneth Jernigan was an American activist and administrator who was a prominent opponent of discrimination against people with visual impairments. Jernigan grew up in Tennessee on a family farm. Although he had been born blind, Jernigan had a typical farm upbringing, doing chores and playing
Jero’s Metamorphosis (play by Soyinka)
Wole Soyinka: … (performed 1960; published 1963) and Jero’s Metamorphosis (1973). But his more serious plays, such as The Strong Breed (1963), Kongi’s Harvest (opened the first Festival of Negro Arts in Dakar, 1966; published 1967), The Road (1965), From Zia, with Love (1992), and even the parody King Baabu (performed 2001; published…
Jerobaal (biblical figure)
Gideon, a judge and hero-liberator of Israel whose deeds are described in the Book of Judges. The author apparently juxtaposed two traditional accounts from his sources in order to emphasize Israel’s monotheism and its duty to destroy idolatry. Accordingly, in one account Gideon led his clansmen of
Jeroboam (kings of Israel)
Jeroboam, in the Bible, either of two kings of northern Israel. The events of their reigns are recorded chiefly in 1 and 2 Kings and 2 Chronicles. (10th century bce), son of Nebat, was a corvée overseer under Solomon, who incurred the suspicion of the king as an instrument of the popular democratic
Jeroboam I (king of Israel)
Jeroboam: Jeroboam I: (10th century bce), son of Nebat, was a corvée overseer under Solomon, who incurred the suspicion of the king as an instrument of the popular democratic and prophetic parties. He fled to Egypt but was recalled by the northern tribes on the refusal…
Jeroboam II (king of Israel)
Jeroboam: Jeroboam II: (8th century bce), son of Joash, was the last of the great kings of Israel, after whose death the country fell into confusion and ultimate servitude. Aided, perhaps, by Assyrian pressure from the east, he brought to an end the long struggle between…
Jerome (recording by Lizzo)
Lizzo: Music career and solo success: …such hits as “Juice,” “Jerome,” and “Tempo,” the latter of which was a collaboration with Missy Elliott. The album’s deluxe edition also contained the bonus track “Truth Hurts” (which was first released in 2017), and it became her first single to top the Billboard Hot 100 chart. The song…
Jérôme (king of Westphalia)
Jérôme Bonaparte was Napoleon I’s youngest brother, who became king of Westphalia and marshal of France. It was through Jérôme that the Bonaparte line extended into the United States; his eldest son, Jerome, grew up in Maryland with his American mother. The Bonaparte family had endured poverty and
Jerome of Prague (Czech philosopher)
Jerome of Prague was a Czech philosopher and theologian whose advocacy of sweeping religious reform in the Western church made him one of the first Reformation leaders in central Europe. A student at the Charles University of Prague, Jerome came under the influence of the Czech reformer Jan Hus,
Jerome Robbins Foundation (United States charitable organization)
Jerome Robbins: …organization bearing his name, the Jerome Robbins Foundation. Originally intended to fund dance and theatre projects, the foundation also provided financial support to projects combating the effects of the AIDS crisis. In accordance with Robbins’ earlier wishes, in 2003 the foundation awarded the first Jerome Robbins Prizes in recognition of…
Jerome Robbins’ Broadway (American musical)
Jason Alexander: Career: …musical for his performance in Jerome Robbins’ Broadway in 1989. He won the Screenwriters Guild Award for outstanding performance by a male actor in a comedy series for his work on Seinfeld in 1995. He sang and was part of the writing team that won a Daytime Emmy Award for…
Jerome, Jerome K. (English writer)
Jerome K. Jerome was an English novelist and playwright whose humour—warm, unsatirical, and unintellectual—won him a wide following. Jerome left school at the age of 14, working first as a railway clerk, then as a schoolteacher, an actor, and a journalist. His first book, On the Stage—and Off, was
Jerome, Jerome Klapka (English writer)
Jerome K. Jerome was an English novelist and playwright whose humour—warm, unsatirical, and unintellectual—won him a wide following. Jerome left school at the age of 14, working first as a railway clerk, then as a schoolteacher, an actor, and a journalist. His first book, On the Stage—and Off, was
Jerome, Jharrel (American actor)
Moonlight: …with Kevin (now played by Jharrel Jerome). Terrel then manipulates Kevin into knocking Chiron down to be kicked and beaten by Terrel and his cohort. When Chiron returns to school, he strides into the classroom and smashes a chair into Terrel. Chiron is then taken away in handcuffs.
Jerome, St (Christian scholar)
St. Jerome (feast day September 30) was a biblical translator and monastic leader, traditionally regarded as the most learned of the Latin Fathers. He lived for a time as a hermit, became a priest, served as secretary to Pope Damasus I, and about 389 established a monastery at Bethlehem. His
Jeronimo (work by Kyd)
Thomas Kyd: …English dramatist who, with his The Spanish Tragedy (sometimes called Hieronimo, or Jeronimo, after its protagonist), initiated the revenge tragedy of his day. Kyd anticipated the structure of many later plays, including the development of middle and final climaxes. In addition, he revealed an instinctive sense of tragic situation, while…
Jeronimo de Cevallos (painting by El Greco)
El Greco: Later life and works of El Greco: El Greco’s portrait of Jeronimo de Cevallos (1605–10), on the other hand, is most sympathetic. The work is half-length, painted thinly and limited to black and white. The huge ruff collar, then in fashion, enframes the kindly face. By such simple means, the artist created a memorable characterization that…
Jerónimos Monastery (monastery, Lisbon, Portugal)
Lisbon: The Age of Discovery: …site in 1983, and the Jerónimos Monastery, about 4 miles (6 km) downstream from the city centre, are far less exuberant than those in the rival Portuguese cities of Batalha and Tomar. The tower and the monastery are nevertheless the most important architectural monuments in the Lisbon area. The five-story…
Jerrold, Douglas William (English playwright, journalist, and humorist)
Douglas William Jerrold was an English playwright, journalist, and humorist. Jerrold achieved success in the theatre with Black-Eyed Susan (1829), a nautical melodrama that draws on the patriotic tar (sailor) while critiquing authoritarianism in the British Navy. He also mastered a special brand of
Jerry and Marge Go Large (film by Frankel [2022])
Annette Bening: Career: …Nile and in the comedy-drama Jerry and Marge Go Large opposite Bryan Cranston. She garnered her fifth Oscar nomination for her portrayal of long-distance swimmer Diana Nyad in the sports biopic Nyad (2023).
Jerry Maguire (film by Crowe [1996])
Paula Abdul: Choreography: …on such popular hits as Jerry Maguire (1996), American Beauty (1999), and Black Knight (2001). In 1989 she earned an Emmy Award for her choreography on The Tracey Ullman Show.
Jerry Springer Show, The (American television show)
Jerry Springer: …and politician, best known for The Jerry Springer Show, a daytime talk show featuring controversial topics and outrageous guest behaviour.
Jersey (island, Channel Islands, English Channel)
Jersey, British crown dependency and island, the largest and southernmost of the Channel Islands, lying south of England’s coast and 12 miles (19 km) west of the Cotentin peninsula of France. Its capital, St. Helier, is 100 miles (160 km) south of Weymouth, England. Jersey is about 10 miles (16 km)
Jersey (breed of cattle)
Jersey, breed of small short-horned dairy cattle originating on Jersey, one of the Channel Islands; it is believed to have descended from French cattle. The colour of the Jersey is usually a shade of fawn or cream, but darker shades are common. In the late 18th century measures were passed
jersey (clothing)
sweater, outer garment, usually knitted or crocheted, that is worn on the upper part of the body, either pulled over the head or buttoned down the front or back. Although hand knitting of wool had been practiced for about 2,000 years, it was not until the 15th century that the first knitted shirts
Jersey Act (British history)
Jersey Act, resolution passed in 1913 by the English Jockey Club and named after its sponsor, Victor Albert George, 7th Earl of Jersey, one of the club stewards. It declared that the only horses and mares acceptable for registration in the General Stud Book would be those that could be traced in
Jersey Boys (film by Eastwood [2014])
the Four Seasons: …Jersey Boys (2006; film 2014). The Four Seasons were inducted into the Rock and Roll Hall of Fame in 1990.
Jersey Boys (musical theater)
the Four Seasons: Tony Award-winning Broadway musical Jersey Boys (2006; film 2014). The Four Seasons were inducted into the Rock and Roll Hall of Fame in 1990.
Jersey City (city, New Jersey, United States)
Jersey City, city, seat (1840) of Hudson county, northeastern New Jersey, U.S. It is situated on a peninsula between the Hudson and Hackensack rivers, opposite Manhattan Island, New York City, with which it is connected by the Holland Tunnel and the Port Authority Trans-Hudson rapid transit system.
identifier: correct_award_00093
dataset: FactBench
rank: 0 | 62
url: https://www.linkedin.com/posts/nobelprize_at-noontime-in-midsummer-when-the-sun-is-activity-7154396388611530752-GkpJ
language: en
title: The Nobel Prize on LinkedIn: “At noontime in midsummer, when the sun is at its highest and everything…
top_image / meta_img: https://media.licdn.com/dms/image/D4E10AQGWDMIxjoY5pg/image-shrink_1280/0/1705741021160?e=2147483647&v=beta&t=u9jmVhO3wYMCsF0XiAz1QupVW0SNnRxHOdLEb1Nxe_Q
images: ["https://media.licdn.com/dms/image/D4D3DAQHA7z_U5pIOrw/image-scale_191_1128/0/1696924261282/nobelprize_cover?e=2147483647&v=beta&t=l6mYC2BiWwAN7v4EYmDH84xbXDZnTa_CSTQdzbiEBbA"]
movies: [] | keywords: [] | meta_keywords: [""] | tags: null
authors: ["The Nobel Prize"]
publish_date: 2024-01-20T08:57:01.282000+00:00
meta_description: “At noontime in midsummer, when the sun is at its highest and everything is in a state of embroiled repose, flashes may be seen in the southern sky. Into the…
meta_lang: en
meta_favicon: https://static.licdn.com/aero-v1/sc/h/al2o9zrvru7aqj8e1x2rzsrca
canonical_link: https://www.linkedin.com/posts/nobelprize_at-noontime-in-midsummer-when-the-sun-is-activity-7154396388611530752-GkpJ
text:
"First, dream. And when you dream, you should make your dream as big as you can." Tawakkol Karman's dream for a more just world with freedom of expression and equal rights for all individuals, positioned her as a courageous leader in Yemen during the Arab Spring. Despite imprisonment and persecution for her journalism and advocacy for democracy and freedom of speech, she continued to stand firm in her beliefs under an oppressive regime. She was awarded the peace prize in 2011 for her non-violent efforts to promote peace and her struggle for women’s rights. Watch Karman's Nobel Prize lecture here: https://bit.ly/3a19mHr #NobelPrize
"It just so happens that people who value freedom the most are often deprived of it" – Ales Bialiatski's 2022 Nobel Peace Prize lecture. The human rights activist is currently imprisoned in one of the harshest prisons in Belarus, having been jailed for 10 years without trial or conviction in the wake of demonstrations against President Lukashenko's dictatorship in 2021. It is not the first time that Bialiatski has been targeted by the country’s authorities. He has worked to promote democracy and human rights in Belarus since the 1980s and been imprisoned before. In response to a crackdown on protests, he founded Viasna (Spring) to provide support for the incarcerated demonstrators and their families. Last year, it was reported that Bialiatski had been denied contact with his lawyer and relatives, and was being held in a cell-type room. Recently it has been reported that his health is deteriorating yet medicines being provided by his family are not reaching him in jail. Learn more about Bialiatski's work: https://lnkd.in/g9-cXpAv
What is your definition of a writer? "All good books are alike in that they are truer than if they had really happened and after you are finished reading one you will feel that all that happened to you and afterwards it all belongs to you: the good and the bad, the ecstasy, the remorse and sorrow, the people and the places and how the weather was. If you can get so that you can give that to people, then you are a writer." - Nobel Prize laureate Ernest Hemingway's definition of a writer. #NobelPrize
"When I was nine years old, my parents gave me a chemistry set. Within a week, I had decided to become a chemist and never wavered from that choice. As I grew my interest in chemistry grew more intense, if not more sophisticated. (...) I was not a particularly distinguished student as a child. My grades were good but obtained more by steady work than any brilliance on my part. I vividly remember my father telling me that one of my elementary school teachers had told him that I was not brilliant but I was a steady hard worker. Somehow the further I progressed in school, the easier it became to do well." Remembering Robert Curl who shared the 1996 Nobel Prize in Chemistry with Harold Kroto and Richard Smalley "for their discovery of fullerenes". From an early age Robert Curl realised that he wanted to become a chemist, read his life story: https://bit.ly/2On128N #NobelPrize
"I think writing is a kind of gift. A new novel or a new play, it's a gift I get... I need to have breaks or pauses when I don't write. You can't get gifts all the time." Literature laureate Jon Fosse in our new podcast episode: https://lnkd.in/eUA9wtbj #NobelPrize
“They made me fall in love with quantum mechanics and atomic physics,” said physics laureate Anne L’Huillier of two “great teachers”. She benefitted from being taught by Claude Cohen-Tannoudji and Serge Haroche, who would be awarded the Nobel Prize in Physics in 1997 and 2012 respectively. L'Huillier was awarded the Nobel Prize in Physics 2023 for experimental methods that generate attosecond pulses of light for the study of electron dynamics in matter. Learn more about her life and work: https://lnkd.in/egnFT-SC Were you inspired by any great teachers?
identifier: correct_award_00093
dataset: FactBench
rank: 3 | 60
url: https://www.laboutiquedanoise.com/en/designers
language: en
title: Designers
top_image / meta_img: https://www.laboutiquedanoise.com/img/favicon.ico?1663773290
images:
[
"https://www.laboutiquedanoise.com/img/logo-1663773290.jpg",
"(… several hundred further laboutiquedanoise.com product-thumbnail URLs omitted …)",
"https://www.laboutiquedanoise.com/img/su/328-manu_default.jpg",
"https://www.laboutiquedanoise.com/img/su/86-manu_default.jpg",
"https://www.laboutiquedanoise.com/img/su/163-manu_default.jpg",
"https://www.laboutiquedanoise.com/img/su/79-manu_default.jpg",
"https://www.laboutiquedanoise.com/img/su/264-manu_default.jpg",
"https://www.laboutiquedanoise.com/img/su/59-manu_default.jpg",
"https://www.laboutiquedanoise.com/img/su/249-manu_default.jpg",
"https://www.laboutiquedanoise.com/img/su/241-manu_default.jpg",
"https://www.laboutiquedanoise.com/img/su/100-manu_default.jpg",
"https://www.laboutiquedanoise.com/img/su/142-manu_default.jpg",
"https://www.laboutiquedanoise.com/img/su/271-manu_default.jpg",
"https://www.laboutiquedanoise.com/img/su/en-default-manu_default.jpg",
"https://www.laboutiquedanoise.com/img/su/233-manu_default.jpg",
"https://www.laboutiquedanoise.com/img/logo-1663773290.jpg"
] |
[] |
[] |
[
""
] | null |
[] | null |
Designers list
|
en
|
https://www.laboutiquedanoise.com/img/favicon.ico?1663773290
|
La Boutique Danoise
|
https://www.laboutiquedanoise.com/en/designers
|
Aino Aalto (1894 - 1949) is a famous Finnish architect and designer. She studied architecture at the Helsinki University of Technology, as did her husband, the famous Alvar Aalto. In 1923, she went to work for the office of Gunnar A. Wahlroos in Jyväskylä, then left this position the following year to collaborate with Alvar Aalto.
This marked the beginning of a long and productive partnership. In 1935, together with Alvar Aalto, Maire Gullichsen and Nils-Gustav Hahl, Aino Aalto founded Artek, an internationally renowned furniture and lighting company. She played an important role in the company and served as its general manager from 1941 to 1949. Aino Aalto's career as a furniture and interior designer, architect and photographer is best known for her work in glass. Among her most famous works is the Aino Aalto glassware line, created in 1932 under the name Bölgeblick, which means "water rings."
Andreas Ferdinand Riise Bergsaker (b. 1990) is a Norwegian product designer based in Copenhagen. He graduated with a master's degree in product design from Oslo and Akershus University College of Applied Sciences (HiOA) in June 2015.
His work covers various everyday objects, designed for product longevity and generations to come. His designs are rooted in the exploration of material and shape, and reference cultural heritage with an emphasis on traditional craftsmanship, industrial production processes and tactile qualities. His work has led to exhibitions in London, Milan, New York, Oslo and Stockholm.
Arne Hovmand-Olsen had the world in mind, both before and after the 1950s, when he became one of Denmark's renowned furniture designers. A farmer's son and aesthete by nature, he began as a cabinetmaker's apprentice, before studying furniture design and pursuing his passion for visual composition and furniture design. In 1944, Arne Hovmand-Olsen opened his own design firm, where he combined the details of classic craftsmanship with innovative, modern design. He was a pioneer, not only in the field of architecture, but also in exporting designer furniture to meet the great interest abroad. In consultation with Arne Hovmand-Olsen's family, Warm Nordic continues the legacy of this great furniture designer and will regularly launch other classics from the House of Arne Hovmand-Olsen.
Arne Vodder was a leading light of what might be called the “second generation” of forward-thinking 20th-century Danish furniture designers — those who, following in the footsteps of Hans Wegner, Arne Jacobsen, Finn Juhl and others, first applied the skills, traditions and philosophical tenets of Danish craftsmanship to a modern furniture idiom.
As a graduate student in architecture trained by Finn Juhl, Vodder made his mark in the 1960s, when modern design had gained wide acceptance, particularly in the business world. Accordingly, many of Vodder’s chairs are quiet in form — projecting an air of sturdiness and strength, rather than avant-garde styling.
Vodder’s aesthetic flair was very pronounced in his cabinets and storage pieces — sideboards, bookcases, credenzas and buffets. In such pieces, Vodder liked to play with asymmetry. His bookcases often have a seemingly random array of variously sized shelves and nooks. A typical Vodder sideboard might have four sections, each different in purpose and look: an open stack of vertical shelves, and other cupboards covered with sliding panels in contrasting colored laminates and wood veneers. As you will see from the works on these pages, Arne Vodder had a sense of what kind of design was appropriate for which space: sobriety in the boardroom; playfulness at home.
Bent Karlby (1912-1998) was a Danish designer best known for his lighting fixtures: he is considered one of the most prolific and versatile Danish lighting designers of the 20th century. Many of his creations are based on various organic forms that indirect light adorns with details: for example, perforations slightly filter the light to make it warm. Towards the end of his career, however, he also designed several functional lighting fixtures with a strong geometric aesthetic. Bent Karlby worked with the Danish brand LYFA for more than 40 years and is probably best known for the lights he designed for the firm.
Henrik Blum (born 1960, Denmark) and Rune Balle Olesen (born 1957, Denmark).
Blum and Balle are industrial designers educated at the Danish Design School. Balle Olesen is also a qualified silversmith from Georg Jensen. Since 1996 Henrik and Rune have run the design company Blum & Balle. Their work methods are characterized by teamwork comprising several experts, such as technical specialists. At the same time they also find room for workplace specialists and information psychologists.
Blum & Balle have designed products and design solutions for Danish companies, including tables and chairs. All of these pieces of furniture are excellent examples of how form and function may be united.
An architect, designer, film director and professor, Charles Eames founded his own architecture and design studio in 1930. With Eero Saarinen, he won first prize in the 1940 design competition of the Museum of Modern Art in New York.
Together with his wife Ray, he helped create a thoroughly modern style of furniture, produced by the Herman Miller firm; the Lounge Chair is one of their most famous and most comfortable designs. The couple also worked together as photographers and filmmakers. Since 1957, Charles and Ray Eames's furniture has also been produced by the Vitra firm.
This legendary design couple influenced generations through their way of life, their philosophy and their perennially modern work.
It is the details that distinguish Charlotte Høncke's design. One of the great passions of this Danish designer is upholstered furniture, where original seams, a play of colors or a surprising shape arouse our curiosity. In Charlotte Høncke's creations, the richness of details coexists with pure forms and lines, while taking functionality into account. Charlotte Høncke immerses herself in the process of creating exquisite designs for the home, capturing a sense of wholeness, comfort and a timeless aesthetic, which not only recalls history but also embraces the spirit of our times.
Christian Dell was born in Offenbach am Main in Hesse. He completed the silver forging studies at the academy in 1911 and from 1912-13 he studied at the Saxon college of arts and crafts in Weimar. From 1922 to 1925 he worked as a foreman of the metal workshop at the Bauhaus in Weimar where he was the man behind a highly innovative and pioneering style of design.
After World War II, Christian Dell manufactured silver goods and opened a jeweller's shop in Wiesbaden in 1948, which he operated until 1955. He died in Wiesbaden in 1974.
Born in London in 1974. He graduated in 1998 from the Department of Product Design at Kingston University, London, with a thesis on "advanced materials". He started his activity working on several design projects, collaborating with a few design studios until, in 2004, he founded his design studio based in Stockholm. The office focuses on combining technological research and innovation. His projects are often characterized by a sense of inventiveness and a poetic quality. Since 2005, he has been a guest lecturer in the industrial design course at LTH University in Sweden.
Frits Henningsen (1889-1965) was known as an uncompromising designer. He viewed quality craftsmanship as the most important element of his work, making it his focus when developing new furniture. Unlike other cabinetmakers, Henningsen always created his own furniture pieces - although his greatest desire was to be recognized as a cabinetmaker and not as a furniture designer.
In his designs, Henningsen was able to capture traditional expressions from other style periods and reinterpret them with a modern and organic flavor. He drew inspiration from earlier styles such as French Empire, Rococo and British 17th-century furniture, much like one of the other great furniture designers of his day, Kaare Klint.
During Frits Henningsen's lifetime, Carl Hansen & Son was one of only two furniture producers entrusted with producing his designs.
With a background of working with several projects in the Swedish film industry, Gunilla enrolled at Konstfack, the University College of Arts, Crafts and Design in Stockholm, in 1983, to study interior and furniture design.
Gunilla became Lammhults' first female designer, and her continued work for Lammhults has resulted in a wide range of furniture designs.
The statement of the jury of the George Jensen prize received in 1996, one of many prestigious awards received, well describes her products. ”… strict, minimalistic and elegant design. The proportions and details of her furniture reveal a genuine sense of quality.”
Gunilla's collaboration with several manufacturers, as well as her designs for kitchens, glassware, lighting and carpets, makes her one of the foremost names in contemporary Scandinavian design.
Helge Sibast (1908-1985) was a Danish cabinetmaker and designer who continued the work of his father, Peder Olsen Sibast. Besides being a skillful carpenter, Helge Sibast was a visionary designer, and in the 1950s and 1960s his simple, functional and high-quality designs gained success outside Denmark, even making their way to the White House. His best-known designs, the Sibast chairs No. 7 and No. 8, date from 1953. In the 21st century, Sibast's tradition is carried on by Sibast Furniture, which was relaunched in 2012 by Helge Sibast's grandson and his wife.
ISKOS — BERLIN Design is a partnership of Boris Berlin and Aleksej Iskos.
The company works within the fields of industrial, furniture and graphic design. The firm was only established in late 2010, but Boris and Aleksej have been working together for a number of years. In 1987, Boris co-founded Komplot Design, where Aleksej was employed as an assistant for more than 11 years. This long and fruitful cooperation has allowed them to develop a shared philosophy of design.
Jeppe Utzon is a third-generation architect - and fourth-generation designer. He has achieved international acclaim for his work, and the cross-creative DNA comes from no stranger. As the grandson of the famous Danish architect Jørn Utzon, who created the Sydney Opera House, Jeppe Utzon continues a rich family tradition through his work in the art of architecture. He runs a design studio in the centre of Copenhagen, and is doing projects for clients in Europe, Africa, Asia, and Latin America. The projects are as diverse as private luxury homes in Denmark, eco holiday resorts in Nicaragua, iconic commercial buildings in Shanghai and floating social housing in Bolivia.
He works by reducing the product to its essence, using mechanical solutions as the basis for innovative design. That's how he thinks; that's why he gets up in the morning. He wants to come up with clever ideas, not just make something beautiful.
He holds a master's degree in industrial design engineering from Chalmers, Göteborg (2005), and studied furniture design at Stenebyskolan in 2002.
He now runs a design studio in Göteborg, Sweden, where he works on furniture and product design projects.
Markus Johansson graduated in 2011 from HDK (School of Design and Crafts) Goteborg, Sweden.
His studio is located in central Goteborg, where he specializes in the design and development of furniture, products, graphics and lighting.
Markus works with both Swedish and international design companies and has won numerous awards which also includes his highly acclaimed Nest chair which was exhibited at The Museum of Art and Design, New York.
Markus explains about his work:
"We try to find inspiration from various environments stretching from the human to the artefact, through analysing and processes. Our vision is to combine construction, function and form to enrich the everyday experience. Environmental thinking is of course important, but shouldn’t hinder creativity."
The work of Danish architect Mette Schelde (born 1985) ranges from product design to architecture and spatial planning. Rethinking the conventional, her works are sensory and functional. The studio was founded in 2012.
With a simple, sculptural expression, her creations take shape through an intuitive process that always puts people first. She is constantly exploring new methods and technologies that come and go in a mix of tradition and renewal.
Norwegian designer Mia Hamborg (b. 1980), who holds a BA in furniture design from the Steneby School of Crafts and Design in Gothenburg, has a passion for order, functionality, and colour. With her joyous and playful furniture, she continues an old Nordic tradition of wood turning and brings new life to the material. She credits designer Sir Terence Conran for acknowledging that modern people often live with limited space, making storage a design issue. She doesn't follow any strict design tradition; the joy of life is her inspiration.
The Swedish architect Nils Strinning (1917-2006) was one of the most prominent designers of the mid-20th century. He was one of those who laid the foundations for what is now known as Scandinavian design. Nils Strinning is the creator of the famous String shelving system, a lightweight shelving system created in the 1940s that is still very popular today. The String system is simple to assemble and the shelves can easily be repositioned.
Norway Says is a company based in Oslo, designing furniture, interiors and products. They have been awarded the designer of the year award in Norway and have received several national and international design awards for their work. Norway Says’ designers are Torbjørn Anderssen (b. 1976), Andreas Engesvik (b. 1970), Espen Voll (b. 1965), and Hallgeir Homstvedt (b. 1977). Kjersti Bekken (b. 1977) joined Norway Says in 2006 to run the Norway Says Shop. Norway Says has a broad and international approach to design and works with various national and international clients. Norway Says’ company culture is based on strong ideas, long-term friendship, and knowledge.
Founded in Stockholm in 2007, NOTE has quickly become an established design studio.
NOTE is a multidisciplinary studio working within the fields of architecture, interiors, product design, graphic design and design strategy. As designers, they constantly observe and explore their environment and, through their areas of expertise, try to inspire others to do the same.
By identifying what is unique about each project, they transform non-material values into tactile objects and spaces.
Øivind Slaatto, industrial designer, born 1978. Øivind finds great inspiration in nature, particularly in the snail - with regard to both the construction of the snail shell and the slow pace of the snail. In addition, the Fibonacci sequence, fractals and other basic mathematical principles fascinate him. These sources of inspiration formed the basis for his very first design after getting his degree: for B&O he made the loudspeaker A9, which was introduced in 2012. The same principles have inspired his new series of lamps - the Swirl. The lamps have actually been in progress ever since he completed his studies in 2008, but Øivind has worked intensely on improving and polishing their features. His desire that light, form and function should work together to provide a handsome light experience, while the design of the lamp remains quite airy, ties in very well with the Scandinavian tradition. As such, it is in complete harmony with LE KLINT's collection, which these lamps will now become a part of.
Oki Sato, born in 1977 in Toronto, Canada, studied architecture at Waseda University in Tokyo.
He obtained his Master’s degree in 2002. In the same year, the nendo studio was born in Tokyo.
Now he has taken up the international stage, with offices in Tokyo and Milan.
Nendo, meaning 'modelling clay', expresses his desire for a certain flexibility and the ability to reinvent oneself. The designer draws inspiration from Japan's uncluttered style to create a language of his time.
The philosophy of Oki Sato is reflected in his designs, thanks to which he transforms the interactions of people with the objects surrounding them by creating a parenthesis in their life.
This desire is found in the uncluttered and characteristic shapes of his designs, to which he always adds a touch of humour and conviviality.
Ole Gjerløv-Knudsen (1930-2009) was a productive Danish designer best known for his OGK Safari bench and chair, as well as his Modeline collection, designed in collaboration with Tørben Lind.
After completing his training as a cabinetmaker, Gjerløv-Knudsen studied furniture design at the Danish School of Art, Craft and Architecture at the Royal Danish Academy of Fine Arts. He later taught at the Faculty of Architecture at the Danish School of Art and Crafts, and served as rector of his former school from 1967 to 1990.
Pekka Koivikki was born in 1975 and lives in Helsinki, Finland. He is trained as an industrial designer and holds a Master of Arts.
He designs and produces tailored furniture for private and public purposes. He is trained to use design tools of a high standard, and he has more than two decades of first-hand workshop experience making furniture for clients. This combination of design and practical knowledge enables efficient use of available resources to produce quality results.
Piet Hein (1905-1996) was a Danish scientist, mathematician, inventor, designer, author, and poet.
He mastered it all and made Danish cultural life a little richer, a little prettier and a little wittier with his trademark line, his square head and his quirky language. For him there was no unbridgeable gap between the subjectivity of the fine arts and the objective world of science. He had a rare ability to bring poetry, geometry and design together in a harmonious design, where function and art are weighted equally.
His superellipse was to solve the double contrast between the circle and the square and that of the ellipse and the rectangle. Piet Hein's superellipse shape was adopted by various Scandinavian industries for use in architectural design, furniture, and household consumer products.
In addition to his work on the development of the superellipse, Piet Hein executed a great many other design commissions, from wooden boards to the elegant Sinus lamp.
"Art is solution to problems which cannot be formulated clearly before they have been solved".
Piet Hein
Rainer Bachschmid was born in southern Germany on Lake Constance. In addition to his training as a carpenter, he studied industrial design at the University of Wuppertal. After graduating in 1994, he worked with various designers such as Babel Design and Moll Design. Later, Rainer Bachschmid moved to Switzerland. He worked for 10 years as a development and design manager for the furniture factory Reinhard AG.
In 2006 he founded his own design studio and in 2009 rabadesign GmbH from Switzerland. Since 2005, Rainer Bachschmid has also been a design lecturer at the Bürgenstock Training Center for Master Carpenters (VSSM). He currently lives in Beckenried, Switzerland, with his wife and two children.
Roger Persson was born in 1967 in Karlskrona, trained as an industrial designer at HDK / University of Gothenburg, and also studied furniture design at the School of Visual Arts in New York.
He then opened a design studio with two colleagues. Since 2009 he has been running his own studio, Roger Persson Design, which designs furniture and other products - from lighting, switches and 3D scanners to toys.
His furniture has been exhibited throughout Europe, Japan and the USA.
Sakari Hartikainen is a versatile Finnish industrial designer and manufacturer, working internationally and extensively in the fields of product design, furniture and concepts. Flexibility and the courage to explore new possibilities are accompanied by a humble hand in design and creativity. His solid experience in working with different materials, and especially his knowledge of the possibilities and character of wood, plays a major role in simple, intelligent and cleanly finished products. He aims to design and create products that have the potential to form an emotional connection between the object and the end user - pieces that invite and encourage the owner to take care of them, while having the quality to age with dignity.
Sebastian Herkner (born 1981) studied Product Design at HfG Offenbach University of Art and Design. During his studies he began to focus on designing objects and furniture, honing his keen sense for materials, colors and textures. For a short while he then worked for Stella McCartney in London. In 2006 Sebastian Herkner founded his own studio. Since that time, he has designed furniture, lamps and smaller items for manufacturers such as Ames, &Tradition, Cappellini, ClassiCon, Dedon, Ex.t, Fontana Arte, Gubi, Linteloo, Moroso, Pulpo, Rosenthal, Schramm Werkstätten, Thonet, Wittmann and Zanotta. Herkner also realizes interior architecture projects as well as museum and exhibition design. His works have won numerous accolades - among others the iF Award, the Iconic Award and the Elle Deco International Design Award (EDIDA). As Guest of Honor at imm cologne 2016, Sebastian Herkner provided the design for "Das Haus". Maison&Objet elected him "Designer of the Year" in 2019. In 2021, Sebastian Herkner won the most prestigious design award in Milan, the EDIDA award for Best Designer of the Year. In 2022, Sebastian Herkner was one of the curators at Homo Faber in Venice, alongside Robert Wilson, Judith Clarke, Michele de Lucchi and Naoto Fukasawa, commissioned by the Michelangelo Foundation.
For furniture designer Shawn Place, a bold move from central Canada to the West Coast offered more than a change of scenery - it was exactly what he needed to launch a new career.
Originally from Brampton, Ontario, Shawn Place moved to Prince George in 2005. He was cycling through Gastown on a trip to Vancouver when he came across a contemporary furniture shop. Intrigued, he walked into the shop and, with a little luck and hard work, the rest fell into place, as if by chance
"I'd been working as a bicycle designer and was looking for a change and saw some of the furniture," recalls Place, who spoke to the Georgia Straight by phone while in town. "And I thought, 'Huh, I think I'll become a furniture designer. "
Sigurd Ressell is a Norwegian designer who has strongly influenced Scandinavian design. His avant-garde spirit has left an indelible mark on the history of design.
Ressell designed numerous steel and leather armchairs, sofas, chairs and even a bar (1960). In 1971, he designed the legendary Falcon chair.
Ressell was born in 1920 in Norway. He studied at the Norwegian National Academy of Crafts, Art and Industry and graduated in 1947. Sigurd Ressell entered one of his designs in a competition, where it was noticed by Niels Vodder, a cabinetmaker who regularly collaborated with Finn Juhl, a leading figure in design. This led to the creation of a prototype based on his sketch. In 1958 Ressell received an award for his SR 600 chair at the Copenhagen Cabinetmakers' Guild Exhibitions. Ressell had a profound effect on 20th-century design.
The Danish designer Svend Aage Holm Sørensen (1913-2004) is known for his self-produced lighting designs from the 1950s, 1960s and 1970s. Despite the popularity of his designs on the vintage market, there is a lack of biographical information about the designer and his eponymous manufacturing company.
It is assumed that Holm Sørensen designed lamps for the well-known Danish lighting manufacturers Fog & Mørup and Lyfa in the 1950s, before establishing his own lighting company, Holm Sørensen A/S to produce and distribute his own designs.
Holm Sørensen's style varies considerably, with designs from the 1950s truly reflecting the mid-century modern lighting style, with clear influences from the De Stijl and Bauhaus movements. His muted floor and table lamps feature the classic tripod base that was popular at the time, referencing designs such as H. Th. J. A. Busquet's Pinocchio Lamp (1954).
From the 1960s onwards, Holm Sørensen's style changed completely. His designs range from colourful, geometric table and floor lamps to hanging lamps with unfinished brass and copper surfaces. These hanging lamps present Holm Sørensen's interpretation of the brutalist style, which was popular from the 1950s to the mid-1970s. The term was originally coined by Swedish architect Hans Asplund, and the style was adopted internationally by many iconic designers, including Le Corbusier.
Søren Juul has had a lifelong engagement with both design and architecture. He trained as a cabinetmaker, then studied furniture design at the Design School in Copenhagen and the architecture school in Aarhus, after which he was employed by the architects Friis & Moltke for more than 40 years. There he dealt primarily with furniture, lamps and fixtures for schools, colleges, hotels, offices, etc. Design, according to Søren Juul, should be simple, functional and honest, and the quality should be nothing less than perfect.
While Hallgeir Homstvedt and Jonah Takagi have been friends for years, their first collaboration as designers came in 2016 as part of an exhibition curated by the influential design blog Sight Unseen and sponsored by the Embassy of Norway. To date they have designed and developed products for a range of international brands that include Roll & Hill, Le Klint, Design Within Reach, Dims and L.K. Hjelle.
Finnish designer Sami Kallio and Danish designer Jakob Thau have joined forces to create the Betty TK1 chair for & Tradition.
Kallio grew up in Finland and received a master's degree in design from the Gothenburg School of Design and Crafts. Thau grew up in Denmark and graduated from the Royal Danish Academy of Fine Arts, School of Design with a Master's degree.
Both went on to train as cabinetmakers, moving forward to execute their own timeless and purposeful designs; reflecting traditional techniques that have been reworked in a contemporary context.
Since graduating from the Danish School of Design in 1997, Thomas E. Alken (born 1970) has been recognised as one of Denmark's leading industrial designers of recent years. For over 20 years, he has run the Copenhagen-based design company Format Design. Thanks to his great sense of functionality, Thomas E. Alken's furniture not only makes you sit comfortably, it also makes you feel at home. Like Børge Mogensen and previous FDB furniture designers, he takes the human body and movement as his starting point, so that the furniture reflects the social dimension of the human being. A chair should not only be beautiful, it should also be comfortable.
A medal-winning, fully qualified cabinetmaker trained at Aksel Kjersgaard and Risskov Møbelsnedkeri.
To work with living material can be challenging. However, for me, it is exactly what makes me love my profession. As a cabinetmaker at Duus & Møller my prominent task is to utilize the entire potential of the quality materials and to create the furniture in an innovative way, however, still, with the traditional craftsmanship in mind.
For me, joinery is not only an industry, but a lifestyle. I am inspired by the beauty and elegance I find in the Scandinavian nature, which I try to extend into our furniture by the use of creative solutions and a focus on sustainability.
Torbjørn Afdal (1917 - 1999) started as a designer at Bruksbo Tegnekontor in 1946 and was one of the most prolific Norwegian designers of the post-war period. Today, Afdal is considered one of Norway's most famous designers. His furniture is in the collections of the White House and the Emperor of Japan. Torbjørn Afdal was awarded the gold medal at the 1959 So wohnt Europa craft fair in Munich for work including the Form dining set.
Tove and Edvard Kindt-Larsen played an important role in the development of Danish furniture design through their participation in the exhibitions of the Cabinetmakers' Guild from the 1930s to the 1960s.
Tove and Edvard married in 1937 and together they established a studio to work on furniture design, jewellery, accessories, textiles and architecture. Edvard was an architect by training and made a name for himself with a large-scale hotel project by the lakes in Copenhagen. Tove studied under Kaare Klint at the furniture design department of the Royal Danish Academy of Fine Arts after years of designing furniture.
Danish modernist architect Viggo Boesen was a follower of Scandinavian functionalism (funkis style), which opts for soft, rounded forms that fit the figure. The forms must be fluid; they are often organic, but always functional and timeless.
In particular, he created a new concept of modernist and avant-garde houses that were very successful in Denmark, while also working on upholstered armchairs and rattan furniture.
Vilhelm Lauritzen (1894–1984) was considered one of the most important and leading architects of Danish Functionalism, and he designed numerous landmark buildings of that movement in Denmark.
Among them, the Radiohus building, the Nørrebro Theatre and Terminal 39 at Kastrup Airport are representative examples.
One of Denmark's most influential, recognised and productive architects, and a master at uniting materials and light, Lauritzen dedicated much of his life to refining and developing his lamps, which combine perfectly with his architectural works.
Vivianna Torun Bülow-Hübe is without a doubt the most famous Swedish silversmith of the post-war period. After studying at the Academy of Industrial Arts in Stockholm, she moved to Paris, where she quickly became successful thanks to her beautiful jewelry designs. She also collaborated with the Danish silversmith Georg Jensen, for whom she designed a number of pendants (including the Dew-Drop and Infinity collections), rings and earrings, as well as the famous Vivianna 326 bracelet watch.
Hee Welling (Denmark) and Gudmundur Ludvik (Iceland) are contributing to the renewal of environmentally friendly design that focuses on the personal experience. Welling/Ludvik's industrial design aims to achieve the best possible solution seen from a number of different perspectives. Comfort, materials, construction, price and environment are all determining factors in this approach to design, which is also linked to the contemporary state of technology and the challenges facing society. Welling/Ludvik always seek to maximise the potential of modern manufacturing in their designs.
Yngve Ekström (1913-1988) was born in Hagafors in Småland, Sweden, which happened to be where the country's oldest furniture industry was placed. He studied drawing, sculpture, painting, music and art history. He founded Swedese in 1945 and was leading the company right up until his death in 1988.
His career coincided with the best part of the post-war modern movement, and together with names including Alvar Aalto, Bruno Mathsson, Arne Jacobsen and Poul Kjaerholm he was at the core of a generation of designers who made the concept of "Scandinavian Modern" famous all over the world.
Ekström's furniture have been exhibited in Amsterdam, Vienna, Berlin, Paris, Munich and Belgrade, to name a few places, and are represented in many modern permanent collections, including the Victoria & Albert Museum, London and Nationalmuseum, Stockholm.
|
||
correct_award_00093
|
FactBench
|
1
| 79
|
https://mydailyartdisplay.uk/tag/wilhelm-jensen/
|
en
|
Wilhelm Jensen – my daily art display
|
[
"https://defaultcustomheadersdata.files.wordpress.com/2016/07/beach1.jpg?resize=1200,280",
"https://mydailyartdisplay.uk/wp-content/uploads/2013/03/andrc3a9-masson.jpg?w=840",
"https://mydailyartdisplay.uk/wp-content/uploads/2013/03/battle-of-the-fishes-by-andrc3a9-masson-1926.jpg?w=840",
"https://mydailyartdisplay.uk/wp-content/uploads/2013/03/gradiva-by-andrc3a9-msson-1939-bb.jpg?w=840",
"https://mydailyartdisplay.uk/wp-content/uploads/2013/03/bas-relief-of-gradiva.jpg?w=204&h=300",
"https://mydailyartdisplay.uk/wp-content/uploads/2013/03/the-novel-gradiva-by-wilhelm-jensen.jpg?w=201&h=300",
"https://2.gravatar.com/avatar/b648911dfada1a4c24f384d57eeaec1b66feb78b269303fad84847acbc3f194d?s=49&d=identicon&r=G",
"https://s2.wp.com/i/logo/wpcom-gray-white.png",
"https://s2.wp.com/i/logo/wpcom-gray-white.png",
"https://pixel.wp.com/b.gif?v=noscript"
] |
[] |
[] |
[
""
] | null |
[] |
2013-03-27T09:58:07+00:00
|
Posts about Wilhelm Jensen written by jonathan5485
|
en
|
https://s1.wp.com/i/favicon.ico
|
my daily art display
|
https://mydailyartdisplay.uk/tag/wilhelm-jensen/
|
After my last two blogs looking at the exquisite artistry of the American landscape painter, Frederic Church, I am going to give you something completely different today. I was going to facetiously say that I was moving from the sublime to the ridiculous but I know that labelling Surrealism as “ridiculous” is a rather facile and childish statement. Not being an artist, I would be curious to know if the upbringing of an artist and how life has treated them has any bearing on their painting style. For example, Frederic Church came from a happy and financially sound family background and lived close to a very picturesque countryside and in some ways the works he produced mirrored not just the environment around him but the peace and tranquillity of his mind. My featured artist today probably felt little of that peace and tranquillity in his life and that may account for some of the disturbing images he produced. My artist today is André Masson and the painting of his I want to look at is entitled Gradiva which he completed in 1939. It is not just about a painting but about a German novel and a renowned Austrian neurologist who was hailed as the founding father of psychoanalysis and along the way I will delve into the world of automatism in art!
André Masson was born in January 1896 in Balagny-sur-Thérain, in the northern French department of Oise, about sixty miles north of Paris. Although born in France, because of his father's business he spent most of his childhood in neighbouring Belgium. The family relocated to Lille in 1903 and then later moved to the capital, Brussels. In 1907, aged 11, he enrolled at the Académie Royale des Beaux-Arts et l'École des Arts Décoratifs in Brussels, where he received tuition from the Belgian painter and muralist Constant Montald, who would later teach the likes of René Magritte and Paul Delvaux. It was on Montald's advice that Masson decided to leave Belgium and travel to France. In 1912, Masson moved to Paris and attended the illustrious Parisian art college, the École Nationale Supérieure des Beaux-Arts, where he attended Paul Albert Baudoin's studio to study fresco painting. In 1914 he was awarded a scholarship from the École des Beaux-Arts, and this allowed him to travel to Italy along with his fellow art student Maurice Albert Loutreuil. Whilst in Italy, Masson studied the art of fresco and discovered the works of Paolo Uccello.
These were exciting times for the youth of the day. Art Nouveau, Impressionism and Symbolism were dominating the art scene and the music of Wagner and the thoughts of Nietzsche were often foremost in their minds. Like many of his fellow art students of the time, André was a person who railed against convention and authority and had many run-ins with the police. He embraced vegetarianism and would often be seen walking bare-footed along the streets. He avidly read the great works of literature and philosophy and became a follower of the German poet and philosopher, Friedrich Nietzsche.
In 1914, France entered the First World War and there was a call to arms. Many of the young eagerly put themselves forward to support their country. Some, like Masson, looked on the fight that lay ahead in terms of a grand Wagnerian battle, with little concern for their own mortality. In an interview he gave the American magazine Newsweek in 1965, André Masson said that when war was declared he volunteered because he wanted to experience "the Wagnerian aspects of battle". Many who marched off patriotically to the front line were mere "cannon fodder" and would never return home. Although Masson, an infantryman, was not killed in the war, in April 1917 he was badly injured during the Second Battle of the Aisne, when several French army battalions stormed the German lines on the Chemin des Dames ridge. (It is interesting to note that one of the German soldiers at this battle was Adolf Hitler!)
The battle was short-lived and, for the French, it ended catastrophically in a matter of a few weeks. Thousands of French troops were slaughtered, many others mutinied, and the career of the French army's Commander-in-Chief, Robert Nivelle, was destroyed. The attack in which Masson had taken part proved disastrous: he was gravely wounded and lay helpless on the battlefield all night, and it was not until the following day that stretcher bearers were able to reach him and take him to a field hospital. The wound to his chest and abdomen was of such severity that Masson remained in hospital for the next two years. Not only did he suffer horrendous physical injuries, but the battle and his witnessing the death and maiming of many of his comrades left him mentally scarred, and he had to undergo a long period of psychiatric rehabilitation to treat the devastating effect it all had on his mind. His patriotic rush to serve his country resulted in constant physical pain, nightmares and insomnia for the rest of his life, and he was advised by psychiatrists to stay away from the noise and chaos of cities.
In April 1919 Masson went to Céret, a town in the Pyrénées foothills in south-west France. Céret was, around this time, a popular meeting place for artists such as Picasso, Modigliani, André Derain and Matisse. Whilst living there Masson met Odette Cabale, who became his wife. Odette became pregnant and Masson decided to return to Paris, where his parents could assist her. In 1920 their daughter Lily was born. Masson set up a studio at 45 Rue Blomet in Paris, which soon became a local meeting place for aspiring artists as well as some influential people, such as the author Ernest Hemingway and the writer and art collector Gertrude Stein.
In 1924 the German-born art historian, art collector and art dealer Daniel-Henry Kahnweiler organised Masson's first solo exhibition at his Galerie Simon. One of the viewers at the exhibition was André Breton, who bought a work by Masson entitled The Four Elements. Breton was the founder of the Surrealist movement and later that year published his Surrealist Manifesto, Manifeste du surréalisme, in which he defined surrealism as "pure psychic automatism". Masson, who had been invited to join Breton's group of Surrealists, was influenced by the ideas Breton had put forward and began to experiment with "automatic drawing", or automatism. Automatism was a way of creating drawings in which artists suppress conscious control of the movements of the hand and, by doing so, allow the unconscious mind to take over. Breton and his Surrealists believed automatism in art was a higher form of behaviour. For them, automatism could express the creative force of what they believed was the unconscious in art. Masson's work could be categorised as a semi-abstract variety of Surrealism, characterised by experimental use of unusual materials, such as sand. His so-called sand pictures were works in which his automatic drawing would first be put on the canvas in glue. Then, before the glue had dried, he would sprinkle coloured sand over it. The canvas would then be shaken and the sand would remain only on the glue. One of his most famous and most successful "sand paintings" is Battle of the Fishes, which he completed in 1926 and which is now housed in MoMA in New York. I read a piece about this work which described it as:
"a work in which a primordial eroticism is revealed through an imagery of conflict and metamorphosis, poetically equating the submarine imagery with its physical substance…"
Is that how you see it ????????
In 1925, Masson participated in the first Surrealist exhibition at the Galerie Pierre, alongside Picasso, Ernst, Klee and Man Ray. However, André Masson fell out with Breton and his Surrealists, mainly due to Breton's authoritarian leadership of the group and his dogmatic attitude. Masson also came round to the view that automatism was becoming too much of a constraint on his art, and so in 1929 he severed ties with the group. It was that same year that Masson and his wife Odette parted company, after almost ten years of marriage.
Masson spent some time in the Provençal hills around the town of Grasse, where he met Matisse. In 1934 Masson returned to Paris, but he was alarmed by the right-wing Fascist riots which took place in the city that February. He decided to flee the turmoil that had beset the French capital and headed south to Spain and the city of Barcelona. He was accompanied by Rose Maklès, sister of the wife of his best friend, the well-known author Georges Bataille. In December 1934 André and Rose married in Tossa de Mar on the Costa Brava; in June 1935 their son Diego was born, and in September 1936 a second son, Luis, followed. Masson's decision to relocate to Spain to avoid the chaos of the riots in Paris proved an unfortunate one, as in October 1934 Barcelona was hit by a violent insurrection and Masson and his family became trapped in a friend's house at the heart of the city, which was subjected to constant shelling and sniper fire. This was just the scenario his psychiatrists had told him to avoid when he was discharged from hospital at the end of the First World War. The situation deteriorated further in 1936 with the start of the Spanish Civil War, and Masson and his family quickly headed back to France. His return to France also coincided with his return to the Surrealist fold, as he and André Breton settled their differences, and the following January Masson exhibited works at the Surrealist Exposition of Paris, held at Georges Wildenstein's Galérie Beaux-Arts.
The year 1939 was marked by the start of the Second World War, and in June 1940 the German army marched into Paris. Masson found himself in a precarious situation. His artwork had already been deemed degenerate by the Nazis; the Nazis looked upon the Surrealist movement and its artists as having close ties to the Communists; and, to top all that, Masson's wife Rose was Jewish. He realised that he and his family, in order to survive, had to flee France. From Paris they headed south to Auvergne and then on to Marseille. Here a group of Americans led by Varian Fry, a journalist, had set up a European rescue committee which helped Jews and Germans blacklisted by the Nazi authorities to escape to the USA. Varian Fry hid the refugees at the Villa Air-Bel, a chateau on the outskirts of Marseille, and then took them via Spain to neutral Portugal, or shipped them from Marseille to Martinique and from there on to the USA, which was Masson's escape route.
André Masson and his family, along with some of his artwork, landed in America in 1941, and one would think that his troubles were over, but alas the US Customs thought differently: when they examined his drawings they declared five of them to be pornographic and tore them to pieces right in front of the artist's eyes !!! For a short while he lived in New York before moving to Connecticut. In 1945, with the war over, the Masson family returned to France, where they lived for a while with his wife's sister, Simone. In 1947 they moved to the small town of Le Tholonet, which lies close to Aix-en-Provence in southern France. He continued to paint and received many lucrative commissions, including one from the French Culture Ministry to paint the ceiling of the Parisian Théâtre de l'Odéon. A series of solo and retrospective exhibitions of his work were held all over Europe and America. He visited Rome and Venice in the 1950s, and from these trips he produced a beautiful series of coloured lithographs of Italian landscapes.
Masson’s wife Rose died in August 1986 and Masson himself died in Paris in October 1987 aged 91.
The painting of André Masson I have chosen today is entitled Gradiva, which he completed in 1939. So who is Gradiva? Gradiva is Latin for "the woman who walks", and in the Vatican Museum there is a Roman bas-relief (a projecting image with a shallow overall depth) of Gradiva. This sculpture depicts a young robed woman whom we see raising the hems of her skirts so as to be able to stride forward at pace. This sculpture was the basis for the novel written by the German author Wilhelm Jensen, entitled Gradiva. He originally published his fictional tale in serialised form in the Viennese newspaper Neue Freie Presse in 1902.
It is the story of Norbert Hanold, a young archaeologist who becomes totally obsessed with a woman who does not even exist. He had visited the Vatican museum when he was struck by the beauty of a bas-relief of a young Roman woman, very light on her feet, whom he baptized "Gradiva" (she who walks). He purchased a reproduction of the sculpture, which he hung on the wall of his workroom. He becomes fixated by the image and mystery of this enigmatic young woman. One night he dreams that he is in Pompeii in AD 79, just before the eruption of Mt. Vesuvius. There he meets Gradiva. He desperately tries to warn her about the horrific events that are about to occur, but he finds himself powerless to rescue her. After waking, he is overcome by the longing to meet Gradiva. He immediately sets off for Pompeii, where he meets a young woman, very much alive, who he believes is Gradiva. In the course of the meetings that follow, he tries to rationalise his fixation on the girl by interpreting signs, such as the fact that Gradiva appears at noon, the ghost hour. Gradiva, in turn, seeks to cure him by gradually revealing her identity to him. Through this adventure, Norbert finally sees Gradiva for who she really is: his neighbour and childhood friend Zoe Bertgang ("Bertgang" is the German equivalent of "Gradiva"), who has also travelled to Pompeii. For years he had not seen her and had no desire to see her, but without realising it Norbert was still in love with her, and he had substituted his love for Zoe with his love for Gradiva, the young woman of the bas-relief. Happily, his fixation on Gradiva finally yields to reality, and Norbert is cured.
In 1906, Sigmund Freud was made aware of this story by Carl Jung, who believed Freud would be interested in the dream sequences of the story. Freud, who frequently cited his Interpretation of Dreams, published in 1900, suggested in his review of Jensen's novel that even dreams invented by an author could be analyzed by the same method as real ones. He fastidiously analyzed the two dreams which were the basis of Jensen's story and linked them to happenings in Norbert's life. By doing this, Freud attempted to demonstrate that dreams were substitute wish fulfilments and established that they constituted a return of the repressed. According to the psychoanalyst, the source of Norbert Hanold's fixation was his repression of his own sexuality, which caused him to forget his past love, Zoe Bertgang, in order to keep himself from recognizing her. This he termed "negative hallucination". Freud concluded that the way Zoe treated Norbert when they met in Pompeii was in the manner of a good psychoanalyst, cautiously bringing to consciousness what Norbert had forgotten through repression.
As an interesting footnote to the Freud story, four months after he published his essay on Gradiva and Jensen's story, he visited Rome, and during the trip he went to see the bas-relief representing "Gradiva" at the museum of the Vatican, the very same one that had inspired Jensen to write his story. Just as Norbert Hanold, the character in Jensen's story, had done, Freud bought a copy of the bas-relief of Gradiva and hung it in his office in Vienna, at the foot of his divan. There it remained until he left Vienna in 1938 and took it with him to London, where it can be found on the wall of his study, which forms part of the Freud Museum.
In today's featured painting, Gradiva (The Metamorphosis of Gradiva), Masson's iconography is a Freudian illustration drawn directly from the Jensen story. In the painting we see a large woman, half flesh, half marble, sprawled on a marble plinth, the base of which is starting to crumble. Her legs are splayed apart and between them we see a beef steak and a gaping shell-like vagina. To the right of her, on the wall in the background, we see the erupting volcano. To the left of her we see a large crack in the side wall, signifying that the building she is in is about to collapse. Another strange addition to the painting is a swarm of bee-like creatures which arc behind the figure of the woman, similar to the arc formed by the way her marble arm arches over her head. Why depict bees? The whole of the painting is bathed in a flickering reddish light which highlights a clump of poppies in the left foreground of the work. I have tried to explain some of the iconography of this painting, but I will leave you to try to figure out whether there are more hidden meanings in what you see before you.
Finally for those of you who would like to read the complete version of Wilhelm Jensen’s Gradiva then you can get a copy from Amazon.com:
http://www.amazon.com/Gradiva-Pompeiian-Fancy-Classic-Reprint/dp/B0094OOP36
or Amazon.co.uk:
http://www.amazon.co.uk/Gradiva-Pompeiian-Fancy-Classic-Reprint/dp/B0094UEIIW/ref=sr_1_1?ie=UTF8&qid=1364119670&sr=8-1
I must apologise for the length of this blog, but once I got started researching the life of the painter, the painting itself and the story of Gradiva, I was loath to cut anything out. Not being a master of the art of the précis, I don't think I would make a good journalist !!!!!
|
||||
correct_award_00093
|
FactBench
|
3
| 3
|
https://observervoice.com/20-january-remembering-johannes-vilhelm-jensen-on-birthday-33523/
|
en
|
Johannes Vilhelm Jensen: Danish Nobel Laureate in Literature and Master of Prose
|
[
"https://observervoice.com/wp-content/uploads/2023/09/cropped-OV-Logo.png",
"https://observervoice.com/wp-content/uploads/2023/09/cropped-OV-Logo.png",
"https://observervoice.com/wp-content/uploads/2020/09/Khasi_Girls-jpg-webp-150x150.webp",
"https://observervoice.com/wp-content/uploads/2020/09/Matrilineal-Societies-of-India-scaled-jpg-webp-150x150.webp",
"https://observervoice.com/wp-content/uploads/2024/06/ph202183101-150x150.png",
"https://observervoice.com/wp-content/uploads/2024/06/ph202183101-150x150.png",
"https://observervoice.com/wp-content/uploads/2024/06/ph202183101-150x150.png",
"https://observervoice.com/wp-content/uploads/2024/06/ph202183101-150x150.png",
"https://observervoice.com/wp-content/uploads/2024/06/ph202183101-150x150.png",
"https://observervoice.com/wp-content/uploads/2022/07/Pi-Approximation-Day-jpg-webp-150x150.webp",
"https://observervoice.com/wp-content/uploads/2024/02/Celebrating-Sarawak-Day-150x150.png",
"https://observervoice.com/wp-content/uploads/2023/07/National-Mango-Day-150x150.jpg",
"https://observervoice.com/wp-content/uploads/2023/03/Randal-Cremer-150x150.jpg",
"https://observervoice.com/wp-content/uploads/2024/03/Badri-Narayan-1-150x150.jpg"
] |
[] |
[] |
[
""
] | null |
[
"OV Digital Desk"
] |
2024-01-17T22:33:26+00:00
|
Johannes Vilhelm Jensen was a Danish author. In 1944, Johannes Vilhelm Jensen was awarded the Nobel Prize in Literature.
|
en
|
Observer Voice
|
https://observervoice.com/20-january-remembering-johannes-vilhelm-jensen-on-birthday-33523/
|
Johannes Vilhelm Jensen (20 January 1873 – 25 November 1950) was a Danish author. In 1944, Johannes Vilhelm Jensen was awarded the Nobel Prize in Literature.
Life and Career
Johannes Vilhelm Jensen was born on 20 January 1873, in Farsø, Denmark. He studied at various universities in Denmark, including the University of Copenhagen, where he initially pursued a medical degree but later shifted his focus to literature and philosophy. Johannes Vilhelm Jensen had a distinguished career as a writer:
He began his writing career as a poet and later ventured into various literary genres, including novels, essays, and short stories. Jensen is known for his imaginative and experimental style of writing, which incorporated elements of naturalism, symbolism, and modernism. One of his most famous works is the novel “The Fall of the King” (1900-1901), which is part of his “Kongens Fald” (The Fall of the King) trilogy. This trilogy explores the history of Denmark and its transition from paganism to Christianity.
Jensen’s literary works often reflected his interest in history, mythology, and human evolution. Johannes Vilhelm Jensen passed away on November 25, 1950, in Copenhagen, Denmark.
Award and Legacy
Johannes Vilhelm Jensen was awarded the Nobel Prize in Literature in 1944 for his contribution to the development of Scandinavian literature. He was the first Danish author to receive this prestigious award. Johannes Vilhelm Jensen’s legacy is primarily based on his literary contributions, which are characterized by their unique style and exploration of themes related to history, mythology, and human evolution.
His “Kongens Fald” trilogy remains a significant work in Danish literature, offering an imaginative and thought-provoking perspective on Denmark’s historical and cultural transformation. Jensen’s innovative approach to writing had an influence on subsequent generations of Danish and Scandinavian writers.
|
|||||
correct_award_00093
|
FactBench
|
1
| 80
|
https://unesdoc.unesco.org/ark:/48223/pf0000068085
|
en
|
[] |
[] |
[] |
[
""
] | null |
[] | null | null | ||||||||||
correct_award_00093
|
FactBench
|
3
| 21
|
https://www.facebook.com/photo.php%3Ffbid%3D350249800902425%26id%3D100077522237019%26set%3Da.183958540864886
|
en
|
Facebook
|
https://static.xx.fbcdn.net/rsrc.php/yv/r/B8BxsscfVBr.ico
|
https://static.xx.fbcdn.net/rsrc.php/yv/r/B8BxsscfVBr.ico
|
[
"https://facebook.com/security/hsts-pixel.gif?c=3.2.5"
] |
[] |
[] |
[
""
] | null |
[] | null |
See posts, photos and much more on Facebook.
|
de
|
https://static.xx.fbcdn.net/rsrc.php/yv/r/B8BxsscfVBr.ico
|
https://www.facebook.com/login/
| ||||
correct_award_00093
|
FactBench
|
0
| 74
|
https://www.nationsencyclopedia.com/Europe/Denmark-FAMOUS-DANES.html
|
en
|
Famous danes
|
[
"https://www.nationsencyclopedia.com/bs/i/nations-sm.png",
"https://a.advameg.com/matomo.php?idsite=4&rec=1"
] |
[] |
[] |
[
""
] | null |
[] | null |
en
| null |
Denmark's greatest classic writer and the founder of Danish literature is Ludvig Holberg (1684–1754), historian, philologist, philosopher, critic, and playwright, whose brilliant satiric comedies are internationally famous. Another important dramatist and poet is Adam Gottlob Oehlenschlaeger (1779–1850). The two most celebrated 19th-century Danish writers are Hans Christian Andersen (1805–75), whose fairy tales are read and loved all over the world, and the influential philosopher and religious thinker Søren Kierkegaard (1813–55). Nikolaj Frederik Severin Grundtvig (1783–1872), noted theologian and poet, was renowned for his founding of folk high schools, which brought practical education to the countryside. The leading European literary critic of his time was Georg Morris Brandes (Cohen, 1842–1927), whose Main Currents in 19th-Century European Literature exerted an influence on two generations of readers. Leading novelists include Jens Peter Jacobsen (1847–85); Martin Andersen Nexø (1869–1954), author of Pelle the Conqueror (1906–10) and Ditte (1917–21); and Johannes Vilhelm Jensen (1873–1950), who was awarded the Nobel Prize for literature in 1944 for his series of novels. Karl Adolph Gjellerup (1857–1919) and Henrik Pontoppidan (1857–1943) shared the Nobel Prize for literature in 1917. Isak Dinesen (Karen Blixen, 1885–1962) achieved renown for her volumes of gothic tales and narratives of life in Africa. Jeppe Aakjaer (1866–1930), poet and novelist, is called the Danish Robert Burns. A great film artist is Carl Dreyer (1889–1968), known for directing The Passion of Joan of Arc, Day of Wrath, and Ordet. Famous Danish musicians include the composers Niels Gade (1817–90) and Carl Nielsen (1865–1931), the tenors Lauritz Melchior (1890–1973) and Aksel Schiøtz (1906–75), and the soprano Povla Frijsh (d.1960).
Notable dancers and choreographers include August Bournonville (1805–79), originator of the Danish ballet style; Erik Bruhn (1928–86), who was known for his classical technique and was director of ballet at the Royal Swedish Opera House and of the National Ballet of Canada; and Fleming Ole Flindt (b.1936), who has directed the Royal Danish Ballet since 1965. The sculptor Bertel Thorvaldsen (1770–1844) is the artist of widest influence.
Notable scientists include the astronomers Tycho Brahe (1546–1601) and Ole Rømer (1644–1710); the philologists Rasmus Christian Rask (1787–1832) and Otto Jespersen (1860–1943); the physicist Hans Christian Ørsted (1777–1851), discoverer of electromagnetism; Nobel Prize winners for physics Niels Bohr (1885–1962) in 1922 and his son Aage Niels Bohr (b.1922) and Benjamin Mottelson (b.1926) in 1975; and Niels Ryberg Finsen (b.Faroe Islands, 1860–1904), August Krogh (1874–1949), Johannes A. G. Fibiger (1867–1928), and Henrik C. P. Dam (1895–1976), Nobel Prize-winning physicians and physiologists in 1903, 1920, 1926, and 1944, respectively. Frederik Bajer (1837–1922) was awarded the Nobel Prize for peace in 1908. Knud Johan Victor Rasmussen (1879–1933), explorer and anthropologist born in Greenland, was an authority on Eskimo ethnology.
Queen Margrethe II (b.1940) became sovereign in 1972.
|
|||||||
correct_award_00093
|
FactBench
|
0
| 8
|
https://www.infoplease.com/encyclopedia/people/arts/scandinavian-lit/jensen-johannes-vilhelm
|
en
|
Jensen, Johannes Vilhelm
|
https://www.infoplease.com/themes/ip/favicon.ico
|
https://www.infoplease.com/themes/ip/favicon.ico
|
[
"https://www.infoplease.com/themes/ip/images/logo-white.svg",
"https://www.infoplease.com/themes/ip/images/mag-glass.svg",
"https://www.infoplease.com/themes/ip/images/close.svg",
"https://www.infoplease.com/sites/infoplease.com/files/styles/scale600w/public/2020-04/europe.jpg.webp?itok=ZYMzqB-y",
"https://www.infoplease.com/sites/infoplease.com/files/styles/scale600w/public/2020-04/middleeast.jpg.webp?itok=b3PkAiRE",
"https://www.infoplease.com/sites/infoplease.com/files/styles/scale600w/public/2020-04/Triumph_of_Achilles_in_Corfu_Achilleion.jpg.webp?itok=AJitHl3b",
"https://www.infoplease.com/sites/infoplease.com/files/styles/scale600w/public/2020-04/iStock-1163662016.jpg.webp?itok=5LuI3Ibr",
"https://www.infoplease.com/modules/contrib/custom_voice_search/images/mic.png",
"https://www.infoplease.com/sites/infoplease.com/files/inline-images/infoplease_white_logo_2.png",
"https://www.infoplease.com/sites/default/files/i_infopls_com/images/ip17_social_facebook_icon.png",
"https://www.infoplease.com/sites/default/files/i_infopls_com/images/ip17_social_twitter_icon.png",
"https://www.infoplease.com/sites/default/files/i_infopls_com/images/ip17_social_pinterest_icon.png",
"https://www.playwire.com/hubfs/Powered-by-Playwire-Badges/Ads-Powered-by-playwire-2021-standalone-small-300px.png",
"https://www.infoplease.com/modules/custom/custom_core_common/images/facebook.svg",
"https://www.infoplease.com/modules/custom/custom_core_common/images/twitter.svg",
"https://www.infoplease.com/modules/custom/custom_core_common/images/googlebookmarks.svg",
"https://www.infoplease.com/modules/custom/custom_core_common/images/email.svg",
"https://www.infoplease.com/modules/custom/custom_core_common/images/print.svg",
"https://www.infoplease.com/sites/infoplease.com/files/inline-images/sandbox-learning.png",
"https://www.infoplease.com/sites/default/files/inline-images/factmonster_logo.png",
"https://www.infoplease.com/sites/infoplease.com/files/inline-images/familyeducation_logo_1.png",
"https://www.infoplease.com/sites/default/files/inline-images/teachervision_logo.png",
"https://www.infoplease.com/sites/default/files/inline-images/sandbox_logo.png"
] |
[] |
[] |
[
""
] | null |
[
"Infoplease"
] |
2017-01-24T01:12:35-05:00
|
Jensen, Johannes Vilhelm yōhänˈəs vĭlˈhĕlm yĕnˈsən [key], 1873–1950, Danish writer. As a young man he studied medicine; his interest in biology and anthropology is obvious throughout his works. Jensen created a distinctive literary form in
|
en
|
/themes/ip/favicon.ico
|
InfoPlease
|
https://www.infoplease.com/encyclopedia/people/arts/scandinavian-lit/jensen-johannes-vilhelm
|
|