For example, the following:
```
>>> res = [x + y for x in [0, 1, 2] for y in [100, 200, 300]]
>>> res
[100, 200, 300, 101, 201, 301, 102, 202, 302]
```
has the same effect as this substantially more verbose equivalent:
```
>>> res = []
>>> for x in [0, 1, 2]:
... |
for y in [100, 200, 300]:
... |
res.append(x + y)
...
>>> res
[100, 200, 300, 101, 201, 301, 102, 202, 302]
```
Although list comprehensions construct lists, remember that they can iterate over any
sequence or other iterable type.
Here’s a similar bit of code that traverses strings instead
of lists of numbers, and so collects concatenation results:
```
>>> [x + y for x in 'spam' for y in 'SPAM']
['sS', 'sP', 'sA', 'sM', 'pS', 'pP', 'pA', 'pM',
'aS', 'aP', 'aA', 'aM', 'mS', 'mP', 'mA', 'mM']
```
Finally, a much more complex list comprehension can attach if selections to its nested for clauses.
The if clauses filter out items in each sequence iteration. Here is the equivalent statement-based code:
```
>>> res = []
>>> for x in range(5):
...     if x % 2 == 0:
...         for y in range(5):
...             if y % 2 == 1:
...                 res.append((x, y))
...
>>> res
[(0, 1), (0, 3), (2, 1), (2, 3), (4, 1), (4, 3)]
```
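The statement nest above collapses into a single comprehension with an if filter attached to each for clause:

```python
# Even x values paired with odd y values, filters attached to each loop
res = [(x, y) for x in range(5) if x % 2 == 0
              for y in range(5) if y % 2 == 1]
print(res)  # [(0, 1), (0, 3), (2, 1), (2, 3), (4, 1), (4, 3)]
```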
Recall that if you’re confused about what a complex list comprehension does, you can
always nest the list comprehension’s for and if clauses inside each other (indenting
successively further to the right) to derive the equivalent statement-based version.
The result is longer, but perhaps clearer.
The map and filter equivalent would be wildly complex and deeply nested, so I won’t
even try showing it here.
I’ll leave its coding as an exercise for Zen masters, ex-Lisp
programmers, and the criminally insane....
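For the curious, here is one possible answer to that exercise: a sketch of the same even/odd pairing done with map, filter, and reduce alone. This is my reconstruction, not the book's solution:

```python
from functools import reduce

# Pair even x with odd y using only map/filter/reduce; reduce flattens
# the per-x sublists into one result list
pairs = reduce(lambda a, b: a + b,
               map(lambda x: list(map(lambda y: (x, y),
                                      filter(lambda y: y % 2 == 1, range(5)))),
                   filter(lambda x: x % 2 == 0, range(5))))
print(pairs)  # [(0, 1), (0, 3), (2, 1), (2, 3), (4, 1), (4, 3)]
```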
###### List Comprehensions and Matrixes
Not all list comprehensions are so artificial, of course.
Let’s look at one more application to stretch a few synapses. One basic way to code matrixes (a.k.a. multidimensional
arrays) in Python is with nested list structures.
The following, for example, defines two
3 × 3 matrixes as lists of nested lists:
```
>>> M = [[1, 2, 3],
...      [4, 5, 6],
...      [7, 8, 9]]
>>> N = [[2, 2, 2],
...      [3, 3, 3],
...      [4, 4, 4]]
```
Given this structure, we can always index rows, and columns within rows, using normal
index operations:
```
>>> M[1]
[4, 5, 6]
>>> M[1][2]
6
```
List comprehensions are powerful tools for processing such structures, though, because
they automatically scan rows and columns for us.
For instance, although this structure
stores the matrix by rows, to collect the second column we can simply iterate across
the rows and pull out the desired column, or iterate through positions in the rows and
index as we go:
```
>>> [row[1] for row in M]
[2, 5, 8]
>>> [M[row][1] for row in (0, 1, 2)]
[2, 5, 8]
```
The same approach can be used to pull out a matrix's diagonal.
The
following expression uses range to generate the list of offsets and then indexes with the
row and column the same, picking out M[0][0], then M[1][1], and so on (we assume
the matrix has the same number of rows and columns):
```
>>> [M[i][i] for i in range(len(M))]
[1, 5, 9]
```
Finally, with a bit of creativity, we can also use list comprehensions to combine the values of multiple matrixes.
The following first builds a flat list that contains the result of multiplying the
matrixes pairwise, and then builds a nested list structure having the same values by
nesting list comprehensions:
```
>>> [M[row][col] * N[row][col] for row in range(3) for col in range(3)]
[2, 4, 6, 12, 15, 18, 28, 32, 36]
>>> [[M[row][col] * N[row][col] for col in range(3)] for row in range(3)]
[[2, 4, 6], [12, 15, 18], [28, 32, 36]]
```
It's equivalent to this statement-based code:
```
>>> res = []
>>> for row in range(3):
...     tmp = []
...     for col in range(3):
...         tmp.append(M[row][col] * N[row][col])
...     res.append(tmp)
...
>>> res
[[2, 4, 6], [12, 15, 18], [28, 32, 36]]
```
Compared to these statements, the list comprehension version requires only one line
of code, will probably run substantially faster for large matrixes, and just might make
your head explode!
Which brings us to the next section.
###### Comprehending List Comprehensions
With such generality, list comprehensions can quickly become, well, incomprehensible, especially when nested.
Consequently, my advice is typically to use simple `for`
loops when getting started with Python, and map or comprehensions in isolated cases
where they are easy to apply.
The “keep it simple” rule applies here, as always: code
conciseness is a much less important goal than code readability.
However, in this case, there is currently a substantial performance advantage to the extra complexity: based on tests run under Python today, map calls are roughly twice as fast as equivalent for loops, and list comprehensions are often faster than map calls. This speed difference owes largely to the fact that map and list comprehensions run at C language speed inside the interpreter, rather than stepping through Python for loop bytecode.

Recent Python releases have sped up the simple for loop statement, for example. Usually, though, list comprehensions are still substantially faster than for loops and even faster than map (though map can still win for built-in functions). To time these alternatives yourself, see the standard library time module's time.clock and time.time calls, the newer timeit module added in Release 2.4, or this chapter's upcoming section "Timing Iteration Alternatives" on page 509.
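To make the timing advice concrete, here is a minimal timeit sketch. The statement strings and repeat counts are illustrative assumptions, and absolute numbers vary by Python version and machine:

```python
import timeit

# Compare a for loop, a list comprehension, and map on the same task
stmts = {
    'for loop':      'res = []\nfor x in range(1000): res.append(x * 2)',
    'comprehension': 'res = [x * 2 for x in range(1000)]',
    'map+lambda':    'res = list(map(lambda x: x * 2, range(1000)))',
}
for label, stmt in stmts.items():
    print('%-13s %.4f' % (label, timeit.timeit(stmt, number=1000)))
```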
However, map and list comprehensions are worth knowing and
using for simpler kinds of iterations, and if your application’s speed is an important
consideration.
In addition, because map and list comprehensions are both expressions,
they can show up syntactically in places that for loop statements cannot, such as in the
bodies of lambda functions, within list and dictionary literals, and more.
Still, you should
try to keep your map calls and list comprehensions simple; for more complex tasks, use
full statements instead.
###### Why You Will Care: List Comprehensions and map
Here’s a more realistic example of list comprehensions and map in action (we solved this problem with list comprehensions in Chapter 14). Recall that the file readlines method returns lines with \n end-of-line characters at the ends:
```
>>> open('myfile').readlines()
['aaa\n', 'bbb\n', 'ccc\n']
```
If you don’t want the end-of-line characters, you can slice them off all the lines in a
single step with a list comprehension or a map call.
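The session is truncated here; a self-contained sketch of both approaches, with the file name and contents assumed from the text above:

```python
# Write the sample file first so the snippet runs on its own
with open('myfile', 'w') as f:
    f.write('aaa\nbbb\nccc\n')

# List comprehension: strip the \n from every line in one step
lines1 = [line.rstrip() for line in open('myfile').readlines()]

# map equivalent: no manual result-list construction either
lines2 = list(map(lambda line: line.rstrip(), open('myfile').readlines()))

print(lines1)  # ['aaa', 'bbb', 'ccc']
print(lines2)  # ['aaa', 'bbb', 'ccc']
```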
The map call
is slightly longer than the list comprehension, but neither has to manage result list
construction explicitly.
A list comprehension can also be used as a sort of column projection operation.
Python’s standard SQL database API returns query results as a list of tuples much like the
following—the list is the table, tuples are rows, and items in tuples are column values:
```
listoftuple = [('bob', 35, 'mgr'), ('mel', 40, 'dev')]
```
A for loop could pick up all the values from a selected column manually, but a map or list comprehension can do it in a single step, and faster.
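A sketch of both the manual loop and the comprehension-based projection, using the listoftuple table above:

```python
listoftuple = [('bob', 35, 'mgr'), ('mel', 40, 'dev')]

# Manual for loop: collect one column by appending
ages = []
for (name, age, job) in listoftuple:
    ages.append(age)

# List comprehension: project the age column in a single step
ages2 = [age for (name, age, job) in listoftuple]

print(ages)   # [35, 40]
print(ages2)  # [35, 40]
```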
###### Iterators Revisited: Generators

Python today provides tools that produce results only when needed, instead of all at once. In particular, two
language constructs delay result creation whenever possible:
- Generator functions are coded as normal def statements but use yield statements
to return results one at a time, suspending and resuming their state between each.
- Generator expressions are similar to the list comprehensions of the prior section, but they return an object that produces results on demand instead of building a result list.
As we’ll see, both of these ultimately
perform their delayed-results magic by implementing the iteration protocol we studied
in Chapter 14.
###### Generator Functions: yield Versus return
In this part of the book, we’ve learned about coding normal functions that receive input
parameters and send back a single result immediately.
It is also possible, however, to
write functions that may send back a value and later be resumed, picking up where they
left off.
Such functions are known as generator functions because they generate a sequence of values over time.
Generator functions are like normal functions in most respects, and in fact are coded
with normal def statements.
However, when created, they are automatically made to
implement the iteration protocol so that they can appear in iteration contexts.
We
studied iterators in Chapter 14; here, we’ll revisit them to see how they relate to
generators.
###### State suspension
Unlike normal functions that return a value and exit, generator functions automatically
suspend and resume their execution and state around the point of value generation.
Because of that, they are often a useful alternative to both computing an entire series of values up front and manually saving and restoring state in classes.
Because the state
that generator functions retain when they are suspended includes their entire local
scope, their local variables retain information and make it available when the functions
are resumed.
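A small sketch of this state retention; the countdown function here is my example, not the book's:

```python
def countdown(n):
    vals = list(range(n, 0, -1))   # Local state, kept while suspended
    while vals:
        yield vals.pop(0)          # Suspend here; resume on next call

g = countdown(3)
print(next(g))  # 3
print(next(g))  # 2 -- vals survived between the two calls
```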
The chief code difference between generator and normal functions is that a generator
yields a value, rather than returning one: the yield statement suspends the function and sends a value back to the caller, but retains enough state to enable the function to resume from where it left off. When resumed, the function continues execution immediately after the last yield run.
From the function’s perspective, this allows its code
to produce a series of values over time, rather than computing them all at once and
sending them back in something like a list.
###### Iteration protocol integration
To truly understand generator functions, you need to know that they are closely bound
up with the iteration protocol in Python.
As we’ve seen, iterator objects define a __next__ method, which either returns the next item in the iteration, or raises the special StopIteration exception to end the iteration.
An object’s iterator is fetched
with the iter built-in function.
Python for loops, and all other iteration contexts, use this iteration protocol to step
through a sequence or value generator, if the protocol is supported; if not, iteration
falls back on repeatedly indexing sequences instead.
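The protocol can be exercised by hand to see both halves, fetching an iterator with iter and advancing it with next:

```python
L = [1, 2, 3]
I = iter(L)        # Fetch the list's iterator
print(next(I))     # 1
print(next(I))     # 2
print(next(I))     # 3
try:
    next(I)        # Exhausted: __next__ raises StopIteration
except StopIteration:
    print('done')
```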
To support this protocol, functions containing a yield statement are compiled specially as generators.
When called, they return a generator object that supports the iteration
interface with an automatically created method named __next__ to resume execution.
Generator functions may also have a return statement that, along with falling off the
end of the def block, simply terminates the generation of values; technically, a StopIteration exception is raised after the last value has been produced. From the caller's
perspective, the generator’s __next__ method resumes the function and runs until either
the next yield result is returned or a StopIteration is raised.
The net effect is that generator functions, coded as def statements containing yield
statements, are automatically made to support the iteration protocol and thus may be
used in any iteration context to produce results over time and on demand.

In Python 2.6, iterator objects define a method named next rather than __next__. This includes the generator
objects we are using here; in 3.0 this method is renamed to __next__.
The next built-in function is provided as a convenience and portability
tool: `next(I)` is the same as `I.__next__()` in 3.0 and `I.next()` in 2.6.
Prior to 2.6, programs simply call I.next() instead to iterate manually.
###### Generator functions in action
To illustrate generator basics, let's turn to some code.
The following code defines a
generator function that can be used to generate the squares of a series of numbers over
time:
```
>>> def gensquares(N):
...     for i in range(N):
...         yield i ** 2          # Resume here later
...
```
This function yields a value, and so returns to its caller, each time through the loop;
when it is resumed, its prior state is restored and control picks up again immediately
after the yield statement.
For example, when it’s used in the body of a for loop, control
returns to the function after its yield statement each time through the loop:
```
>>> for i in gensquares(5):      # Resume the function
...     print(i, end=' : ')      # Print last yielded value
...
0 : 1 : 4 : 9 : 16 :
>>>
```
To end the generation of values, functions either use a return statement with no value
or simply allow control to fall off the end of the function body.
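Both endings can be sketched side by side; the function names here are mine:

```python
def upto_return(n):
    for i in range(n):
        if i == 3:
            return        # Bare return: ends the generation early
        yield i

def upto_falloff(n):
    for i in range(n):
        yield i           # Generation ends when control falls off the end

print(list(upto_return(10)))  # [0, 1, 2]
print(list(upto_falloff(3)))  # [0, 1, 2]
```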
If you want to see what is going on inside the for, call the generator function directly, store the result, and step through it manually. For convenience, the next(X) built-in calls an object's X.__next__() method for us:
```
>>> x = gensquares(4)
>>> next(x)                      # Same as x.__next__() in 3.0
0
>>> next(x)                      # Use x.next() or next() in 2.6
1
>>> next(x)
4
>>> next(x)
9
>>> next(x)
Traceback (most recent call last):
  ...
StopIteration
```
Recall that for loops (and other iteration contexts) work with generators by calling the __next__ method repeatedly, until an exception is caught. If the object to be iterated over does not support this protocol, for loops
instead use the indexing protocol to iterate.
Note that in this example, we could also simply build the list of yielded values all at
once:
```
>>> def buildsquares(n):
...     res = []
...     for i in range(n): res.append(i ** 2)
...     return res
...
>>> for x in buildsquares(5): print(x, end=' : ')
...
0 : 1 : 4 : 9 : 16 :
```
For that matter, we could use any of the for loop, map, or list comprehension techniques:
```
>>> for x in [n ** 2 for n in range(5)]:
...     print(x, end=' : ')
...
0 : 1 : 4 : 9 : 16 :
>>> for x in map((lambda n: n ** 2), range(5)):
...     print(x, end=' : ')
...
0 : 1 : 4 : 9 : 16 :
```
However, generators can be better in terms of both memory use and performance.
They
allow functions to avoid doing all the work up front, which is especially useful when
the result lists are large or when it takes a lot of computation to produce each value.
Generators distribute the time required to produce the series of values among loop
iterations.
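A rough illustration of the memory point, comparing the size of a completed list with the size of a generator object that produces the same values (exact byte counts vary by version):

```python
import sys

squares_list = [x ** 2 for x in range(100000)]   # All results in memory
squares_gen  = (x ** 2 for x in range(100000))   # Just iteration state

print(sys.getsizeof(squares_list) > sys.getsizeof(squares_gen))  # True
print(sum(squares_gen) == sum(squares_list))                     # True
```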
Moreover, for more advanced uses, generators provide a simpler alternative to manually saving the state between iterations in class objects; with generators, variables accessible in the function's scopes are saved and restored automatically.

Generators can also serve as a simpler alternative to multithreading in some cases, interleaving a producer function's work with that of its caller. In one sense, threading is more general (producers can run truly independently and post
results to a queue), but generators may be simpler to code. See the second footnote in Chapter 17 for a brief
introduction to Python multithreading tools. Note that because control is routed explicitly at yield and
next calls, generators are also not backtracking, but are more strongly related to coroutines, formal concepts
that are both beyond this chapter's scope.
###### Extended generator function protocol: send versus next
In Python 2.5, a send method was added to the generator function protocol.
The send
method advances to the next item in the series of results, just like __next__, but also
provides a way for the caller to communicate with the generator, to affect its operation.
Technically, yield is now an expression form that returns the item passed to send, not
a statement (though it can be called either way, as yield X or A = (yield X)).
The
expression must be enclosed in parentheses unless it’s the only item on the right side
of the assignment statement.
For example, X = yield Y is OK, as is X = (yield Y) + 42.
When this extra protocol is used, values are sent into a generator G by calling
G.send(value). The generator's code is then resumed, and the yield expression in the
generator returns the value passed to send. If the regular G.__next__() method (or its
next(G) equivalent) is called to advance, the yield simply returns None. For example:
```
>>> def gen():
...     for i in range(10):
...         X = yield i
...         print(X)
...
>>> G = gen()
>>> next(G)            # Must call next() first, to start generator
0
>>> G.send(77)         # Advance, and send value to yield expression
77
1
>>> G.send(88)
88
2
>>> next(G)            # next() and X.__next__() send None
None
3
```
The send method can be used, for example, to code a generator that its caller can terminate by sending a termination code, or redirect by passing a new position.
In addition, generators in 2.5 also support a throw(type) method to raise an exception inside
the generator at the latest yield, and a close method that raises a special GeneratorExit
exception inside the generator to terminate the iteration. These are advanced features
that we won't delve into in more detail here; see reference texts and Python's
standard manuals for more information.
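A brief sketch of close in action (the talker function is my example, not the book's): calling close raises GeneratorExit at the paused yield, which the generator may intercept to clean up:

```python
def talker():
    try:
        while True:
            yield 'spam'
    except GeneratorExit:
        print('shutting down')   # Runs when close() is called

g = talker()
print(next(g))   # spam
g.close()        # Triggers GeneratorExit inside talker
```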
Note that while Python 3.0 provides a next(X) convenience built-in that calls the
X.__next__() method of an object, other generator methods, like send, must be called as methods of generator objects directly (e.g., G.send(X)). This makes sense if you realize that these extra methods are implemented on built-in generator objects only,
whereas the __next__ method applies to all iterable objects (both built-in types and
user-defined classes).
###### Generator Expressions: Iterators Meet Comprehensions

In all recent versions of Python, the notions of iterators and list comprehensions are combined in a new tool of the language: generator expressions. Syntactically, generator expressions are just like normal list comprehensions, but they are enclosed in
parentheses instead of square brackets:
```
>>> [x ** 2 for x in range(4)]       # List comprehension: build a list
[0, 1, 4, 9]
>>> (x ** 2 for x in range(4))       # Generator expression: make an iterable
<generator object <genexpr> at 0x...>
```
Rather than building a result list in memory, a generator expression yields one result at a time when used in an iteration context such as a for loop:
```
>>> for num in (x ** 2 for x in range(4)):
...     print('%s, %s' % (num, num / 2.0))
...
0, 0.0
1, 0.5
4, 2.0
9, 4.5
```
As we've already learned, every iteration context does this, including the sum, map, and
sorted built-in functions; list comprehensions; and other iteration contexts we learned
about in Chapter 14, such as the any, all, and list built-in functions. Notice that parentheses are not required around a generator expression that is the sole item already enclosed in other parentheses, like those of a function call.
Extra parentheses are required, however, in the second call to sorted:
```
>>> sum(x ** 2 for x in range(4))
14
>>> sorted(x ** 2 for x in range(4))
[0, 1, 4, 9]
>>> sorted((x ** 2 for x in range(4)), reverse=True)
[9, 4, 1, 0]
>>> import math
>>> list( map(math.sqrt, (x ** 2 for x in range(4))) )
[0.0, 1.0, 2.0, 3.0]
```
Generator expressions are primarily a memory-space optimization: they do not require the entire result list to be constructed all at once, as the square-bracketed list comprehension does. They may also run slightly slower in practice, so they are probably
best used only for very large result sets.
A more authoritative statement about performance, though, will have to await the timing script we’ll code later in this chapter.
###### Generator Functions Versus Generator Expressions
Interestingly, the same iteration can often be coded with either a generator function or
a generator expression.
The following generator expression, for example, repeats each
character in a string four times:
```
>>> G = (c * 4 for c in 'SPAM')      # Generator expression
>>> list(G)                          # Force generator to produce all results
['SSSS', 'PPPP', 'AAAA', 'MMMM']
```
The equivalent generator function requires slightly more code, but as a multistatement function it will be able to code more logic and use more state information if needed:
```
>>> def timesfour(S):
...     for c in S:
...         yield c * 4
...
>>> G = timesfour('spam')
>>> list(G)                          # Iterate automatically
['ssss', 'pppp', 'aaaa', 'mmmm']
```
Both expressions and functions support both automatic and manual iteration—the
prior list call iterates automatically, and the following iterate manually:
```
>>> G = (c * 4 for c in 'SPAM')
>>> I = iter(G)                      # Iterate manually
>>> next(I)
'SSSS'
>>> next(I)
'PPPP'
```
For example,
using the prior section’s generator expression, a generator’s iterator is the generator
itself (in fact, calling iter on a generator is a no-op):
```
>>> G = (c * 4 for c in 'SPAM')
>>> iter(G) is G                     # My iterator is myself: G has __next__
True
```
If you iterate over the results stream manually with multiple iterators, they will all point to the same position:
```
>>> def timesfour(S):
...     for c in S:
...         yield c * 4
...
>>> G = timesfour('spam')            # Generator functions work the same way
>>> iter(G) is G
True
>>> I1, I2 = iter(G), iter(G)
>>> next(I1)
'ssss'
>>> next(I1)
'pppp'
>>> next(I2)                         # I2 at same position as I1
'aaaa'
```
This is different from the behavior of some built-in types, which support multiple iterators and passes.
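For contrast, a quick sketch with a built-in list, whose iterators keep independent positions:

```python
L = [1, 2, 3]
I1, I2 = iter(L), iter(L)    # Two independent iterators over one list
print(next(I1))  # 1
print(next(I1))  # 2
print(next(I2))  # 1 -- I2 is still at the front, unlike a generator
```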
###### Emulating zip and map with Iteration Tools

Once you know about list comprehensions, generators, and other iteration tools, it turns out that emulating many of Python's functional built-ins is both
straightforward and instructive.
For example, we've already seen how the built-in zip and map functions combine iterables and project functions across them, respectively. With multiple sequence arguments, map projects the function across items taken from each sequence in much the
same way that zip pairs them up:
```
>>> S1 = 'abc'
>>> S2 = 'xyz123'
>>> list(zip(S1, S2))                # zip pairs items from iterables
[('a', 'x'), ('b', 'y'), ('c', 'z')]
```
In the preceding chapter, for example, we saw a function
that emulated the map built-in for a single sequence argument.
It doesn’t take much
more work to allow for multiple sequences, as the built-in does:
_# map(func, seqs...) workalike with zip_
```
def mymap(func, *seqs):
    res = []
    for args in zip(*seqs):
        res.append(func(*args))
    return res

print(mymap(abs, [-2, -1, 0, 1, 2]))        # [2, 1, 0, 1, 2]
print(mymap(pow, [1, 2, 3], [2, 3, 4, 5]))  # [1, 8, 81]
```
We can code our map more concisely as
an equivalent one-line list comprehension:
_# Using a list comprehension_
```
def mymap(func, *seqs):
    return [func(*args) for args in zip(*seqs)]

print(mymap(abs, [-2, -1, 0, 1, 2]))
print(mymap(pow, [1, 2, 3], [2, 3, 4, 5]))
```
When this is run, the results are the same as before. Both of the preceding mymap versions build result lists all at once,
though, and this can waste memory for larger lists.
Now that we know about generator
_functions and expressions, it’s simple to recode both these alternatives to produce results_
on demand instead:
_# Using generators: yield and (...)_
```
def mymap(func, *seqs):
    for args in zip(*seqs):
        yield func(*args)

def mymap(func, *seqs):
    return (func(*args) for args in zip(*seqs))
```
They produce the same results if we wrap
them in list calls to force them to produce their values all at once:
```
print(list(mymap(abs, [-2, -1, 0, 1, 2])))
print(list(mymap(pow, [1, 2, 3], [2, 3, 4, 5])))
```
No work is really done here until the list calls force the generators to run, by activating
the iteration protocol.
The generators returned by these functions themselves, as well
as that returned by the Python 3.0 flavor of the zip built-in they use, produce results
only on demand.
###### Coding your own zip(...) and map(None, ...)
Of course, much of the magic in the examples shown so far lies in their use of the zip
built-in to pair items from the arguments in parallel.
You’ll also note that our `map`
workalikes are really emulating the behavior of the Python 3.0 map—they truncate at
the length of the shortest sequence, and they do not support the notion of padding
results when lengths differ, as map does in Python 2.X with a None argument:
```
C:\misc> c:\python26\python
>>> map(None, 'abc', 'xyz123')
[('a', 'x'), ('b', 'y'), ('c', 'z'), (None, '1'), (None, '2'), (None, '3')]
```
Using iteration tools, we can code workalikes that emulate both the truncating zip and 2.6's padding map:
```
# zip(seqs...) and 2.6 map(None, seqs...) workalikes

def myzip(*seqs):
    seqs = [list(S) for S in seqs]
    res = []
    while all(seqs):
        res.append(tuple(S.pop(0) for S in seqs))
    return res

def mymapPad(*seqs, pad=None):
    seqs = [list(S) for S in seqs]
    res = []
    while any(seqs):
        res.append(tuple((S.pop(0) if S else pad) for S in seqs))
    return res

S1, S2 = 'abc', 'xyz123'
print(myzip(S1, S2))
print(mymapPad(S1, S2))
print(mymapPad(S1, S2, pad=99))
```
Notice the use of the all and any built-ins here: these return True if all, and any, items in an iterable are True (or
equivalently, nonempty), respectively. These built-ins are used to stop looping when
any or all of the listified arguments become empty after deletions.

Also note the use of the Python 3.0 keyword-only argument, pad; unlike the 2.6 map,
our version will allow any pad object to be specified (if you're using 2.6, use a
`**kargs` form to support this option instead). When these functions are run, the following results are printed, a zip and two padding maps:
```
[('a', 'x'), ('b', 'y'), ('c', 'z')]
[('a', 'x'), ('b', 'y'), ('c', 'z'), (None, '1'), (None, '2'), (None, '3')]
[('a', 'x'), ('b', 'y'), ('c', 'z'), (99, '1'), (99, '2'), (99, '3')]
```
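The all/any stopping logic above can be seen in isolation; empty lists are false, so exhausted argument copies end the loops:

```python
print(all([[1, 2], [3]]))  # True:  every listified argument is nonempty
print(all([[1, 2], []]))   # False: one argument exhausted, myzip stops
print(any([[1, 2], []]))   # True:  items remain, mymapPad keeps padding
print(any([[], []]))       # False: all exhausted, mymapPad stops
```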
These functions aren't as general as the built-ins they emulate, but they demonstrate the coding techniques involved.
As before, though, while our zip and map workalikes currently build
and return result lists, it’s just as easy to turn them into generators with yield so that
they each return one piece of their result set at a time.
The results are the same as before,
but we need to use list again to force the generators to yield their values for display:
_# Using generators: yield_
```
def myzip(*seqs):
    seqs = [list(S) for S in seqs]
    while all(seqs):
        yield tuple(S.pop(0) for S in seqs)

def mymapPad(*seqs, pad=None):
    seqs = [list(S) for S in seqs]
    while any(seqs):
        yield tuple((S.pop(0) if S else pad) for S in seqs)

print(list(myzip(S1, S2)))
print(list(mymapPad(S1, S2)))
print(list(mymapPad(S1, S2, pad=99)))
```
The results are the same as before. Alternatively, we can base these workalikes on the lengths of the argument sequences, rather than deleting items from list copies. Armed with these
lengths, it’s easy to code nested list comprehensions to step through argument index
ranges:
_# Alternate implementation with lengths_
```
def myzip(*seqs):
    minlen = min(len(S) for S in seqs)
    return [tuple(S[i] for S in seqs) for i in range(minlen)]

def mymapPad(*seqs, pad=None):
    maxlen = max(len(S) for S in seqs)
    return [tuple((S[i] if len(S) > i else pad) for S in seqs)
            for i in range(maxlen)]

print(myzip(S1, S2))
print(mymapPad(S1, S2))
print(mymapPad(S1, S2, pad=99))
```
The outer comprehensions here step through argument
index ranges, and the inner comprehensions (passed to tuple) step through the passed-in sequences to pull out arguments in parallel.
When they’re run, the results are as
before.
Most strikingly, generators and iterators seem to run rampant in this example.
The
arguments passed to min and max are generator expressions, which run to completion
before the nested comprehensions begin iterating.
Moreover, the nested list comprehensions employ two levels of delayed evaluation—the Python 3.0 range built-in is an
iterable, as is the generator expression argument to tuple.
In fact, no results are produced here until the square brackets of the list comprehensions
request values to place in the result list; they force the generators and iterators to run.
To turn these functions themselves into generators instead of list builders, use
parentheses instead of square brackets again.
Here’s the case for our zip:
_# Using generators: (...)_
```
def myzip(*seqs):
    minlen = min(len(S) for S in seqs)
    return (tuple(S[i] for S in seqs) for i in range(minlen))

print(list(myzip(S1, S2)))
```
In this case, it takes a list call to activate the generators and iterators to produce their results.