Issue
I had the code below in Python 3.2 and I wanted to run it in Python 2.7. I did convert it (I have included the missing_elements code in both versions), but I am not sure if that is the most efficient way to do it. Basically, what happens when there are two yield from calls like below, one for the lower half and one for the upper half, in the missing_elements function? Are the entries from the two halves (upper and lower) appended to each other into one list, so that the parent recursive function with the yield from call can use both halves together?
def missing_elements(L, start, end):  # Python 3.2
    if end - start <= 1:
        if L[end] - L[start] > 1:
            yield from range(L[start] + 1, L[end])
        return

    index = start + (end - start) // 2

    # is the lower half consecutive?
    consecutive_low = L[index] == L[start] + (index - start)
    if not consecutive_low:
        yield from missing_elements(L, start, index)

    # is the upper part consecutive?
    consecutive_high = L[index] == L[end] - (end - index)
    if not consecutive_high:
        yield from missing_elements(L, index, end)

def main():
    L = [10, 11, 13, 14, 15, 16, 17, 18, 20]
    print(list(missing_elements(L, 0, len(L)-1)))
    L = range(10, 21)
    print(list(missing_elements(L, 0, len(L)-1)))
def missing_elements(L, start, end):  # Python 2.7
    return_list = []
    if end - start <= 1:
        if L[end] - L[start] > 1:
            return range(L[start] + 1, L[end])

    index = start + (end - start) // 2

    # is the lower half consecutive?
    consecutive_low = L[index] == L[start] + (index - start)
    if not consecutive_low:
        return_list.append(missing_elements(L, start, index))

    # is the upper part consecutive?
    consecutive_high = L[index] == L[end] - (end - index)
    if not consecutive_high:
        return_list.append(missing_elements(L, index, end))

    return return_list
Solution
If you don't use the results of your yields,* you can always turn this:
yield from foo
… into this:
for bar in foo:
    yield bar
There might be a performance cost,** but there is never a semantic difference.
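Applied to your missing_elements function, that mechanical translation would look something like this (an untested sketch that just reuses your own names):

def missing_elements(L, start, end):  # Python 2.7 translation
    if end - start <= 1:
        if L[end] - L[start] > 1:
            # was: yield from range(L[start] + 1, L[end])
            for value in range(L[start] + 1, L[end]):
                yield value
        return

    index = start + (end - start) // 2

    # is the lower half consecutive?
    consecutive_low = L[index] == L[start] + (index - start)
    if not consecutive_low:
        # was: yield from missing_elements(L, start, index)
        for value in missing_elements(L, start, index):
            yield value

    # is the upper part consecutive?
    consecutive_high = L[index] == L[end] - (end - index)
    if not consecutive_high:
        # was: yield from missing_elements(L, index, end)
        for value in missing_elements(L, index, end):
            yield value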
Are the entries from the two halves (upper and lower) appended to each other into one list, so that the parent recursive function with the yield from call can use both halves together?
No! The whole point of iterators and generators is that you don't build actual lists and append them together.
But the effect is similar: you just yield from one, then yield from another.
If you think of the upper half and the lower half as "lazy lists", then yes, you can think of this as a "lazy append" that creates a larger "lazy list". And if you call list on the result of the parent function, you will of course get an actual list that's equivalent to appending together the two lists you would have gotten if you'd done yield list(…) instead of yield from ….
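As a tiny illustration of that "lazy append" (my own toy example, with made-up generators lower and upper rather than your real ones):

def lower():
    yield 12

def upper():
    yield 19

def parent():
    # "lazy append": exhaust lower first, then upper
    yield from lower()
    yield from upper()

print(list(parent()))                  # [12, 19]
print(list(lower()) + list(upper()))   # [12, 19], the same list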
But I think it's easier to think of it the other way around: what it does is exactly what the for loops do.
If you saved the two iterators into variables and looped over itertools.chain(upper, lower), that would be the same as looping over the first and then looping over the second, right? No difference here. In fact, you could implement chain as just:
def chain(*args):
    for arg in args:
        yield from arg
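For instance (a minimal sketch, not part of the original post, using throwaway iterators):

import itertools

upper = iter([13, 14])
lower = iter([19])

# Looping over the chain visits everything from the first iterator,
# then everything from the second, just like two loops in a row.
print(list(itertools.chain(upper, lower)))   # [13, 14, 19]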
* Not the values the generator yields to its caller, but the values of the yield expressions themselves within the generator (which come from the caller using the send method), as described in PEP 342. You're not using these in your examples, and I'm willing to bet you're not in your real code. But coroutine-style code often does use the value of a yield from expression (see PEP 3156 for examples). Such code usually depends on other features of Python 3.3 generators, in particular the new StopIteration.value from the same PEP 380 that introduced yield from, so it will have to be rewritten. But if not, the PEP also shows you the complete, horrid, messy equivalent, and you can of course pare down the parts you don't care about. And if you don't use the value of the expression, it pares down to the two lines above.
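To make that concrete, here is a minimal sketch (my own example, not from the question's code) of a generator whose yield expressions do use values supplied by the caller via send; a plain for-loop translation would not forward those send calls, whereas yield from does:

def accumulator():
    # Each yield expression evaluates to whatever the caller passes to send().
    total = 0
    while True:
        value = yield total
        if value is None:
            break
        total += value

gen = accumulator()
next(gen)            # advance to the first yield
print(gen.send(10))  # 10
print(gen.send(5))   # 15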
** Not a huge one, and there's nothing you can do about it short of using Python 3.3 or completely restructuring your code. It's exactly the same case as translating list comprehensions to Python 1.5 loops, or any other case when there's a new optimization in version X.Y and you need to use an older version.
Answered By - abarnert