
Scrapy calls parse(), parse() calls func() using yield


May 30, 2021 Article blog


When Scrapy calls the parse() method, and parse() in turn calls another function func() that needs to return Items, use the following pattern:

# When calling another generator function from parse(), you must
# iterate over it in a loop and yield each item.

def parse(a, b):
    # Correct call
    for item in parse_comment(a, b):
        yield item

    # Wrong call: the data cannot be retrieved this way
    parse_comment(a, b)

def parse_comment(a, b):
    for c in b:
        yield c
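Since Python 3.3, the explicit for/yield loop above can be written with `yield from`, which delegates to the sub-generator. This is a minimal sketch; the names `parse` and `parse_comment` here are simplified stand-ins, not Scrapy's real spider signatures.

```python
def parse_comment(items):
    # A sub-generator that yields each element in turn.
    for item in items:
        yield item

def parse(items):
    # Equivalent to: for item in parse_comment(items): yield item
    yield from parse_comment(items)

print(list(parse([1, 2, 3])))  # [1, 2, 3]
```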

The effect of yield is to turn a function into a generator. A function containing yield is no longer an ordinary function: the Python interpreter treats it as a generator, so a single call (such as fab(5)) does not execute the body of fab, but returns an iterable generator object.

When the for loop runs, each iteration executes the code inside fab. When execution reaches yield b, fab returns b as the iteration value; on the next iteration, execution resumes from the statement after yield b, with all local variables exactly as they were before the interruption, and the function keeps running until it encounters yield again.
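This pause-and-resume behavior can be seen by stepping a generator manually with next(), which runs the body until the following yield and then freezes the local state. The counter() function below is a small illustrative example, not taken from the article:

```python
def counter():
    n = 0
    while n < 3:
        yield n  # execution pauses here and n is handed to the caller
        n += 1   # execution resumes here on the next next() call

gen = counter()
print(next(gen))  # 0
print(next(gen))  # 1  (n was preserved between calls)
print(next(gen))  # 2
```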

Conclusion: place yield inside a loop so the generator produces a sequence of values.

def fab(max):
    n, a, b = 0, 0, 1
    while n < max:
        yield b
        a, b = b, a + b
        n = n + 1

print(fab(5))  # Output: <generator object fab at 0x00000000069D8A68>

for n in fab(5):
    print(n)  # prints 1, 1, 2, 3, 5 in turn

# A function containing yield is consumed from outside by iteration; when the
# function body finishes, the generator automatically raises a StopIteration
# exception to signal that iteration is complete.
# Inside a for loop there is no need to handle StopIteration; the loop simply
# ends normally.

def ff(max):
    a, b = 0, 1
    yield max  # yield outside a loop: the function returns this one value, similar to return

for n in ff(5):
    print(n)  # Output: 5
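The StopIteration behavior described above can be observed by driving the same kind of generator with next() directly instead of a for loop. This is a sketch reusing the ff() shape from the article:

```python
def ff(max):
    a, b = 0, 1
    yield max  # yield outside a loop: the generator produces a single value

gen = ff(5)
print(next(gen))  # 5
try:
    next(gen)  # the function body is exhausted, so StopIteration is raised
except StopIteration:
    print("iteration finished")
```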

Recommended courses: Python static web crawlers, Python Scrapy web crawlers