How to use 'yield' in Python

Learn how to use Python's yield keyword. Explore different methods, real-world applications, common errors, and debugging tips.

Published on: Wed, Mar 25, 2026
Updated on: Fri, Mar 27, 2026
The Replit Team

Python's yield keyword creates generators—functions that produce items one at a time. This approach is perfect for memory-efficient iteration over large datasets.

In this article, you'll explore techniques and tips to use yield effectively. You'll find real-world applications and advice to debug your code so you can master Python generators for your projects.

Understanding the basics of yield

def simple_generator():
    yield 1
    yield 2
    yield 3

for value in simple_generator():
    print(value)

--OUTPUT--
1
2
3

The simple_generator function shows how yield creates a generator. Unlike a regular function, it doesn't run to completion. Instead, it pauses its execution at each yield statement and saves its state.

  • When the for loop first calls the generator, the function runs until it hits yield 1 and sends that value back.
  • It then freezes until the loop's next iteration, where it resumes to produce 2, and so on.

This process of lazy evaluation—producing values only when requested—is what makes generators so memory-efficient.

Core generator concepts

Grasping a few core concepts will help you move beyond simple examples and unlock the full potential of generators in your own Python projects.

Using yield to create simple generators

def countdown(n):
    while n > 0:
        yield n
        n -= 1

gen = countdown(3)
print(next(gen))
print(next(gen))
print(next(gen))

--OUTPUT--
3
2
1

The countdown function is a generator that yields numbers from n down to 1. When you call countdown(3), you get a generator object, but the function's code doesn't run yet. You can pull values from it manually using the built-in next() function.

  • The first call to next(gen) starts the function, which runs until it yields 3 and pauses.
  • The second call resumes execution, decrementing n before yielding 2.
  • This continues until the generator has no more values to produce.

Lazy evaluation with generators

def squares(n):
    print("Generating squares up to", n)
    for i in range(1, n+1):
        print(f"Computing square of {i}")
        yield i * i

for square in squares(3):
    print(f"Got: {square}")

--OUTPUT--
Generating squares up to 3
Computing square of 1
Got: 1
Computing square of 2
Got: 4
Computing square of 3
Got: 9

The output from the squares function shows lazy evaluation in action. Notice how the "Computing" and "Got" messages are interleaved. The generator doesn't compute all the squares at once and store them in memory.

  • Instead, it calculates a value only when the for loop requests it.
  • After yielding the value, the function pauses its state until the next value is needed.

This "just-in-time" approach is incredibly efficient for large datasets, as it only holds one value in memory at a time.
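To make that efficiency concrete, you can compare the in-memory size of a full list against an equivalent generator with sys.getsizeof. This is a rough sketch; exact byte counts vary across Python versions and platforms, but the gap is dramatic:

```python
import sys

# A list materializes a million squares up front;
# the generator object only stores its paused execution state.
squares_list = [x * x for x in range(1_000_000)]
squares_gen = (x * x for x in range(1_000_000))

print(f"List:      {sys.getsizeof(squares_list):,} bytes")
print(f"Generator: {sys.getsizeof(squares_gen):,} bytes")
```

The generator stays roughly the same small size no matter how many values it can produce, because none of them exist until they're requested.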

Converting generators to other sequence types

def letters():
    yield 'a'
    yield 'b'
    yield 'c'

gen = letters()
print(list(gen))
print(tuple(letters()))
print(''.join(letters()))

--OUTPUT--
['a', 'b', 'c']
('a', 'b', 'c')
abc

While generators are memory-efficient, you'll sometimes need all the values at once. You can easily convert a generator's output into other sequence types by passing it to constructors like list() or tuple(). These functions will automatically iterate through the generator and collect all its items.

  • The code shows how list(gen) consumes the letters generator to build a list.
  • Keep in mind that this process exhausts the generator—it can only be iterated over once.
  • To create the tuple and the string, new generator objects are made by calling letters() again each time.
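The same consume-once rule applies to aggregate built-ins such as sorted(), max(), and any(), which also iterate a generator for you. In this sketch, each call gets its own fresh generator object:

```python
def letters():
    yield 'a'
    yield 'b'
    yield 'c'

# Aggregate built-ins accept generators directly and consume them lazily.
print(sorted(letters()))                   # ['a', 'b', 'c']
print(max(letters()))                      # c
print(any(ch == 'b' for ch in letters()))  # True
```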

Advanced generator techniques

Building on these core concepts, you can write more concise code with generator expressions, chain generators using yield from, and send data back using send().

Generator expressions for concise code

# Create a generator of squared numbers
squares_gen = (x**2 for x in range(1, 5))
print(next(squares_gen))
print(next(squares_gen))
print(list(squares_gen))  # Convert remaining items to list

--OUTPUT--
1
4
[9, 16]

Generator expressions offer a compact way to create generators. Their syntax looks a lot like list comprehensions, but you use parentheses instead of square brackets. The expression (x**2 for x in range(1, 5)) doesn't build a full list in memory. Instead, it creates a generator object that produces values on demand.

  • You can pull values individually using next(), as the code does to fetch 1 and then 4.
  • When you later call list(squares_gen), it only collects the remaining items, [9, 16]. This happens because the generator has already been partially consumed, demonstrating its stateful nature.
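One syntactic convenience worth knowing: when a generator expression is the only argument to a function call, Python lets you drop the extra parentheses, which makes aggregations especially compact:

```python
# A generator expression that is a function's sole argument
# needs no extra parentheses.
total = sum(x**2 for x in range(1, 5))
print(total)  # 1 + 4 + 9 + 16 = 30
```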

Using yield from for delegation

def subgenerator():
    yield 'X'
    yield 'Y'
    yield 'Z'

def delegating_generator():
    yield from range(3)
    yield from subgenerator()

print(list(delegating_generator()))

--OUTPUT--
[0, 1, 2, 'X', 'Y', 'Z']

The yield from expression provides a clean way to delegate part of a generator's work to another generator or iterable. In the example, the delegating_generator acts as a manager, chaining other iterables together.

  • First, it uses yield from range(3) to produce all values from that range.
  • Once that's done, it passes control to subgenerator(), yielding all of its items until it's exhausted.

This technique lets you combine multiple generators into a single output stream without writing extra loops, making your code more readable.
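Because yield from also works with recursive calls, it handles nested structures neatly. Here is a sketch of a hypothetical flatten helper that streams values out of arbitrarily nested lists:

```python
def flatten(items):
    """Recursively flatten nested lists into a single stream of values."""
    for item in items:
        if isinstance(item, list):
            yield from flatten(item)  # delegate to the recursive call
        else:
            yield item

nested = [1, [2, [3, 4]], 5]
print(list(flatten(nested)))  # [1, 2, 3, 4, 5]
```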

Bidirectional communication with send()

def echo_generator():
    received = yield "Ready to receive"
    while True:
        received = yield f"You sent: {received}"

gen = echo_generator()
print(next(gen))  # Prime the generator
print(gen.send("Hello"))
print(gen.send("Python"))

--OUTPUT--
Ready to receive
You sent: Hello
You sent: Python

The send() method transforms a generator into a coroutine, enabling two-way communication. It doesn’t just produce values; it can also accept them. You must first prime the generator by calling next(), which runs the code up to the first yield expression and prepares it to receive data.

  • The initial next(gen) call starts the generator, which yields "Ready to receive" and then pauses.
  • When you use gen.send("Hello"), the generator resumes, and the value "Hello" is assigned to the received variable.
  • The generator then processes this input and yields its response, "You sent: Hello".
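A slightly fuller sketch shows why this two-way channel is useful: a hypothetical running_average coroutine keeps state between send() calls, so each value you send updates the average it yields back:

```python
def running_average():
    """Coroutine that yields the average of all values sent so far."""
    total = 0.0
    count = 0
    average = None
    while True:
        value = yield average  # pause here; resume when send() delivers a value
        total += value
        count += 1
        average = total / count

avg = running_average()
next(avg)            # prime: run to the first yield
print(avg.send(10))  # 10.0
print(avg.send(20))  # 15.0
print(avg.send(30))  # 20.0
```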

Move faster with Replit

Replit is an AI-powered development platform that transforms natural language into working applications. Describe what you want to build, and Replit Agent creates it—complete with databases, APIs, and deployment. It's a powerful way to turn the generator concepts you've learned into production-ready tools.

For the generator techniques we've explored, Replit Agent can turn them into production tools:

  • Build a real-time log file analyzer that processes massive files line by line without loading them entirely into memory.
  • Create a data pipeline that uses yield from to chain multiple data sources, like fetching user activity from different APIs and merging them into a single stream.
  • Deploy an interactive data validation tool where you can send rules to a running generator using send() to check data streams on the fly.

You can describe your application idea, and Replit Agent will handle the coding, testing, and debugging for you. Turn the generator patterns from this article into a working application by trying Replit Agent.

Common errors and challenges

Even with their benefits, generators have a few quirks that can catch you off guard if you're not prepared for them.

  • Forgetting that generators are exhausted after iteration. A generator can only be iterated over once. After a for loop consumes all its values or you convert it to a list, it’s empty. If you try to loop over it again, you’ll get nothing. To reuse the sequence, you must create a new generator object by calling the function again.
  • Debugging infinite generators with safeguards. Infinite generators are useful, but they can easily cause your program to hang during development. When debugging, it’s a good practice to add a safeguard, like a counter that breaks the loop after a certain number of iterations, to prevent accidental infinite loops.
  • Handling StopIteration exceptions when manually iterating. While for loops handle it for you, calling next() on a generator after it's exhausted raises a StopIteration exception. If you’re manually iterating, you need to wrap your next() calls in a try...except StopIteration block to gracefully handle the end of the sequence and prevent your program from crashing.

Forgetting that generators are exhausted after iteration

A generator can only be iterated over once. After you've consumed all its items, for instance in a for loop, it becomes exhausted. Trying to use it again will yield nothing. The code below shows what happens when you attempt to reuse a generator after its first run.

def numbers():
    yield 1
    yield 2
    yield 3

gen = numbers()
for num in gen:
    print(num)

# Try to use the generator again
print("Let's try again:")
for num in gen:
    print(num)  # Nothing will print here

The first loop iterates through the gen object until it's depleted. Because gen isn't recreated, the second loop has nothing to iterate over and produces no output. The code below shows how to get a fresh sequence.

def numbers():
    yield 1
    yield 2
    yield 3

# First iteration
gen = numbers()
for num in gen:
    print(num)

# Create a new generator instance for second iteration
print("Let's try again:")
gen = numbers()  # Create a fresh generator
for num in gen:
    print(num)

To fix this, you must create a new generator object. The solution works by calling numbers() again and reassigning the result to gen. This provides a fresh, un-exhausted generator for the second loop to iterate over. You'll often encounter this when you need to process the same generated data multiple times, as each pass requires a new instance of the generator.
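If recreating the generator is awkward, for instance when it wraps an expensive computation, the standard library's itertools.tee can split one generator into independent iterators. Keep in mind that tee buffers values internally, so it trades memory for reusability:

```python
from itertools import tee

def numbers():
    yield 1
    yield 2
    yield 3

# tee() yields independent iterators over the same underlying generator,
# so numbers() only needs to be called once.
first_pass, second_pass = tee(numbers())
print(list(first_pass))   # [1, 2, 3]
print(list(second_pass))  # [1, 2, 3]
```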

Debugging infinite generators with safeguards

Infinite generators are powerful tools for creating endless sequences, but they pose a significant risk. If you iterate over one without a clear exit condition, your program can hang indefinitely. The code below shows what happens when this goes wrong.

def infinite_counter():
    i = 0
    while True:
        yield i
        i += 1

# This will run forever if not controlled
for num in infinite_counter():
    print(num)  # Dangerous! No exit condition

The for loop requests values from infinite_counter(), but the generator's while True condition means it never stops producing them. This creates an unstoppable loop. The following code demonstrates a simple way to control the iteration.

def infinite_counter():
    i = 0
    while True:
        yield i
        i += 1

# Add a break condition
for num in infinite_counter():
    print(num)
    if num >= 5:  # Add a stopping condition
        break

To safely handle an infinite generator like infinite_counter(), you must control the iteration from the outside. The solution adds a simple safeguard inside the for loop.

  • An if statement checks if the yielded number has reached a certain threshold.
  • Once num >= 5 is true, the break statement terminates the loop, preventing it from running forever.

This is a crucial debugging technique when working with potentially endless data streams.
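For a safeguard that keeps the loop body clean, the standard library's itertools.islice caps how many items are pulled from any iterator, including an infinite generator:

```python
from itertools import islice

def infinite_counter():
    i = 0
    while True:
        yield i
        i += 1

# islice() stops requesting values after the given count,
# so the infinite generator never runs away.
first_five = list(islice(infinite_counter(), 5))
print(first_five)  # [0, 1, 2, 3, 4]
```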

Handling StopIteration exceptions when manually iterating

While for loops automatically handle the end of a generator's sequence, manually calling next() requires more care. If you call it on an exhausted generator, your program will crash with a StopIteration exception. The code below shows this error in action.

def three_items():
    yield "one"
    yield "two"
    yield "three"

gen = three_items()
print(next(gen))
print(next(gen))
print(next(gen))
print(next(gen))  # This will raise StopIteration

The three_items generator is exhausted after the third next() call. The fourth call finds no more items to return, triggering the exception. The following code demonstrates how to manage this situation gracefully.

def three_items():
    yield "one"
    yield "two"
    yield "three"

gen = three_items()
try:
    print(next(gen))
    print(next(gen))
    print(next(gen))
    print(next(gen))  # This would raise StopIteration
except StopIteration:
    print("Generator exhausted")

The solution is to wrap your next() calls within a try...except StopIteration block. This structure allows you to anticipate the end of the generator's sequence and handle it gracefully instead of letting your program crash.

When the three_items generator is exhausted, the except block catches the error and runs your fallback code. You'll need this pattern anytime you manually advance a generator with next() rather than relying on a for loop.
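As an alternative to try...except, next() accepts a second argument that is returned instead of raising StopIteration. This pairs nicely with a while loop; the sketch below uses the := operator, available in Python 3.8 and later:

```python
def three_items():
    yield "one"
    yield "two"
    yield "three"

gen = three_items()
# Passing a default to next() suppresses StopIteration entirely.
while (item := next(gen, None)) is not None:
    print(item)
print("Generator exhausted")
```

The default-value form is handy when None (or another sentinel) is never a legitimate yielded value.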

Real-world applications

Understanding the mechanics and pitfalls of yield sets the stage for applying it to real-world challenges like large file processing and data pipelines.

Using yield for memory-efficient CSV processing

When working with large CSV files, using a generator to process each row individually is a highly memory-efficient approach.

def parse_csv(filename):
    with open(filename, 'r') as f:  # Close the file once the generator finishes
        for line in f:
            yield line.strip().split(',')

# Example with simulated data
def sample_csv():
    data = ['John,42,Engineer', 'Sara,39,Doctor', 'Mike,29,Designer']
    for line in data:
        yield line

# Process each record one at a time
for record in sample_csv():
    name, age, profession = record.split(',')
    if int(age) > 30:
        print(f"{name} is {age} years old")

The parse_csv function is a generator that processes a file line by line, thanks to the yield keyword. This avoids loading the entire file into memory. For this example, the sample_csv function simulates this behavior using a predefined list of strings.

  • The for loop requests one record at a time from the generator.
  • Each record is immediately split, and its values are used to check a condition—in this case, if the age is over 30.
  • This "one-by-one" processing makes generators perfect for handling large datasets without consuming significant memory.
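One caveat with a manual split(','): it breaks on quoted fields that contain commas. The standard library's csv module handles quoting correctly and composes naturally with a generator. This sketch uses io.StringIO to stand in for a real file:

```python
import csv
import io

def parse_rows(file_obj):
    """Yield one parsed row at a time; csv.reader handles
    quoting and embedded commas correctly."""
    for row in csv.reader(file_obj):
        yield row

# io.StringIO simulates an open file for the example.
data = io.StringIO('John,42,Engineer\n"Sara, MD",39,Doctor\n')
for name, age, profession in parse_rows(data):
    print(name, age, profession)
```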

Building a data monitoring pipeline with generators

Chaining generators lets you build efficient data pipelines that fetch, process, and monitor data in distinct, memory-friendly stages.

def data_source():
    """Simulate data from a sensor"""
    import random
    for _ in range(5):  # Limit to 5 readings for example
        yield random.randint(0, 100)

def process_data(data_stream):
    """Process incoming data"""
    for value in data_stream:
        yield value * 1.8 + 32  # Convert to Fahrenheit

def alert_system(processed_stream, threshold=100):
    """Generate alerts for values exceeding threshold"""
    for value in processed_stream:
        if value > threshold:
            yield f"ALERT: Value {value:.1f} exceeds threshold!"
        else:
            yield f"Normal: {value:.1f}"

# Connect the pipeline components
readings = data_source()
processed = process_data(readings)
alerts = alert_system(processed)

# Display alerts
for alert in alerts:
    print(alert)

This example showcases a data pipeline built by chaining three distinct generators together. Each function represents a stage in the process, making the logic clean and modular.

  • The data_source() generator simulates raw sensor readings.
  • process_data() consumes those readings, converts them, and yields the results to the next stage.
  • Finally, alert_system() checks the processed data against a threshold.

When the final loop pulls from alerts, it triggers the entire chain. Data flows through each step on demand, with each generator running just enough to produce the next value.

Get started with Replit

Turn your knowledge of yield into a real tool. Give Replit Agent a prompt like, “Build a log file analyzer that processes large files line by line” or “Create a real-time data pipeline that yields alerts.”

The agent writes the code, tests for errors, and deploys your app. Start building with Replit to turn your ideas into working software.

Get started free

Create and deploy websites, automations, internal tools, data pipelines and more in any programming language without setup, downloads or extra tools. All in a single cloud workspace with AI built in.
