Python generators support more than just iteration; they enable coroutine-like behavior through bidirectional data flow. The send() method allows values to flow into a paused generator, while throw() and close() provide exception handling and lifecycle management. Additionally, yield from (available since Python 3.3) simplifies delegation to sub-generators.
Bidirectional Data Flow with send()
Generators can receive external data through the send() method. When a generator executes yield, it emits a value to the caller and pauses. The next send() call resumes execution, passing a value that becomes the result of the yield expression.
Consider a configurable data transformer:
```python
def data_transformer():
    """
    Bidirectional generator pattern.
    Note: To receive n values via send(), the generator needs n+1 yield expressions.
    """
    # First yield initializes the generator; receives None or the first send
    config = yield "INIT"
    print(f"Configuration received: {config}")
    # Second yield receives the second send value
    data = yield "CONFIGURED"
    print(f"Processing data: {data}")
    # Third yield allows retrieval of the final value
    yield "COMPLETED"
    return config  # delivered to the caller via StopIteration.value

if __name__ == "__main__":
    transformer = data_transformer()
    # Prime the generator (must send None or use next() first)
    status = transformer.send(None)
    print(f"Status: {status}")
    # Send configuration
    status = transformer.send("mode=aggressive")
    print(f"Status: {status}")
    # Send data to process
    result = transformer.send("raw_data_packet")
    print(f"Final status: {result}")
```
Key observations:
- The first `send()` must be `None` (or use `next()`) to advance to the first `yield`
- Each subsequent `send()` provides the value for the pending `yield` expression
- To retrieve `n` values through `send()`, the generator requires `n + 1` `yield` points
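Because the priming step is easy to forget, a common pattern is a small decorator that advances every new generator to its first yield automatically. A minimal sketch (the `primed` and `echo` names are illustrative, not standard-library functions):

```python
from functools import wraps

def primed(gen_func):
    """Decorator that advances a new generator to its first yield,
    so callers can use send() immediately."""
    @wraps(gen_func)
    def wrapper(*args, **kwargs):
        gen = gen_func(*args, **kwargs)
        next(gen)  # run up to the first yield
        return gen
    return wrapper

@primed
def echo():
    """Echoes back every value sent in."""
    received = None
    while True:
        received = yield received

gen = echo()           # already primed by the decorator
print(gen.send("hi"))  # prints: hi
```

This is essentially the "coroutine decorator" idiom that was widespread before native `async def` coroutines existed.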
Generator Lifecycle Management
Generators implement the Generator protocol with close() and throw() methods:
```python
def resource_handler():
    try:
        while True:
            resource = yield "WAITING"
            print(f"Acquired: {resource}")
    except GeneratorExit:
        print("Cleanup: releasing resources")
        raise  # Re-raise to properly close the generator
    except ValueError as e:
        print(f"Error handled: {e}")
        yield "RECOVERED"

gen = resource_handler()
gen.send(None)
# Inject an exception; the generator handles it and yields "RECOVERED"
# (pass an exception instance: the separate-arguments form of throw()
# is deprecated since Python 3.12)
gen.throw(ValueError("Invalid resource"))
# Terminate the generator by raising GeneratorExit at the paused yield
gen.close()
```
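Relying on garbage collection to call close() is fragile; the standard library's `contextlib.closing` works with any object that has a close() method, generators included, and guarantees cleanup even if the block raises. A minimal sketch (the `events` list is just for demonstration, in place of the prints above):

```python
from contextlib import closing

events = []

def resource_handler():
    try:
        while True:
            resource = yield "WAITING"
            events.append(f"acquired:{resource}")
    except GeneratorExit:
        events.append("released")  # runs when close() is called
        raise

# closing() guarantees gen.close() on exit from the with block
with closing(resource_handler()) as gen:
    gen.send(None)  # prime
    gen.send("db_connection")

print(events)  # ['acquired:db_connection', 'released']
```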
Delegating to Sub-generators with yield from
The yield from syntax establishes a bidirectional channel between the caller and a sub-generator, automatically handling send(), throw(), and return values.
Consider a distributed task aggregator:
```python
aggregated_results = {}

def worker_node(task_type):
    """Sub-generator processing specific task types."""
    total = 0
    logs = []
    while True:
        task = yield
        if task is None:  # Sentinel value for completion
            break
        print(f"[{task_type}] Executing: {task}")
        total += task
        logs.append(task)
    return total, logs

def coordinator(task_type):
    """Delegating generator managing the worker lifecycle."""
    while True:
        # yield from transparently forwards send() to worker_node
        # and captures the return value upon StopIteration
        aggregated_results[task_type] = yield from worker_node(task_type)
        print(f"Batch {task_type} finalized")

def main_controller():
    workloads = {
        "compute": [10, 20, 30],
        "io": [5, 15, 25],
        "memory": [100, 200],
    }
    for job_type, tasks in workloads.items():
        print(f"\nDispatching {job_type} jobs")
        coord = coordinator(job_type)
        coord.send(None)  # Prime
        for task in tasks:
            coord.send(task)
        coord.send(None)  # Signal completion
    print(f"\nFinal aggregation: {aggregated_results}")

main_controller()
```
The yield from construct:
- Forwards `send()` and `throw()` calls to the sub-generator
- Captures the sub-generator's return value (available since Python 3.3 via PEP 380)
- Automatically handles `StopIteration`
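The return-value plumbing can be seen directly: per PEP 380, a sub-generator's return value rides on `StopIteration.value`, and yield from unpacks it automatically. A small sketch comparing manual delegation with yield from (the `counter` and `delegating` names are illustrative):

```python
def counter(n):
    """Yields 0..n-1, then returns their sum."""
    total = 0
    for i in range(n):
        total += i
        yield i
    return total  # carried by StopIteration.value

# Manual delegation: catch StopIteration to recover the return value
gen = counter(3)
values = []
try:
    while True:
        values.append(next(gen))
except StopIteration as exc:
    values.append(f"sum={exc.value}")

# yield from performs the same unpacking implicitly
def delegating():
    result = yield from counter(3)
    yield f"sum={result}"

print(values)              # [0, 1, 2, 'sum=3']
print(list(delegating()))  # [0, 1, 2, 'sum=3']
```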
Recursive Generator Delegation
yield from enables elegant recursive generator patterns for tree-like data structures:
```python
def exponential_growth(seed, threshold):
    """Recursively yields successive squares until the threshold is reached."""
    squared = seed ** 2
    yield squared
    if squared < threshold:
        # Delegate to the recursive call without manual iteration
        yield from exponential_growth(squared, threshold)

# Generates 4, 16, 256, 65536: recursion stops once a yielded
# value reaches the threshold, so the last value may exceed it
for value in exponential_growth(2, 1000):
    print(value)
```
This approach eliminates the need for explicit loops to flatten recursive generator results. Note, however, that each level of delegation still adds a suspended generator frame, so the delegation chain grows with recursion depth.
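The same pattern generalizes to genuinely tree-like data. A common sketch is a recursive flattener for nested lists (`flatten` is a hypothetical helper, not a standard-library function):

```python
def flatten(nested):
    """Recursively yields the leaves of arbitrarily nested lists."""
    for item in nested:
        if isinstance(item, list):
            yield from flatten(item)  # delegate to the recursive call
        else:
            yield item

print(list(flatten([1, [2, [3, 4]], 5])))  # [1, 2, 3, 4, 5]
```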