It was 2:00 AM. Coffee in one hand, keyboard under siege from a frenzied flurry of keystrokes, I was debugging a script that had to process hundreds of files.
Everything looked fine... until it wasn’t.
```python
for file in file_list:
    f = open(file, 'r')
    data = f.read()
    # Do something with data
```
No f.close()? Oops. Rookie mistake—or was it?
💥 The Crash
After a few hundred iterations, my script died. 💀
The error?
OSError: [Errno 24] Too many open files
I'd forgotten one of the golden rules of file handling in Python:
Always close what you open.
Except I did open… I just never closed.
💡 Enter with open(...) as
What if I told you there’s a smarter, more Pythonic way?
```python
for file in file_list:
    with open(file, 'r') as f:
        data = f.read()
        # File is automatically closed after this block
```
✅ Cleaner
✅ Safer
✅ No leaked resources
✅ No dangling file handles
✅ No need to manually call close()
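To see the "safer" part in action, here's a small sketch. The file path and the deliberate ValueError are purely illustrative:

```python
import os
import tempfile

# A throwaway file just for this demo.
path = os.path.join(tempfile.mkdtemp(), "demo.txt")
with open(path, "w") as f:
    f.write("data")

# Even if the body of the with block raises, the file still gets closed.
try:
    with open(path, "r") as f:
        raise ValueError("something went wrong mid-read")
except ValueError:
    pass

print(f.closed)  # True (closed despite the exception)
```

That last line is the whole point: the exception escaped the block, but the file didn't stay open.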
⚙️ Why it works:
with open(...) uses Python's context manager protocol under the hood.
Once the block is done—whether it completes normally or hits an error—Python makes sure the file is closed properly. It’s like having a mini garbage collector just for your files.
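For the curious, here's roughly what the with block expands to. This is a sketch using a throwaway temp file; the path and contents are made up for the demo:

```python
import os
import tempfile

# Set up a throwaway file for the demo.
path = os.path.join(tempfile.mkdtemp(), "example.txt")
with open(path, "w") as f:
    f.write("hello")

# `with open(path) as f: data = f.read()` is roughly equivalent to:
f = open(path, "r")
try:
    data = f.read()
finally:
    f.close()  # runs whether read() succeeded or raised

print(data)      # hello
print(f.closed)  # True
```

The with statement just packages that try/finally pattern for you, so you can't forget the finally part at 2:00 AM.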
🚫 Never Again...
After that late-night blunder, I made a vow:
No more open-without-with.
Now, every time I open a file, it's inside a with block.
No leaks. No regrets.
🔚 Moral of the Story:
When handling files in Python, remember this line:
“With great power (open()) comes great responsibility (close())—unless you're using with.” 😎