# Laravel's get() vs cursor(): Why Your Export Crashed at 3 AM
Gurpreet Kait
Ever had a background job crash with "Allowed memory size exhausted"? Yeah, me too. Let me explain what's happening and how to fix it.
You write this innocent-looking code:
```php
$users = User::with('orders')->get();

foreach ($users as $user) {
    // export to CSV
}
```
Works great in development with 50 users. Crashes spectacularly in production with 100,000 users.
## get() - The Greedy Approach

When you call get(), Laravel says: "Cool, let me grab EVERYTHING and put it in a nice Collection for you."
```php
$users = User::query()->get();
```
What happens in memory:
Database: "Here's 100,000 users!"
PHP Memory (128MB limit):
┌─────────────────────────────────────────────────┐
│ User 1 │ User 2 │ User 3 │ ... │ User 100,000 │
└─────────────────────────────────────────────────┘
↑
💥 BOOM - Memory exhausted
It's like going grocery shopping and trying to carry everything in your arms at once. Works for 5 items, not for 500.
## cursor() - The Smart Approach

When you call cursor(), Laravel says: "Let me fetch one at a time as you need them."
```php
$users = User::query()->cursor();
```
What happens in memory:
```
PHP Memory:
┌─────────┐
│ User 1  │ → Process → Garbage collected
└─────────┘
┌─────────┐
│ User 2  │ → Process → Garbage collected
└─────────┘
┌─────────┐
│ User 3  │ → Process → Garbage collected
└─────────┘
...continues forever without crashing
```
It's like having a conveyor belt. One item comes, you deal with it, it moves on.
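If you want to see the difference yourself, here's a rough sketch you could run in php artisan tinker. It assumes an App\Models\User model with a reasonably large table; the exact numbers depend on your data, and the two measurements should be taken in separate sessions so the first doesn't skew the second:

```php
use App\Models\User;

// In one tinker session: get() hydrates every row into a Collection.
$users = User::query()->get();
echo round(memory_get_peak_usage(true) / 1048576) . " MB peak with get()\n";

// In a fresh session: cursor() hydrates one model at a time.
foreach (User::query()->cursor() as $user) {
    // process $user
}
echo round(memory_get_peak_usage(true) / 1048576) . " MB peak with cursor()\n";
```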
The Export That Crashed:
```php
// DON'T DO THIS for large datasets
public function export()
{
    $orders = Order::with('customer', 'items')->get(); // 💀 Loads EVERYTHING

    $csv = fopen('orders.csv', 'w');

    foreach ($orders as $order) {
        fputcsv($csv, [
            $order->id,
            $order->customer->name,
            $order->total,
        ]);
    }

    fclose($csv);
}
```
The Export That Works:
```php
// DO THIS instead
public function export()
{
    $orders = Order::with('customer', 'items')->cursor(); // ✅ Streams one at a time

    $csv = fopen('orders.csv', 'w');

    foreach ($orders as $order) {
        fputcsv($csv, [
            $order->id,
            $order->customer->name,
            $order->total,
        ]);
    }

    fclose($csv);
}
```
Same code, just swapped get() for cursor(). That's it.
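One caveat worth knowing: the Laravel docs note that cursor() only ever holds a single model in memory, so it cannot eager load relationships. That means the with('customer', 'items') above may quietly turn into extra queries as each relation is lazily loaded per order. If that matters for your export, lazy() (available on recent Laravel versions, 8.x and up) gives you the same LazyCollection interface but fetches results in chunks behind the scenes and does respect eager loading. A minimal sketch:

```php
// Sketch assuming Laravel 8.x+ where lazy() is available.
// lazy() queries in chunks under the hood, keeps memory flat,
// and, unlike cursor(), still eager loads the relationships.
$orders = Order::with('customer', 'items')->lazy();

foreach ($orders as $order) {
    // same CSV-writing logic as above
}
```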
cursor() uses something called a LazyCollection, which is built on PHP generators. Let me explain generators because they're actually pretty cool.
A generator is a function that can pause and resume. Instead of returning one value and dying, it can yield multiple values over time.
Normal function - Returns everything at once:
```php
function getAllNumbers(): array
{
    $numbers = [];

    for ($i = 1; $i <= 1000000; $i++) {
        $numbers[] = $i;
    }

    return $numbers; // 💀 1 million integers in memory
}

// Uses ~32MB of memory
$numbers = getAllNumbers();
```
Generator function - Yields one at a time:
```php
function getAllNumbers(): Generator
{
    for ($i = 1; $i <= 1000000; $i++) {
        yield $i; // Pause here, give this value, continue when asked
    }
}

// Uses basically no memory
$numbers = getAllNumbers();

foreach ($numbers as $n) {
    // Only ONE number in memory at any time
    echo $n;
}
```
Think of yield like a bookmark:
```php
function countToThree(): Generator
{
    echo "Starting...\n";
    yield 1; // 📌 Pause here, return 1

    echo "Continuing...\n";
    yield 2; // 📌 Pause here, return 2

    echo "Almost done...\n";
    yield 3; // 📌 Pause here, return 3

    echo "Finished!\n";
}

$counter = countToThree();

foreach ($counter as $num) {
    echo "Got: $num\n";
}

// Output:
// Starting...
// Got: 1
// Continuing...
// Got: 2
// Almost done...
// Got: 3
// Finished!
```
The function literally pauses at each yield and resumes when you ask for the next value.
Generators are perfect for huge files, too. (Quick note: you can't actually declare a function called readFile() - PHP already ships a built-in readfile(), and function names are case-insensitive - so the helpers below get their own names.)

```php
// BAD: Loads entire file into memory
function readAllLines(string $path): array
{
    return file($path); // 💀 10GB file = 10GB memory
}

// GOOD: Reads line by line
function readLines(string $path): Generator
{
    $handle = fopen($path, 'r');

    while (($line = fgets($handle)) !== false) {
        yield $line; // One line at a time
    }

    fclose($handle);
}

// Process a 10GB log file with minimal memory
foreach (readLines('huge.log') as $line) {
    if (str_contains($line, 'ERROR')) {
        echo $line;
    }
}
```
Laravel wraps generators in a LazyCollection so you get the familiar Collection methods:
```php
// This returns a LazyCollection
$users = User::cursor();

// You can chain methods - they execute lazily!
$users
    ->filter(fn($user) => $user->is_active)
    ->map(fn($user) => $user->email)
    ->each(fn($email) => sendNewsletter($email));

// Nothing runs until you iterate or call a terminal method
```
The key insight: filter() and map() on a LazyCollection don't create new arrays. They just add more processing steps to the pipeline.
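To make that concrete, here's a small sketch using a plain Illuminate\Support\LazyCollection (no database needed). It shows that nothing runs until you start pulling values, and only as many source values are produced as the pipeline actually needs:

```php
use Illuminate\Support\LazyCollection;

$pipeline = LazyCollection::make(function () {
        for ($i = 1; $i <= 1000000; $i++) {
            echo "producing $i\n"; // proves when the source actually runs
            yield $i;
        }
    })
    ->filter(fn($n) => $n % 2 === 0)
    ->map(fn($n) => $n * 10);

// Nothing has been printed yet - filter() and map() just queued up steps.

$result = $pipeline->take(3)->values()->all();
// Prints "producing 1" through "producing 6" - only six source values
// were needed to find three even numbers.
// $result === [20, 40, 60]
```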
| Scenario | Use | Why |
|---|---|---|
| Display 20 users on a page | get() | Small dataset, need it all |
| Export 100K rows to CSV | cursor() | Stream to file, never hold all in memory |
| Count total records | count() on the query | Let the database count; get() would load every row and cursor() would iterate every row just to count them |
| Need to access items twice | get() | cursor() can only be iterated once |
| Background job processing | cursor() | Jobs often deal with large datasets |
| API response with 50 items | get() | JSON encoding needs the full array anyway |
| ETL pipeline | cursor() | Extract-Transform-Load is sequential |
```php
// Small dataset, need Collection features
$users = User::where('active', true)->get();

$count = $users->count();
$first = $users->first();
$grouped = $users->groupBy('role');

// Large dataset, sequential processing
User::where('active', true)->cursor()->each(function ($user) {
    // Process one at a time
    dispatch(new SendNewsletterJob($user));
});

// Even better for very large jobs: chunk
User::where('active', true)->chunk(1000, function ($users) {
    // Process in batches of 1000
    foreach ($users as $user) {
        // ...
    }
});
```
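One chunk() gotcha worth flagging: if your where clause filters on a column that the callback also updates, plain chunk() can skip rows between pages because the result set shifts under the offset. Laravel's chunkById() is the documented way around that. A sketch, reusing the hypothetical active column from the examples above:

```php
// chunkById() pages by primary key, so updating 'active' inside the
// callback can't shift rows out of later chunks the way offset-based
// chunk() can.
User::where('active', true)->chunkById(1000, function ($users) {
    foreach ($users as $user) {
        $user->update(['active' => false]);
    }
});
```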
The short version:

- get() = Load everything into memory. Fast for small data. Crashes on large data.
- cursor() = Stream one record at a time. Constant memory. Use for large datasets.
- When a dataset is big enough to worry about, reach for cursor() or chunk().

That export job that crashed at 3 AM? Just change get() to cursor() and go back to sleep.