How to manage large data sets efficiently using Laravel’s chunk() and cursor() methods?

December 3, 2025

Laravel's chunk() processes records in fixed-size batches (e.g., 1000 at a time) using LIMIT/OFFSET queries, while cursor() streams one record at a time through a PHP generator, keeping memory use minimal on massive datasets.
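
A quick, illustrative way to observe this yourself (assuming a User model backed by a reasonably large table) is to compare memory usage across the two approaches:

// Illustrative sketch - compare memory growth of all() vs. cursor()
$before = memory_get_usage();
$users = User::all();                 // hydrates every row into one Collection
echo memory_get_usage() - $before;    // grows with table size

$before = memory_get_usage();
foreach (User::cursor() as $user) {   // generator yields one model at a time
    // process $user
}
echo memory_get_usage() - $before;    // stays roughly flat

One caveat: with the default buffered pdo_mysql driver, the database client still holds the full raw result set in memory, so cursor() saves on hydrated Eloquent models rather than on that driver buffer.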

Use chunk(1000, $callback) for batch operations such as exports or queued jobs, especially when you need related data via with() (see the eager-loading sketch after the code below). Switch to cursor() for memory-critical tasks such as updating millions of rows, since it hydrates one model at a time instead of loading whole collections. In either case, pair the query with select() to fetch only the columns you need, and use chunkById() on tables that change during iteration to prevent skipped or duplicated rows.

Code

// chunk() - batch processing (medium datasets)
User::select('id', 'email')->chunk(1000, function ($users) {
    foreach ($users as $user) {
        $user->update(['last_login' => now()]);
    }
});

// cursor() - memory efficient streaming (large datasets)
foreach (User::select('id', 'email')->cursor() as $user) {
    // Process one by one - no memory buildup
    processUser($user);
}

// chunkById() - safe for tables that change during iteration
// Paginates with WHERE id > ? instead of OFFSET, so updated rows
// cannot shift pages and cause skips or duplicates
User::chunkById(500, function ($users) {
    foreach ($users as $user) {
        // process and update $user safely here
    }
});
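
One thing these snippets don't show is the eager-loading point from earlier: chunk() works with with(), loading relations once per batch, while cursor() runs a single query and does not support eager loading, so touching a relation inside the loop costs one extra query per model. A minimal sketch, assuming a hypothetical orders relation on User and a hypothetical handleOrders() helper:

// chunk() + with() - relations eager loaded per batch, avoiding N+1 queries
User::with('orders')->select('id', 'email')->chunk(1000, function ($users) {
    foreach ($users as $user) {
        // $user->orders was loaded together with the rest of the chunk
        handleOrders($user->orders); // hypothetical helper
    }
});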