Laravel commands and queue jobs: when to use them?

anwar_nairi

Anwar

Posted on May 26, 2022

Hi, and welcome to this blog post, where I would like to share my experience using commands and queue jobs in Laravel.

Running background tasks

Commands and queue jobs help you defer tasks until a later time. The main advantage is that long-running tasks no longer block the end user.

Think of it as a ticket log: you stack the tickets, and someone else takes the first one in, processes it, and moves on to the next task.

For example, let us imagine your web app has a contact page. It has a text input for the email, another for the subject, and a textarea for the content.

When the form is submitted, it should:

  1. Send an email to your company mailbox
  2. Send a confirmation email to the web user

Let us also imagine you are using the Gmail SMTP server to send emails. On a low-traffic day, the user might receive their confirmation quite fast.

But if a thousand contact forms are submitted per minute, this puts your server at high risk of failure because of high CPU usage. Gmail's SMTP server could also rate limit you, which means some of your emails will likely fail to send at some point.

This can also worsen the experience of other users navigating to other pages, since the server will be busy both serving web pages and sending emails.

Lastly, your users might have to wait longer for the page to finish loading before seeing the success confirmation page.

To fix this issue, you can ask your server to stack two tasks: one to send an email to your company inbox, another to send the confirmation email to the web user. This way, you do not block the loading of the success page for the web user, and the server can perform the tasks at a later time.

Commands

Commands in Laravel act like cron jobs. In fact, Laravel requires you to set up a single "master" cron entry that runs the Scheduler every minute, which in turn checks for commands to run.

You can control how often you want your command to run:

// app/Console/Kernel.php

namespace App\Console;

use Illuminate\Console\Scheduling\Schedule;
use Illuminate\Foundation\Console\Kernel as ConsoleKernel;

class Kernel extends ConsoleKernel
{
  protected function schedule(Schedule $schedule): void
  {
    $schedule
      ->command('contact-email:send')
      ->everyTenMinutes();
  }
}

This command takes any contact emails that have not been sent yet and sends them.

// app/Console/Commands/ContactEmailSend.php

namespace App\Console\Commands;

use App\Models\ContactEmail;
use App\Mail\ContactConfirmation;
use App\Mail\Contact;
use Illuminate\Console\Command;
use Illuminate\Support\Facades\Mail;

class ContactEmailSend extends Command
{
  protected $signature = 'contact-email:send';

  protected $description = 'Send contact email and confirmation emails.';

  public function handle()
  {
    $contactEmails = ContactEmail::where("status", "pending")->get();

    foreach ($contactEmails as $contactEmail) {
      $recipient = $contactEmail->email;
      $subject = $contactEmail->subject;
      $content = $contactEmail->content;

      // For the web user
      Mail::to($recipient)->send(new ContactConfirmation($subject, $content));

      // For the company
      Mail::to("contact@your-company.com")->send(new Contact($recipient, $subject, $content));

      $contactEmail->status = "sent";
      $contactEmail->save();
    }
  }
}

Sometimes you can end up in a situation where sending the emails takes a long time (for example, when there are tons of emails to send). Right now, our command must finish within 10 minutes, or another run will start alongside it.

This is not a desired behavior: if the first run has not finished sending an email, the second run might pick up the same contact email and try to send it as well, leading to duplicate confirmation emails being received by your web users and your company.

To prevent this behavior, either increase the interval between runs, or specify that any pending run must finish before another can start.

// app/Console/Kernel.php

namespace App\Console;

use Illuminate\Console\Scheduling\Schedule;
use Illuminate\Foundation\Console\Kernel as ConsoleKernel;

class Kernel extends ConsoleKernel
{
  protected function schedule(Schedule $schedule): void
  {
    $schedule
      ->command('contact-email:send')
      ->everyTenMinutes()
      ->withoutOverlapping(); // <--
  }
}

Setting up the Laravel Scheduler is easy, since almost every server comes with a cron runner (configurable using the crontab command).
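For reference, the single cron entry Laravel needs looks like this (the project path is a placeholder to replace with your own):

```shell
# Run the Laravel Scheduler every minute; it decides which scheduled commands are due.
* * * * * cd /path-to-your-project && php artisan schedule:run >> /dev/null 2>&1
```

The Scheduler itself then applies the frequencies you declared in app/Console/Kernel.php, so you never touch the crontab again when adding a new command.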

Jobs

Unlike commands, jobs are not bound to any specific time (think "perform this task as soon as possible"). We can see jobs as a stack of tasks to do: the server takes them one by one, whenever it has some time and resources.

This means you cannot determine in advance when the task will be performed.

On the other hand, jobs require a job runner to unstack them (Supervisor being the most popular when using Laravel; the Laravel documentation explains how to set it up).

However, since jobs are unstacked one by one, you do not need to worry about overlapping.

The other advantage is that jobs scale, because you control how many job runners unstack them. If you know a period of high traffic is coming, which will lead to lots of people contacting your company, you can add more job runners to keep the stack low and help web users receive their confirmation email faster.
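As a sketch of what that looks like with Supervisor, a program definition along these lines runs several workers in parallel (the paths and the numprocs value are assumptions to adapt to your setup):

```ini
[program:laravel-worker]
process_name=%(program_name)s_%(process_num)02d
command=php /path-to-your-project/artisan queue:work
autostart=true
autorestart=true
numprocs=4            ; number of parallel job runners
redirect_stderr=true
stdout_logfile=/path-to-your-project/storage/logs/worker.log
```

Scaling up during a traffic spike is then a matter of raising numprocs (or adding servers running the same configuration) and reloading Supervisor.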

If we keep the same example, here is a sneak peek of how to dispatch a job:

// app/Http/Controllers/ContactController.php

namespace App\Http\Controllers;

use App\Jobs\SendContactEmail;
use App\Jobs\SendContactConfirmation;
use App\Http\Requests\StoreContactRequest;

class ContactController extends Controller
{
  public function store(StoreContactRequest $request)
  {
    $email = $request->email;
    $subject = $request->subject;
    $content = $request->content;

    // For the company
    dispatch(new SendContactEmail($email, $subject, $content));

    // For the web user
    dispatch(new SendContactConfirmation($email, $subject, $content));

    return redirect()->route("contact.success");
  }
}

And here is how to send an email to the web user:

// app/Jobs/SendContactConfirmation.php

namespace App\Jobs;

use App\Mail\ContactConfirmation;
use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Queue\SerializesModels;
use Illuminate\Support\Facades\Mail;

class SendContactConfirmation implements ShouldQueue
{
  use Dispatchable;
  use InteractsWithQueue;
  use Queueable;
  use SerializesModels;

  private string $email;
  private string $subject;
  private string $content;

  public function __construct(string $email, string $subject, string $content)
  {
    $this->email = $email;
    $this->subject = $subject;
    $this->content = $content;
  }

  public function handle()
  {
    Mail::to($this->email)->send(new ContactConfirmation($this->subject, $this->content));
  }
}

The good thing is that if sending an email fails, you do not have to handle it in the controller, since it is handled in the job. In this case, the job is put in the "failed_jobs" table, and you can decide to retry it manually later.
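For instance, Laravel ships with artisan commands to inspect and retry failed jobs (run from your project root; the job id is just an example):

```shell
# List the jobs currently sitting in the failed_jobs table.
php artisan queue:failed

# Retry a single failed job by its id, or retry all of them.
php artisan queue:retry 5
php artisan queue:retry all
```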

If you often have failed emails within a job, and you know the SMTP server tends to go down and come back up quickly, you can ask Laravel to retry the job:

// app/Jobs/SendContactConfirmation.php

namespace App\Jobs;

use Illuminate\Contracts\Queue\ShouldQueue;

class SendContactConfirmation implements ShouldQueue
{
  // ...

  public $tries = 2; // retry one more time before putting the job in "failed_jobs", for a total of 2 tries

  // ...
}

You can also do advanced integrity checks, like making sure a job is unique in the stack by specifying a unique id, or making sure two jobs will not overlap using the WithoutOverlapping job middleware, and so on.
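As a sketch of that last point, a job can declare the WithoutOverlapping middleware itself; keying on the email address here is my assumption, any value identifying the "resource" the job touches works:

```php
// app/Jobs/SendContactConfirmation.php

use Illuminate\Queue\Middleware\WithoutOverlapping;

class SendContactConfirmation implements ShouldQueue
{
  // ...

  public function middleware(): array
  {
    // Two jobs sharing the same key will not run at the same time.
    return [new WithoutOverlapping($this->email)];
  }
}
```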

Another advantage of being able to retry your jobs manually or automatically is that you can inspect the state of your jobs.

Using Laravel Telescope coupled with the QUEUE_CONNECTION=database env variable (called "QUEUE_DRIVER" in older Laravel versions), you can know exactly how many tasks are waiting in the queue and how many have failed. This makes it easier to know whether you have to investigate what's wrong or can just keep an eye on the process.

Takeaway

My personal experience has taught me that I would often reach for a command when a queued job would have been better suited. Sometimes out of laziness, most of the time out of not understanding the key differences.

I still like to use commands, but now I dedicate them to genuinely scheduled tasks, like updating a list of disposable emails (https://github.com/Propaganistas/Laravel-Disposable-Email).

In the end, our goal is to lower the load of the HTTP requests and defer whatever can be deferred to the background. That is why I think the optimal server setup would be:

  • a fleet of servers dedicated to serve HTTP responses
  • a fleet of servers dedicated to unstack jobs
  • a server to run scheduled cron tasks

Since scheduled tasks are bound to the clock, you cannot take the risk of having 4 servers running the same cron job.

However, as you can see, we can easily scale up to a theoretically infinite number of servers to unstack jobs. That is why, most of the time, you will want to find a way to defer work to the job queue: you can add more servers during a high-traffic period, then lower the number of servers when things get quieter.

Some giants of the industry have noticed the same scaling issues with scheduled tasks, like Fathom Analytics:

We used to have a backlog, as we used INSERT ON DUPLICATE KEY UPDATE for our summary tables. This meant we couldn't just run queries as soon as data came in, as you get into a DEADLOCK war, and it causes a lot of problems (believe me, had them). And the problem we had was that we had to put sites into groups to run multiple cron jobs side by side, aggregating the data in isolated (by group) processes. But guess what? Cron jobs don't scale, and we were starting to see bigger pageview backlogs each day. Well, now we're in SingleStore, data is fully real-time. So if you view a page on your website, it will appear in your Fathom dashboard with zero delays.

Keep reading in Fathom Analytics' blog post, Building the world's fastest website analytics.

Conclusion

I hope I helped demystify the key differences, and I am very curious to hear your opinion if you have already been in situations where you had to choose between one or the other (bonus points if you had scaling issues with it!).

As always, thanks for reading through, and happy task scheduling!

Photo by Glenn Carstens-Peters on Unsplash
