Django's Signal Dispatcher is a really powerful feature that you should never use. Ok, it has valid use cases, but they may be rarer than you think.

First, to dispel a misconception about signals, they are not executed asynchronously. There is no background thread or worker to execute them. Like most of Django, they are fully "synchronous". I suspect this misconception stems from the event-driven coding pattern they use which is often seen in asynchronous languages/libraries.

The event-driven pattern of signals looks neat at first, but often I see them mentioned as a source of frustration. It's not the author complaining, but the poor soul who inherits maintenance of the code and only learns of the existence of signals when trying to track down unexpected behavior.

Consider the following example using signals:

# File: models.py
from django.db import models


class Pizza(models.Model):
    has_pepperoni = models.BooleanField(default=False)


class ToppingSales(models.Model):
    name = models.CharField(max_length=100, unique=True)
    units_sold = models.PositiveIntegerField(default=0)

# File: signals.py
from django.db.models import F

from .models import ToppingSales


def pizza_saved_handler(sender, instance, created, **kwargs):
    if created and instance.has_pepperoni:
        ToppingSales.objects.filter(name='pepperoni').update(
            units_sold=F('units_sold') + 1
        )

# File: apps.py
from django.apps import AppConfig
from django.db.models.signals import post_save


class PizzeriaConfig(AppConfig):
    name = 'pizzeria'

    def ready(self):
        from .models import Pizza
        from .signals import pizza_saved_handler
        post_save.connect(pizza_saved_handler, sender=Pizza)

Compare that to an example without signals:

# File: models.py
from django.db import models
from django.db.models import F


class Pizza(models.Model):
    has_pepperoni = models.BooleanField(default=False)

    def save(self, *args, **kwargs):
        created = self.pk is None
        super().save(*args, **kwargs)
        if created and self.has_pepperoni:
            ToppingSales.objects.filter(name='pepperoni').update(
                units_sold=F('units_sold') + 1
            )


class ToppingSales(models.Model):
    name = models.CharField(max_length=100, unique=True)
    units_sold = models.PositiveIntegerField(default=0)

This is a contrived example, but it gets the point across. With signals, we have to spread the logic across three different files. Worse, the convention of putting signals in a signals module is not always followed, especially in older code bases. Simply finding where the signals are defined and connected can be a challenge. The example without signals is not only less code, but far easier to read, understand, and test. You can see all the side effects of Pizza model creation simply by reading the code of the Pizza model.

But breaking up the code is a good thing, right? I agree, but not when it inhibits readability. The intent here is not to define all the logic in the save method, but to leave a breadcrumb for future developers so they can see what code is executed when a save occurs. With that in mind, a function or method call in the save method will help developers follow the path the code takes. That may look like this:

class Pizza(models.Model):
    has_pepperoni = models.BooleanField(default=False)

    def _update_toppings(self, created=False):
        if created and self.has_pepperoni:
            ToppingSales.objects.filter(name='pepperoni').update(
                units_sold=F('units_sold') + 1
            )

    def save(self, *args, **kwargs):
        created = self.pk is None
        super().save(*args, **kwargs)
        self._update_toppings(created)

By simply overriding your model's save and delete methods, you can accomplish almost anything the {pre,post}_{save,delete} signals can, and make your code easier to read in the process. Similarly, request_{started,finished} can be replaced by middleware. With the class-based login views, even the django.contrib.auth signals can be replaced with a straightforward subclass.
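As an illustration of the middleware alternative, here is a minimal, hypothetical sketch (the class name and the duration attribute are inventions for this example) that does what a request_started/request_finished handler pair might do, but in one obvious place:

```python
import time


class RequestTimingMiddleware:
    """Hypothetical middleware replacing request_started/request_finished handlers."""

    def __init__(self, get_response):
        self.get_response = get_response

    def __call__(self, request):
        # What a request_started handler would do:
        start = time.monotonic()
        response = self.get_response(request)
        # What a request_finished handler would do:
        response.duration = time.monotonic() - start
        return response
```

Add it to the MIDDLEWARE setting and the per-request logic lives in a single class a developer will actually find, rather than in a signal handler far from the request cycle.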

So... Never use signals?

Not quite. Signals are still valuable in a few limited scenarios.

1. Extending the functionality of a third-party library. If you're using an app with models where you don't control the code, signals are a more robust alternative to monkey-patching for taking specific actions on save or delete.
2. Acting on queryset deletes. Delete signals fire for the delete queryset method. If you'd like the same actions to happen on both single-object and queryset deletes, a signal is probably your best bet. Note: due to the way the SQL is generated, the same is not true of the update queryset method (save signals do not fire), another frequent source of confusion and bugs.
3. Applying the same signal handler to many models. If you just have a few, I'd still favor overriding the save/delete method and calling a shared function. For more than a few, copy/pasting the save method everywhere becomes error-prone and signals look like a better option.
4. Avoiding tight cross-application dependencies. If you need to cross application boundaries, especially when the application may be used in multiple projects, signals may be better for looser coupling between the apps. Be careful with this one though. Use it because you need to, not because you think you might want it in the future.

If you must use signals, think twice and determine whether you really need them or are just being clever. Consider your future self and the violent psychopath who will be maintaining your code. Document them copiously, add some logging, and leave lots of breadcrumbs, and you can use them safely and prevent future frustration.