Celery
The Celery integration adds support for the Celery Task Queue System.

Just add CeleryIntegration() to your integrations list:
```python
import sentry_sdk
from sentry_sdk.integrations.celery import CeleryIntegration

sentry_sdk.init(
    dsn="https://examplePublicKey@o0.ingest.sentry.io/0",
    integrations=[
        CeleryIntegration(),
    ],
    # Set traces_sample_rate to 1.0 to capture 100%
    # of transactions for performance monitoring.
    # We recommend adjusting this value in production.
    traces_sample_rate=1.0,
)
```
The integration automatically reports errors from all Celery tasks. Additionally, the Sentry Python SDK sets the transaction on the event to the task name, and it improves the grouping of global Celery errors such as timeouts.
Generally, make sure that the call to init is loaded on worker startup, and not only in the module where your tasks are defined. Otherwise, the initialization happens too late and events might end up not being reported.
Standalone Setup
If you're using Celery standalone, there are two ways to set this up:

- Initializing the SDK in the configuration file loaded with Celery's --config parameter
- Initializing the SDK by hooking it to either the celeryd_init or worker_init signals

```python
import sentry_sdk
from celery import Celery, signals

app = Celery("myapp")

# Alternatively, connect to the worker_init signal:
# @signals.worker_init.connect
@signals.celeryd_init.connect
def init_sentry(**_kwargs):
    sentry_sdk.init(dsn="...")
```
Setup With Django
If you're using Celery with Django in a conventional setup, have already initialized the SDK in your settings.py file, and have Celery using the same settings with config_from_object, you don't need to initialize the SDK separately for Celery.
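For reference, such a conventional setup typically looks like the following sketch. The project name myproject and the file layout are assumptions for illustration; only the placement of sentry_sdk.init() in settings.py matters here:

```python
# myproject/settings.py (assumed project layout; sketch only)
import sentry_sdk

sentry_sdk.init(
    dsn="https://examplePublicKey@o0.ingest.sentry.io/0",
)

# myproject/celery.py
from celery import Celery

app = Celery("myproject")
# The worker loads the same Django settings, so the SDK
# initialized in settings.py is active in the worker too.
app.config_from_object("django.conf:settings", namespace="CELERY")
app.autodiscover_tasks()
```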
Verify
To verify that your SDK is initialized on worker start, you can pass debug=True to sentry_sdk.init() to see extra output when the SDK is initialized. If the output appears during worker startup and not only after a task has started, then it's working properly.
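For example, a minimal sketch of an init call with debug output enabled (the DSN is a placeholder):

```python
import sentry_sdk
from sentry_sdk.integrations.celery import CeleryIntegration

sentry_sdk.init(
    dsn="https://examplePublicKey@o0.ingest.sentry.io/0",
    integrations=[CeleryIntegration()],
    # With debug enabled, the SDK logs its initialization and
    # event handling, so you can see it run at worker startup.
    debug=True,
)
```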
Note on distributed tracing
Sentry uses custom message headers for distributed tracing. The fix for the custom headers propagation issue was introduced in the Celery project (PR) starting with version 5.0.1. However, the fix was not backported to the 4.x versions.
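As a sketch of what trace propagation enables (the task name process_order and the transaction name are hypothetical): when a task is triggered from code running inside a Sentry transaction, the trace information travels with the Celery message, so errors in the task link back to the caller:

```python
import sentry_sdk
from tasks import process_order  # hypothetical Celery task

with sentry_sdk.start_transaction(op="function", name="checkout"):
    # Sentry injects trace headers into the Celery message here,
    # so errors raised inside process_order can be linked back
    # to the "checkout" transaction (requires Celery >= 5.0.1).
    process_order.delay(order_id=42)
```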
Options
To set options on CeleryIntegration to change its behavior, add it explicitly to your sentry_sdk.init():
```python
import sentry_sdk
from sentry_sdk.integrations.celery import CeleryIntegration

sentry_sdk.init(
    dsn="https://examplePublicKey@o0.ingest.sentry.io/0",
    # ...
    integrations=[
        CeleryIntegration(
            monitor_beat_tasks=True,
            exclude_beat_tasks=["unimportant-task", "payment-check-.*"],
        ),
    ],
)
```
You can pass the following keyword arguments to CeleryIntegration():
propagate_traces

Propagate Sentry tracing information to the Celery task. This makes it possible to link errors in Celery tasks to the function that triggered the Celery task. If this is set to False, errors in Celery tasks can't be matched to the triggering function.

The default is True.

monitor_beat_tasks

Turn auto-instrumentation on or off for Celery Beat tasks using Sentry Crons. See Celery Beat Auto Discovery to learn more.

The default is False.

exclude_beat_tasks

A list of Celery Beat tasks that should be excluded from auto-instrumentation using Sentry Crons. Only applied if monitor_beat_tasks is set to True.

The list can contain strings with the names of tasks in the Celery Beat schedule to be excluded. It can also include regular expressions to match multiple tasks. For example, if you include "payment-check-.*", every task starting with payment-check- will be excluded from auto-instrumentation.

See Celery Beat Auto Discovery to learn more.

The default is None.
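To illustrate how a pattern list like exclude_beat_tasks can match task names, here is a small sketch using Python's re module. The helper is_excluded is hypothetical and only demonstrates the matching idea; the SDK's exact matching logic may differ:

```python
import re

def is_excluded(task_name, exclude_patterns):
    """Return True if task_name matches any exclusion pattern.

    Hypothetical helper: plain strings and regular expressions
    are both handled by re.match, which anchors at the start.
    """
    return any(re.match(pattern, task_name) for pattern in exclude_patterns)

patterns = ["unimportant-task", "payment-check-.*"]
print(is_excluded("payment-check-daily", patterns))  # True
print(is_excluded("send-invoices", patterns))        # False
```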
Our documentation is open source and available on GitHub. Your contributions are welcome, whether it's fixing a typo (drat!) or suggesting an update ("yeah, this would be better").
- Package: pypi:sentry-sdk
- Version: 1.24.0
- Repository: https://github.com/getsentry/sentry-python