Process Email queue

My Mautic version is: 6.0.2
My PHP version is: 8.3
My Database type and version is: MySQL/MariaDB

My problem is:

We’re trying to configure Mautic for queue-based email processing using Symfony Messenger with Doctrine transport (database queue). We followed the guide at https://kb.mautic.org/article/configuring-doctrine-for-email-queue-management-in-mautic-with-cron-jobs.html

Current configuration:

Complete cron jobs setup:

```
# Segments and campaigns
0,15,30,45 * * * * php /var/www/html/bin/console mautic:segments:update --env=prod
5,20,35,50 * * * * php /var/www/html/bin/console mautic:campaigns:update --env=prod
* * * * * php /var/www/html/bin/console mautic:campaigns:trigger --batch-limit=300 --env=prod

# Email queue processing
*/5 * * * * php /var/www/html/bin/console mautic:broadcasts:send --channel=email --limit=5000 --batch=500 --env=prod
* * * * * /var/www/html/bin/mautic_consume_email.sh

# Import/Export
*/5 * * * * php /var/www/html/bin/console mautic:import --env=prod
*/7 * * * * php /var/www/html/bin/console mautic:contacts:scheduled_export --env=prod

# Maintenance
0 * * * * php /var/www/html/bin/console mautic:custom-field:create-column --env=prod
0 2 * * * php /var/www/html/bin/console mautic:maintenance:cleanup --days-old=365 --env=prod
```

The lock-file script (`mautic_consume_email.sh`) runs:

```bash
php /var/www/html/bin/console messenger:consume email --time-limit=55 --memory-limit=512M --limit=2000 --env=prod
```

Email provider: Amazon SES (API mode, region eu-west-1, limits: 50/second, 100,700/day)

The problem:

When we send a mass email (30K+ contacts):

  1. ✅ `mautic:broadcasts:send` enqueues messages to the `messenger_messages` table

  2. ✅ Mautic UI shows “31,571 Sent”

  3. ✅ `messenger:consume` worker runs (1 worker only, lock file working)

  4. ❌ Messages get stuck in the database with status “failed”

  5. ❌ Only 1,959 emails reach Amazon SES (verified in CloudWatch)

  6. ❌ 29,612 messages remain blocked in the `messenger_messages` table (93.8% stuck)

Database query shows:

```sql
SELECT COUNT(*) FROM messenger_messages WHERE queue_name='email';
-- Result: ~29,000 messages stuck
```
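One way to see where the backlog actually sits is to break `messenger_messages` down by queue. With the Symfony default wiring, the failure transport shares this same table, so messages that exhaust their retries end up under a separate `queue_name` (typically `failed`), and a non-NULL `delivered_at` marks messages a worker has picked up but not yet acknowledged. A hypothetical diagnostic:

```sql
-- Sketch: count messages per queue and how many are currently in flight.
SELECT queue_name,
       COUNT(*)                      AS total,
       SUM(delivered_at IS NOT NULL) AS picked_up
FROM messenger_messages
GROUP BY queue_name;
```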

**Worker verification:**
- Only 1 `messenger:consume` process running (verified with `ps aux`)
- Worker logs show successful execution (exit code 0)
- Lock file creates and removes correctly every minute

**The messages are enqueued but get blocked in the database instead of being processed and sent to SES.**

These errors are showing in the log:

Previously we had:

```
Doctrine\DBAL\Exception\LockWaitTimeoutException:
SQLSTATE[HY000]: General error: 1205 Lock wait timeout exceeded
```

This was caused by duplicate workers. We fixed it with the lock-file script, but messages still get stuck in the database with status “failed” even with only one worker running.

Steps I have tried to fix the problem:

  1. Implemented a lock-file bash script to prevent duplicate workers

  2. Verified only one `messenger:consume` process runs at a time (no duplicates)

  3. Commented out competing cron jobs (`mautic:messages:send`, `messenger:consume mautic_webhooks`)

  4. Cleaned stuck messages with `TRUNCATE TABLE messenger_messages`

  5. Verified SES limits are not being hit (50/second, 100K/day)

  6. Made a new test send today; same problem: messages enqueue but get stuck in the database

Questions:

  1. Why are messages being marked as “failed” in `messenger_messages` even though the worker reports success (exit code 0)?

  2. Is the Doctrine transport suitable for 30K+ message volumes, or should we use Redis/RabbitMQ?

  3. Are our worker parameters appropriate? (`--time-limit=55 --limit=2000`, running every minute)

  4. How can we debug why messages are getting stuck instead of being processed?

The core issue: Messages successfully enqueue to the database but then get blocked there with “failed” status instead of being sent to Amazon SES.
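On question 1, note that `--time-limit` makes a worker stop gracefully with exit code 0 even when individual messages were rejected along the way, so the exit code alone does not indicate delivery success. On question 4, Symfony Messenger ships console commands for inspecting the failure transport; assuming Mautic’s default failed-transport wiring, something like the following shows why each message failed (the exception is stored alongside the message) and lets you retry once the cause is fixed:

```shell
# Run the worker verbosely so exceptions are logged as they happen:
php /var/www/html/bin/console messenger:consume email -vv --env=prod

# List messages on the failure transport with their error summaries:
php /var/www/html/bin/console messenger:failed:show --env=prod

# Show full details (including the stack trace) for one message by id:
php /var/www/html/bin/console messenger:failed:show 123 --env=prod

# Retry everything on the failure transport after fixing the cause:
php /var/www/html/bin/console messenger:failed:retry --force --env=prod
```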

Any help would be greatly appreciated!

Have you configured AWS SMTP or the API?

I have AWS SES + API configured.

Then it should work.