Messenger:consume redis concurrency for better queued emails performance. Error: "Could not acknowledge redis message"

Your software
My Mautic version is: 5.0.3
My PHP version is: 8.1.2
My Database type and version is: mysql 8.0.36-0ubuntu0.22.04.1 for Linux on x86_64

Your problem
My problem is:
I am trying to send 500k emails (Channels → Emails).
I am trying to increase the throughput of my worker:

php /var/www/mtc/bin/console messenger:consume email --memory-limit=8280M

I managed to use supervisorctl to run this command in 1 thread successfully, thanks to the documentation, but it's very slow (around 4-5 emails per second), and I would like to run it in 5 processes instead to match my allowed Amazon SES rate (100+ emails per second).
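For reference, here is roughly what I am aiming for in supervisor: one program section spawning several worker processes (the program name, paths, and limits below are examples, not my actual config):

```ini
; Sketch: one supervisor program spawning several messenger:consume workers.
; Program name, paths and limits are examples -- adjust to your install.
[program:mautic-email-worker]
command=php /var/www/mtc/bin/console messenger:consume email --memory-limit=512M --time-limit=3600
user=www-data
numprocs=5
process_name=%(program_name)s_%(process_num)02d
autostart=true
autorestart=true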

These errors are showing in the log:
Symfony\Component\Messenger\Exception\TransportException
Could not acknowledge redis message "1584889235265-0".

Steps I have tried to fix the problem:

I am having trouble setting this up in the Mautic environment. Has anyone managed to do it?

Here are the docs describing how env vars can be used in Symfony's messenger.yaml config file (which is absent in a Mautic setup) to achieve Redis concurrency:

I would like to understand how to apply this approach to the Mautic use case.

To clarify: the "Could not acknowledge redis message" error appears only when numprocs=2 is set in supervisorctl (i.e. when several instances of the worker run).
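My current guess (an assumption, not verified) is that both supervisor processes join the Redis stream's consumer group under the same consumer name, so two workers can claim and try to acknowledge the same message. Symfony's Redis transport accepts a `consumer` option in the DSN, so giving each process its own name might avoid the clash ("worker-1" below is an example name):

```shell
# Sketch: give each worker process its own consumer name inside the Redis
# stream's consumer group ("worker-1" is an example, not from my setup).
export MESSENGER_TRANSPORT_DSN="redis://localhost:6379/messages?consumer=worker-1"
```

With supervisor, the name could presumably be derived per process via the environment= key and %(process_num)d expansion.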

After struggling with RAM usage on my server (64 GB RAM plus 120 GB of swap couldn't cope with the Mautic spool on Redis!),
I implemented a bash script with a backpressure algorithm, which I think is a better approach than cron, because the spool-generation script and the worker (which actually sends emails via SES) adapt to each other: 1000 emails are generated, then the script waits for the spool email count to drop before queueing the next chunk.

Attention: this script requires a small patch to /app/bundles/ChannelBundle/Command/SendChannelBroadcastCommand.php (because I couldn't find an easy way to use SQL to retrieve the "pending" email count for a particular email item):

        $table->render();

        // New lines which allow awk to extract the number of emails put into the spool.
        // @TODO: find a console command that reports the "pending" email count and replace this!
        $output->writeln('');
        $output->writeln('emails_sent:' . $counts['success']);

        $this->completeRun();
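To show what the patch buys us: the awk pattern used in send.sh pulls the integer after the emails_sent: marker out of the command's output. A quick standalone check (the sample text and the number 1000 are made up for the demo):

```shell
# Simulate the command output and extract the count the same way send.sh does.
output=$'+----+------+\n| .. | .... |\n\nemails_sent: 1000'
emails_sent=$(echo "$output" | awk -F ':' '/^emails_sent/{gsub(/ /, "", $2); print $2}')
echo "$emails_sent"   # prints 1000
```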

The sending script:

#!/bin/bash

# This is the email ID from [mautic]/s/emails: make sure the email is published and its publish date is in the past!
val1=1246

while true; do
    # Run the PHP command directly and stream its output, including both stdout and stderr
    output=$(sudo -uwww-data -s -- php -d memory_limit=-1 /var/www/mtc/bin/console mautic:broadcasts:send --batch=50 --limit=1000 --id=$val1 2>&1 | tee /dev/tty)
    php_exit_status=${PIPESTATUS[0]}  # Capture the exit status of the PHP command

    # Check if PHP command execution was successful
    if [ $php_exit_status -ne 0 ]; then
        echo "PHP command exited with error code $php_exit_status"
        exit $php_exit_status
    fi

    # Extract the number of emails sent from the output
    emails_sent=$(echo "$output" | awk -F ':' '/^emails_sent/{gsub(/ /, "", $2); print $2}')

    # Print debug
    echo "[$(date --iso-8601=seconds)] emails_put_to_queue: $emails_sent"

    # Exit successfully (status 0) if emails_sent is undefined or <= 0
    if [ -z "$emails_sent" ] || [ "$emails_sent" -le 0 ]; then
        echo "Exiting with success: emails_sent is undefined or <= 0"
        exit 0
    fi

    while true; do
        # Get the length of the "messages" Redis stream
        len=$(redis-cli --raw XLEN messages)

        echo "[$(date --iso-8601=seconds)] emails_in_queue: $len"

        # Break the loop if the stream length is less than 50
        if [ "$len" -lt 50 ]; then
            break
        fi

        # Wait for 30 seconds before checking again
        sleep 30
    done
done

Usage:

I put this script into the Mautic folder as send.sh and run it via tmux with bash ./send.sh (or let members of my team execute it via our Jenkins pipeline). I don't recommend running it directly over SSH on a remote server for 100K+ subscribers, because the SSH connection tends to break…

I managed to achieve ~26K emails delivered per hour via AWS SES with this script (with a single worker thread), which is a nice improvement over the ~12K emails per hour we used to get from older naive scripts, which occasionally spammed our /spool directory (Mautic 3) and regularly brought our Ubuntu server to its knees…
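For context, rough math on the remaining headroom (numbers from above, integer arithmetic):

```shell
# ~26K emails/hour with one worker is ~7 emails/second; the SES quota
# mentioned earlier is 100+/second, so the worker, not SES, is the bottleneck.
echo "$((26000 / 3600)) emails/sec with one worker"
```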

I would like to note that the script targets Mautic 5 with the Redis email queue.

If I manage to set up the Redis consumer to run concurrently in 3 processes via supervisorctl, I expect another performance increase.

Wow, not a lot of responses here. I wonder if anybody from the Mautic team even reads these forums from time to time?..

I have finally managed to achieve a 50K emails per hour rate today, after more tuning and switching to the Doctrine transport for Messenger (instead of Redis). This is a huge boost compared to where I started.

Wow, can you describe what you did? Make some kind of tutorial? Thanks!

Sure, compared to my first post in this thread, here is what changed:

  1. I installed the Symfony Messenger Doctrine transport
    (Messenger: Sync & Queued Message Handling, Symfony 5.x Docs)

  2. I changed Mautic's queue mail settings to Doctrine

  3. I set supervisorctl to run the consume command in 2 processes

  4. I changed my send.sh script to check the Doctrine queue instead of the Redis queue:

#!/bin/bash

# Put this script in the Mautic project root and run it to send a single email.
# Pass the email ID via --id=xxx. This is the email ID from [mautic]/s/emails:
# make sure the email is published and its publish date is in the past!


val1=""

# Parse command line arguments for --id
for arg in "$@"
do
    case $arg in
        --id=*)
        val1="${arg#*=}"
        shift # Remove --id= from processing
        ;;
    esac
done

# Check if val1 (email ID) is not set and exit if required
if [ -z "$val1" ]; then
    echo "id required, pass as --id=333"
    exit 1
fi

echo "running for email id: $val1"

while true; do
    # Run the PHP command directly and stream its output, including both stdout and stderr
    output=$( php -d memory_limit=-1 /var/www/mtc/bin/console mautic:broadcasts:send --batch=30 --limit=2000 --id=$val1 2>&1 | tee /dev/tty)
    php_exit_status=${PIPESTATUS[0]}  # Capture the exit status of the PHP command

    # Check if PHP command execution was successful
    if [ $php_exit_status -ne 0 ]; then
        echo "PHP command exited with error code $php_exit_status"
        exit $php_exit_status
    fi

    # Extract the number of emails sent from the output
    emails_sent=$(echo "$output" | awk -F ':' '/^emails_sent/{gsub(/ /, "", $2); print $2}')

    # Print debug
    echo "[$(date --iso-8601=seconds)] emails_put_to_queue: $emails_sent"

    # Exit successfully (status 0) if emails_sent is undefined or <= 0
    if [ -z "$emails_sent" ] || [ "$emails_sent" -le 0 ]; then
        echo "Exiting with success: emails_sent is undefined or <= 0"
        exit 0
    fi

    while true; do
        # Get the number of queued messages from the Doctrine transport table
        len=$(./bin/console doctrine:query:sql --no-ansi -- "SELECT count(*) from messenger_messages" | awk '/[0-9]+/{print $1; exit}')

        echo "[$(date --iso-8601=seconds)] emails_in_queue: $len"

        # Break the loop if the queue length is less than 50
        if [ "$len" -lt 50 ]; then
            break
        fi

        # Wait for 30 seconds before checking again
        sleep 30
    done
done

So now I run the script like this:
bash ./send.sh --id=xxxx (from my tmux session or our team's Jenkins instance)
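Under the hood, the Doctrine transport stores queued messages in the messenger_messages table of the Mautic database (that is what the SQL count in the script reads). For reference, the transport DSN shape (an assumption based on the standard Symfony Messenger Doctrine transport, not a Mautic-specific setting) is:

```shell
# Sketch: Doctrine transport DSN ("default" is the Doctrine connection name).
export MESSENGER_TRANSPORT_DSN="doctrine://default"
```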

Surprisingly, the Doctrine queue works noticeably faster than Redis, even in a single thread. I think the reason is that our emails have a lot of links and Mautic uses a very odd approach to generating email links, so each rendered email put in the queue is huge, I mean, HUGE: one entry can take up to megabytes, which is a very, very large serialized object when we talk about a newsletter for 1M recipients. Redis chokes on the writes and reads; MySQL 8 + Doctrine handles this better.
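A quick way to sanity-check that claim against your own queue (a sketch; "body" is the default Messenger column name, adjust if your schema differs):

```sql
-- Inspect how large queued message bodies actually are.
SELECT COUNT(*)          AS queued,
       AVG(LENGTH(body)) AS avg_bytes,
       MAX(LENGTH(body)) AS max_bytes
FROM messenger_messages;
```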
