After struggling with RAM usage on my server (64GB of RAM plus 120GB of swap couldn't cope with the Mautic spool on Redis!), I implemented a bash script with a backpressure algorithm. I think this is potentially a better approach than cron, because the spool-generation step and the worker (which actually sends the emails via SES) adapt to each other: 1000 emails are put into the spool, and the script then watches the spooled count drain (below 50 in my case) before generating the next chunk.
Attention: this script requires a small patch to /app/bundles/ChannelBundle/Command/SendChannelBroadcastCommand.php (because I couldn't find an easy way to use SQL to retrieve the "pending" email count for a particular email item):
$table->render();
// New lines that let awk extract the number of emails put into the spool.
// @TODO: find a console command that returns the "pending" email count for an email and replace this!
$output->writeln('');
$output->writeln('emails_sent:' . $counts['success']);
$this->completeRun();
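For illustration, here is how that extra output line is consumed. Assuming the patched command finishes its run by printing something like emails_sent:1000 after the results table, the awk filter used later in the sending script reduces the whole output to the bare number (the sample output below is invented):

# Illustrative test of the extraction logic; the sample output is made up
sample_output=$'... results table ...\n\nemails_sent:1000'
echo "$sample_output" | awk -F ':' '/^emails_sent/{gsub(/ /, "", $2); print $2}'
# prints: 1000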
The sending script:
#!/bin/bash
set -o pipefail  # so the PHP exit status survives the pipe into tee

# This is the email ID from [mautic]/s/emails: make sure the email is published and its publish date is in the past!
val1=1246

while true; do
    # Run the PHP command and stream its output (stdout and stderr) to the terminal while capturing it
    output=$(sudo -u www-data -s -- php -d memory_limit=-1 /var/www/mtc/bin/console mautic:broadcasts:send --batch=50 --limit=1000 --id=$val1 2>&1 | tee /dev/tty)
    php_exit_status=$?  # with pipefail, this is the PHP command's exit status, not tee's

    # Abort if the PHP command failed
    if [ $php_exit_status -ne 0 ]; then
        echo "PHP command exited with error code $php_exit_status"
        exit $php_exit_status
    fi

    # Extract the number of emails put into the spool from the patched command output
    emails_sent=$(echo "$output" | awk -F ':' '/^emails_sent/{gsub(/ /, "", $2); print $2}')

    # Print debug info
    echo "[$(date --iso-8601=seconds)] emails_put_to_queue: $emails_sent"

    # Exit successfully once emails_sent is empty, 0, or negative (nothing left to send)
    if [ -z "$emails_sent" ] || [ "$emails_sent" -le 0 ]; then
        echo "Exiting with success because emails_sent is undefined, equal to 0, or less than 0"
        exit 0
    fi

    # Backpressure: wait until the Redis queue drains before generating the next chunk
    while true; do
        # Get the length of the "messages" stream (the Messenger email queue)
        len=$(redis-cli --raw XLEN messages)
        echo "[$(date --iso-8601=seconds)] emails_in_queue: $len"

        # Move on once fewer than 50 messages remain in the stream
        if [ "$len" -lt 50 ]; then
            break
        fi

        # Wait 30 seconds before checking again
        sleep 30
    done
done
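One assumption worth calling out: the XLEN messages check relies on the Symfony Messenger Redis transport's default stream name (messages). If your transport DSN sets a different stream, adjust the key accordingly; a quick way to sanity-check it with redis-cli:

# Confirm which key the email queue lives under ("messages" is the Redis
# transport default unless your DSN overrides it - an assumption here)
redis-cli TYPE messages   # should print "stream"
redis-cli XLEN messages   # should grow while the broadcast command is spooling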
Usage:
I put this script into the Mautic folder as a send.sh file and run it via tmux as bash ./send.sh (or let members of my team execute it via our Jenkins pipeline). I don't recommend running it directly in a plain SSH session for 100K+ subscribers, because the SSH connection tends to break…
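For example, a detached tmux session keeps the run alive even if the SSH connection drops (the session name and script path below are just examples):

# Start the sender in a detached tmux session so it survives SSH disconnects
tmux new-session -d -s mautic-send 'bash /var/www/mtc/send.sh'
# Re-attach later to watch progress
tmux attach -t mautic-send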
With this script I achieved ~26K emails delivered per hour via AWS SES (with a single worker thread), a nice improvement over the ~12K per hour we used to get from older naive scripts that occasionally flooded our /spool directory (Mautic 3) and regularly brought our Ubuntu server to its knees…
Note that the script targets Mautic 5 with the Redis email queue.
If I manage to set up three concurrent Redis queue consumers via supervisorctl, I think I would get another performance increase.
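A minimal sketch of what that might look like with supervisor, assuming Mautic 5's Messenger email transport is consumed via messenger:consume email (the transport name, paths, and limits are assumptions and may differ in your setup):

[program:mautic-email-consumer]
; three identical consumer processes draining the Redis email queue
command=php /var/www/mtc/bin/console messenger:consume email --time-limit=3600 --memory-limit=256M
user=www-data
numprocs=3
process_name=%(program_name)s_%(process_num)02d
autostart=true
autorestart=true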