Mautic Community Forums

Best way to import 4+ million emails into mautic

I have a list of over 4 million contacts I’d like to get into Mautic. I’m thinking the best way is to just connect directly to the database and import them into the leads table. Would this be the best approach or is there a better way?



I would try it with the Mautic API: GitHub - mautic/api-library: Mautic API Library
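If you go the API route, it helps to send contacts in batches rather than one request per contact. A minimal sketch, assuming basic-auth credentials and the `/api/contacts/batch/new` endpoint from the Mautic REST API documentation; the instance URL and credentials below are placeholders:

```python
# Sketch: pushing contacts to Mautic's REST API in batches instead of one
# HTTP call per contact. The /api/contacts/batch/new endpoint is from the
# Mautic REST API docs; URL and credentials here are hypothetical.
import base64
import json
import urllib.request

MAUTIC_URL = "https://mautic.example.com"  # hypothetical instance URL
AUTH_HEADER = "Basic " + base64.b64encode(b"api_user:api_password").decode()

def batches(contacts, size=200):
    """Yield successive chunks of at most `size` contacts."""
    for i in range(0, len(contacts), size):
        yield contacts[i:i + size]

def push_batch(chunk):
    """POST one batch of contacts (dicts keyed by Mautic field alias)."""
    req = urllib.request.Request(
        f"{MAUTIC_URL}/api/contacts/batch/new",
        data=json.dumps(chunk).encode(),
        headers={"Content-Type": "application/json",
                 "Authorization": AUTH_HEADER},
        method="POST",
    )
    return urllib.request.urlopen(req, timeout=60)  # network call
```

With 4 million rows you would still be looking at roughly 20,000 requests at 200 contacts per batch, so throttling and retry handling matter more than the happy path shown here.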

There are two options here.

  1. Break the file down into smaller files and use the custom import (there is a plugin for this that will do everything in the background).
  2. As you said, insert directly into the leads table. I am, however, unsure whether any other indexes need to be updated when a lead is created - but I am very interested to hear your experience once you have done it, as we often have this issue of onboarding huge amounts of data.
    This usually requires changing php.ini to allow for larger file uploads, plus a fairly strong server with enough CPU, as MySQL ultimately dies; you also need to monitor the job and keep re-running the cron.
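For reference, these are the php.ini limits that typically matter for large CSV uploads, plus a cron entry to keep the background import running. The values and paths are examples only, not tuned recommendations:

```shell
# php.ini - raise upload limits for big CSV files (example values):
#   upload_max_filesize = 512M
#   post_max_size = 512M
#   memory_limit = 1G
#   max_execution_time = 300

# crontab - keep re-running the background import every 5 minutes.
# bin/console mautic:import processes queued imports in the background;
# --limit caps the number of rows handled per run (path is an example).
*/5 * * * * php /var/www/mautic/bin/console mautic:import --limit=500
```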

In my experience the regular import is faster than the API.
I would go with a regular CSV import.
The question is: how long will it take to build a segment from 4 million contacts, and what hardware will support a database of this size?

What plugin are you referring to?

You guys have some interesting points.

@joeyk I wasn’t thinking about the segment building. Is there a way to monitor that other than hitting the segments page and refreshing?

Custom import is the plugin I use, as mentioned by @mikew . I have used it dozens of times to import 80-100 million contacts into a single instance with 60 data points per line. I run parallel imports on files split into 500 lines each; in my experience smaller files are faster to process than larger ones.
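The splitting step above can be sketched with a small script that chunks a CSV into fixed-size parts while repeating the header row in each, so every part is independently importable. The 500-row size and file names just mirror this workflow:

```python
# Split a large CSV into fixed-size chunks, repeating the header row in
# each part so every file can be imported on its own. Paths are examples.
import csv
import itertools

def split_csv(src, rows_per_file=500, prefix="part"):
    """Write src into part_000.csv, part_001.csv, ...; return the names."""
    written = []
    with open(src, newline="") as fh:
        reader = csv.reader(fh)
        header = next(reader)
        for n in itertools.count():
            chunk = list(itertools.islice(reader, rows_per_file))
            if not chunk:
                break
            name = f"{prefix}_{n:03d}.csv"
            with open(name, "w", newline="") as out:
                writer = csv.writer(out)
                writer.writerow(header)   # header repeated in every part
                writer.writerows(chunk)
            written.append(name)
    return written
```

On Linux, `split -l` does much the same thing, minus the repeated header line.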

In the real world, 5 parallel imports give me 15-28 rows per second, according to Mautic's generated import history performance numbers.

What kind of hardware is your Mautic instance running on? Has the database server been tuned?


Run the segment-building command and see how it performs: mautic:segments:update
You need a very well-tuned DB server to serve that amount.

You can run segments:check-builders to see query times on segments without actually running them

Compare output of query builders for given segments

mautic:segments:check-builders [options]

  -i, --segment-id[=SEGMENT-ID]  Set the ID of the segment to process
      --skip-old                 Skip the old query builder
      --bypass-locking           Bypass locking.
  -t, --timeout=TIMEOUT          If getmypid() is disabled on this system, lock files will be used. This option will assume the process is dead after the specified number of seconds and will execute anyway. This is disabled by default.
  -x, --lock_mode=LOCK_MODE      Allowed values are "pid" or "flock". By default, lock will try pid; if not available it will use the file system [default: "pid"]
  -f, --force                    Deprecated; use --bypass-locking instead.
  -h, --help                     Display this help message
  -q, --quiet                    Do not output any message
  -V, --version                  Display this application version
      --ansi                     Force ANSI output
      --no-ansi                  Disable ANSI output
  -n, --no-interaction           Do not ask any interactive question
  -e, --env=ENV                  The environment name [default: "prod"]
      --no-debug                 Switches off debug mode.
  -v|vv|vvv, --verbose           Increase the verbosity of messages: 1 for normal output, 2 for more verbose output and 3 for debug

Thanks for the info, will keep it in mind for the future.