Question

I have online software that sends emails through Amazon SES. Currently I have a cron job that sends the messages over SMTP with PHPMailer. I have to cap the send rate at around 300 per minute to make sure my server doesn't time out. We're seeing growth, and eventually I'd like to send out 10,000 or more.

Is there a better way to send to Amazon SES, or is this what everyone else does, but with just more servers running the workload?

Thanks in advance!

Solution

You can try using the AWS SDK for PHP. You can send emails through the SES API, and the SDK allows you to send multiple emails in parallel. Here is a code sample (untested and only partially complete) to get you started.

<?php

require 'vendor/autoload.php';

use Aws\Ses\SesClient;
use Guzzle\Service\Exception\CommandTransferException;

$ses = SesClient::factory(/* ...credentials... */);

$emails = array();
// @TODO SOME SORT OF LOGIC THAT POPULATES THE ABOVE ARRAY

$emailBatch = new SplQueue();
$emailBatch->setIteratorMode(SplQueue::IT_MODE_DELETE);

// Generate a SendEmail command for each message and queue it once
foreach ($emails as $email) {
    $emailCommand = $ses->getCommand('SendEmail', array(
        // GENERATE COMMAND PARAMS FROM THE $email DATA
    ));
    $emailBatch->enqueue($emailCommand);
}

// Keep going until every command has been sent successfully
while (!$emailBatch->isEmpty()) {
    try {
        // Send the batch in parallel; IT_MODE_DELETE empties the queue as it is iterated
        $successfulCommands = $ses->execute(iterator_to_array($emailBatch));
    } catch (CommandTransferException $e) {
        $successfulCommands = $e->getSuccessfulCommands();
        // Requeue failed commands so they are retried on the next pass
        foreach ($e->getFailedCommands() as $failedCommand) {
            $emailBatch->enqueue($failedCommand);
        }
    }

    foreach ($successfulCommands as $command) {
        echo 'Sent message: ' . $command->getResult()->get('MessageId') . "\n";
    }
}

(The code above is also licensed under version 2.0 of the Apache License.)

You could also look into using the Guzzle BatchBuilder and friends to make it more robust.
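For reference, here is a rough sketch of what that might look like with the Guzzle 3 batch API bundled with SDK v2; the batch size of 10 is an arbitrary assumption, and $ses and $emails refer to the variables from the snippet above:

use Guzzle\Batch\BatchBuilder;

// Build a batch that transfers SES commands in parallel, 10 at a time,
// and flushes itself automatically once 10 commands are queued.
$batch = BatchBuilder::factory()
    ->transferCommands(10)
    ->autoFlushAt(10)
    ->build();

foreach ($emails as $email) {
    $batch->add($ses->getCommand('SendEmail', array(
        // GENERATE COMMAND PARAMS FROM THE $email DATA
    )));
}

// Send whatever is still queued.
$batch->flush();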

There are a lot of things you will need to fine tune with this code, but you may be able to achieve higher throughput of emails.

Other tips

If anyone is looking for this answer, it's outdated; you can find the new documentation here: https://docs.aws.amazon.com/aws-sdk-php/v3/guide/guide/commands.html

use Aws\S3\S3Client;
use Aws\CommandPool;

// Create the client.
$client = new S3Client([
    'region'  => 'us-standard',
    'version' => '2006-03-01'
]);

$bucket = 'example';
$commands = [
    $client->getCommand('HeadObject', ['Bucket' => $bucket, 'Key' => 'a']),
    $client->getCommand('HeadObject', ['Bucket' => $bucket, 'Key' => 'b']),
    $client->getCommand('HeadObject', ['Bucket' => $bucket, 'Key' => 'c'])
];

$pool = new CommandPool($client, $commands);

// Initiate the pool transfers
$promise = $pool->promise();

// Force the pool to complete synchronously
$promise->wait();

The same thing can be done for SES commands. As a rough sketch of the same pool pattern applied to SendEmail (the region, sender address, $recipients array, and concurrency value below are placeholder assumptions, not anything from the original post):
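use Aws\Ses\SesClient;
use Aws\CommandPool;
use Aws\ResultInterface;
use Aws\Exception\AwsException;

// Create the SES client (region is a placeholder).
$ses = new SesClient([
    'region'  => 'us-east-1',
    'version' => '2010-12-01'
]);

// One SendEmail command per recipient ($recipients is assumed to be populated elsewhere).
$commands = [];
foreach ($recipients as $recipient) {
    $commands[] = $ses->getCommand('SendEmail', [
        'Source'      => 'sender@example.com',
        'Destination' => ['ToAddresses' => [$recipient]],
        'Message'     => [
            'Subject' => ['Data' => 'Hello'],
            'Body'    => ['Text' => ['Data' => 'Hello from SES']]
        ]
    ]);
}

$pool = new CommandPool($ses, $commands, [
    // How many commands to send concurrently
    'concurrency' => 10,
    // Called once per successful command
    'fulfilled' => function (ResultInterface $result) {
        echo 'Sent message: ' . $result['MessageId'] . "\n";
    },
    // Called once per failed command
    'rejected' => function (AwsException $reason) {
        echo 'Failed: ' . $reason->getMessage() . "\n";
    }
]);

// Initiate the transfers and wait for them all to finish
$pool->promise()->wait();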

Thank you for your answer. It was a good starting point. @Jeremy Lindblom

My problem now is that I can't get the error handling to work. The catch() block works fine, and inside of it

$successfulCommands

returns all the successful responses with status codes, but only if an error occurs, for example an "unverified address" in sandbox mode. That is how a catch() should work. :)

Inside the try block, $successfulCommands only returns:

SplQueue Object
(
    [flags:SplDoublyLinkedList:private] => 1
    [dllist:SplDoublyLinkedList:private] => Array
    (
    )
)

I can't figure out how to get the real response from Amazon with status codes etc.

License: CC-BY-SA with attribution