
How to use AWS Simple Email Service (SES) from TypeScript on NodeJS

In our application flows we use AWS Simple Email Service (SES) to send emails to our users. Since the documentation and examples of AWS SES are not that clear, it can take some trial and error to figure out which parameters are mandatory and, more importantly, which are obsolete. The AWS examples (see the reference below) crash if you actually configure the parameters with empty strings or null as suggested there.

The TypeScript code below initiates an AWS SES connection and uses an email template generated earlier with AWS CloudFormation (see the example further down). I found it quite surprising that you can actually refer to the template by name instead of by ARN only.

import * as AWS from 'aws-sdk';
import * as https from 'https';

const ses = new AWS.SES({
    httpOptions: {
        agent: new https.Agent({
            keepAlive: true
        })
    }
});

/**
 * Send email through AWS SES Templates
 */
export async function sendMail(email: string, name: string): Promise<string> {

    try {
        // Create SES sendTemplatedEmail templateData content
        const templatedata = {
            parameter_name: name
        };
        // console.debug(`sendMail templatedata: ${JSON.stringify(templatedata)}`);

        // Create SES sendTemplatedEmail full message
        const params = {
            Destination: {
                ToAddresses: [ email ]
            },
            Source: 'noreply@terra10.io',
            Template: 'myFanceTerra10EmailTemplate',
            TemplateData: JSON.stringify(templatedata)
        };
        // console.debug(`sendMail param: ${JSON.stringify(params)}`);

        const sesResponse = await ses.sendTemplatedEmail(params).promise();
        console.log(`sendMail requestId: ${sesResponse.$response.requestId} and messageId: ${sesResponse.MessageId}`);
        return 'OK';
    } catch (e) {
        console.error(`sendMail unexpected: ${e.message}`);
        return 'something fancy error handling';
    }
}
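
For completeness, here is a minimal sketch of how sendMail could be wired into a Lambda handler. The module path and the event body fields (email, name) are assumptions for illustration only:

import { APIGatewayProxyEvent, APIGatewayProxyResult } from 'aws-lambda';
import { sendMail } from './sendMail'; // hypothetical module name for the function above

export async function handler(event: APIGatewayProxyEvent): Promise<APIGatewayProxyResult> {
    // Assumption: the caller posts a JSON body containing email and name
    const { email, name } = JSON.parse(event.body || '{}');
    const result = await sendMail(email, name);
    return {
        statusCode: result === 'OK' ? 200 : 500,
        body: JSON.stringify({ result })
    };
}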

Here is an example AWS CloudFormation resource for the AWS SES email template. You can use the always handy Fn::Sub function to avoid complex character escaping or unreadable one-line HTML. I love it for EC2 UserData and for stuff like this:

Resources:
  SesTemplateTerra10:
    Type: AWS::SES::Template
    Properties:
      Template:
        TemplateName: myFanceTerra10EmailTemplate
        SubjectPart: My Subject
#       TextPart: "Nobody uses this anymore right ???"
        HtmlPart:
          Fn::Sub: |
            <img src="http://terra10.nl/img/logo.png">
            <h1>Sir/Madam {{parameter_name}},</h1>
            <p>Ho ya doin ?</p>
            <p>cheerio,</p>
            <strong>the T10 crew</strong>

In case you use the Serverless Framework (you should) for your serverless deployments, the following IAM statement is necessary in your serverless.yaml; a sketch of where it goes in the file follows below. This allows your Lambda function to use the email template at runtime. In our case the domain is hosted on AWS Route53 as well, which saves you some trouble.

- Effect: Allow
  Action:
  - ses:SendTemplatedEmail
  Resource:
  - "arn:aws:ses:eu-west-1:*:identity/terra10.io"

Hope it helps!

References

  • Sending Email Using Amazon SES

Using AWS Key Management (KMS) to encrypt and decrypt in AWS Lambda (NodeJS)

AWS Key Management Service (KMS) is a fully managed service that makes it easy to create and control encryption keys on AWS, which can then be used to encrypt and decrypt data in a safe manner. The service leverages Hardware Security Modules (HSMs) under the hood, which in turn guarantees the security and integrity of the generated keys.

You can enable AWS KMS to encrypt your data at rest on many AWS storage solutions (like DynamoDB and EBS). However, in our Lambda functions we would like to encrypt (and decrypt) certain values at runtime, so we needed some code to do exactly that. It took me a while to figure it out, so here it is.

Here is the function to encrypt any string with an AWS KMS key. You can create a KMS key through AWS IAM in the console or (much better) with AWS CloudFormation.
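
As a rough sketch (not our exact key), such a key could look like this in CloudFormation; the key policy below is deliberately minimal:

Resources:
  KmsKeyTerra10:
    Type: AWS::KMS::Key
    Properties:
      Description: Key for encrypting values at runtime in Lambda
      KeyPolicy:
        Version: "2012-10-17"
        Statement:
          - Sid: AllowAccountAdministration
            Effect: Allow
            Principal:
              AWS: !Sub "arn:aws:iam::${AWS::AccountId}:root"
            Action: "kms:*"
            Resource: "*"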

import * as AWS from 'aws-sdk';

const kmsClient = new AWS.KMS({region: 'eu-west-1'});

/**
 * Encrypt
 */
async function encryptString(text: string): Promise<string> {

    const paramsEncrypt = {
        KeyId: 'arn:aws:kms:eu-west-1:........',
        Plaintext: Buffer.from(text)
    };

    const encryptResult = await kmsClient.encrypt(paramsEncrypt).promise();
    // The encrypted plaintext. When you use the HTTP API or the AWS CLI, the value is Base64-encoded. Otherwise, it is not encoded.
    if (Buffer.isBuffer(encryptResult.CiphertextBlob)) {
        return Buffer.from(encryptResult.CiphertextBlob).toString('base64');
    } else {
        throw new Error('Mayday Mayday');
    }
}

We can use the result (an encrypted, base64-encoded string) to store it in DynamoDB or Aurora, or send it wherever we want. Eventually we will want to decrypt it again as well. So …

/**
 * Decrypt
 */
async function decryptEncodedstring(encoded: string): Promise<string> {

    const paramsDecrypt: AWS.KMS.DecryptRequest = {
        CiphertextBlob: Buffer.from(encoded, 'base64')
    };

    const decryptResult = await kmsClient.decrypt(paramsDecrypt).promise();
    if (Buffer.isBuffer(decryptResult.Plaintext)) {
        return Buffer.from(decryptResult.Plaintext).toString();
    } else {
        throw new Error('We have a problem');
    }
}
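
A quick round trip of the two functions above could look like this; the secret value is just an example:

// Round-trip sketch: encrypt a value, then decrypt it again
async function roundTripExample() {
    const encrypted = await encryptString('my secret value');
    console.log(`encrypted (base64): ${encrypted}`);

    const decrypted = await decryptEncodedstring(encrypted);
    console.log(`decrypted: ${decrypted}`); // should print 'my secret value'
}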

Hope it helps!

References

  • Example code including test function available in my Github as well

 

 

How to set up unit testing for AWS Lambda serverless functions (on NodeJS)?

We use AWS Lambda serverless functions combined with TypeScript and NodeJS, which results in an extremely powerful developer toolset. Because functions contain isolated logic, they are ideal for automated unit testing in CI/CD pipelines. After looking at our options we decided to combine the features of mocha, chai and nock, which resulted in a very easy and powerful solution for unit testing.

I’m sharing this after a chat at a meetup where the use of EKS instead of Lambda was advocated, even for really simple functions, because serverless would be hard to isolate (run locally) and hard to set up unit testing for. I beg to differ.

So let’s go …

Our example function

is a simple function for retrieving a single record from an AWS DynamoDB table:

import { APIGatewayProxyEvent, Callback, Context } from 'aws-lambda';

// Handler for the serverless framework
export async function handler(event: APIGatewayProxyEvent, _context: Context, callback: Callback) {
    try {
        callback(undefined, await getRecord(event));
    } catch (err) {
        callback(err);
    }
}

// Main logic
export async function getRecord(event: APIGatewayProxyEvent) {
    ....  // setup of documentClient and headers omitted
    const id = '1'; // pointless, but good enough for this example
    const queryParams = { TableName: process.env.dynamotable, Key: { id } };
    const result = await documentClient.get(queryParams).promise();
    if (result.Item) {
        return { statusCode: 200, headers, body: JSON.stringify(result.Item) };
    } else {
        return { statusCode: 404, headers, body: undefined };
    }
}

So what happens here?

  • We deliberately split the code into a handler (for the serverless framework) and the main logic
  • We need to export the main logic for use in our unit tests (and local development)

Using Mocha & Nock

Since we are running on Node, we can use both MochaJS and nock for our unit testing. After setting up the specification file (.spec) for our simple function, we first run the test with the nock recorder enabled:

import { APIGatewayProxyEvent } from 'aws-lambda';
import { expect } from 'chai';
import * as nock from 'nock';
import { getRecord } from './getRecord';

process.env.dynamotable = 'myTable';

describe('getRecord', () => {

    it('UT001 - getRecord with valid response', async() => {
        
        nock.recorder.rec();
        
        // Dummy event: cast it, since we only fill the fields our function actually needs
        const event = {
            body: '',
            headers: {},
            httpMethod: 'GET',
            isBase64Encoded: false,
            path: '',
            pathParameters: {},
            queryStringParameters: undefined,
            stageVariables: {},
            requestContext: {},
            resource: '' } as unknown as APIGatewayProxyEvent;

        const response = await getRecord(event);
        expect(response.statusCode).to.equal(200);
    });

});

So what happened?

  • We set the environment variables (like the DynamoDB table name) that are normally provided by AWS Lambda
  • We configure the nock recorder to record the upcoming execution
  • We define a dummy APIGatewayProxyEvent that has some mandatory elements, which we leave mostly empty or undefined
  • We call our AWS Lambda function, and since we isolated the main logic we can call it directly
  • If the AWS profile used on your dev machine has enough IAM grants, the code executes against the real AWS DynamoDB (we use a dedicated user for this to keep it clean)

By running mocha with the nock recorder enabled, we can see the actual call to AWS DynamoDB from our developer machine:

<-- cut here -->

nock('https://dynamodb.eu-west-1.amazonaws.com:443', {"encodedQueryParams":true})
.get('/', {"TableName":"myTable","Key":{"id":{"S":"1"}}})
.reply(200, {Item: {name: {S: 'myName'}, id: {S: '1'}}});
......... (much stuff)

So with nock we actually recorded the HTTPS call to DynamoDB, which we can now easily use to mock the response during unit testing. Next, change the code in the spec file using the info from the nock recorder:

// nock.recorder.rec();
nock('https://dynamodb.eu-west-1.amazonaws.com:443')
    .get('/' )
    .reply(200, {Item: {name: {S: 'myName'}, id: {S: '1'}}});

So what happened?

  • We disabled the recorder, since we don’t need it anymore
  • We set up nock to catch the HTTPS GET call to the DynamoDB endpoint
  • We configured nock to reply with a 200 and the specified Item record (you can also reply with files, see the sketch below)
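
For larger payloads it can be handy to keep the mocked reply in a JSON file instead of inlining it; a rough sketch, where the file path is just an example:

// Sketch: serve the mocked DynamoDB reply from a file (the path is an example)
nock('https://dynamodb.eu-west-1.amazonaws.com:443')
    .get('/')
    .replyWithFile(200, __dirname + '/mocks/getRecord-response.json', {
        'Content-Type': 'application/json'
    });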

Using Chai

With this basic setup we can execute unit tests in our pipeline with Mocha, where nock handles the mocking of the endpoints. With a little Chai magic we can define expectations in our specification file to make sure the message logic of our function behaves properly and the HTTP reply is as expected.

expect(response.statusCode).to.equal(200);
expect(response.headers).to.deep.include({ 'Content-Type': 'application/json' });
expect(JSON.parse(response.body)).to.deep.equal({ name: 'myName', id: '1'});

And there is more

With this setup it’s easy to catch all outbound HTTPS requests and mock different responses (0 records, multiple records, etc.) for extensive unit testing. The possibilities are endless, so hope it helps …
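
For example, a second test case that mocks an empty DynamoDB result to exercise the 404 branch could look roughly like this (assuming the dummy event from UT001 is moved up to the describe scope):

    it('UT002 - getRecord with no record found', async() => {

        // Mock DynamoDB returning an empty result, so getRecord hits its 404 branch
        nock('https://dynamodb.eu-west-1.amazonaws.com:443')
            .get('/')
            .reply(200, {});

        const response = await getRecord(event);
        expect(response.statusCode).to.equal(404);
    });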

 