
How to use AWS Simple Email Service (SES) from TypeScript on NodeJS example

In our application flows we use AWS Simple Email Service (SES) to send emails to our users. Since the documentation and examples for AWS SES are not that clear, it can take some trial and error to figure out which parameters are mandatory and, more importantly, which are obsolete. The AWS examples crash if you actually configure the parameters with empty strings or null as suggested there.

The TypeScript code below initiates an AWS SES connection and uses an email template generated earlier through AWS CloudFormation (see the example further below). I found it quite surprising that you can actually refer to the template by name and not only by ARN.

import * as AWS from 'aws-sdk';
import * as https from 'https';

const ses = new AWS.SES({
    httpOptions: {
        agent: new https.Agent({
            keepAlive: true
        })
    }
});

/**
 * Send email through AWS SES Templates
 */
export async function sendMail(email: string, name: string): Promise<string> {

    try {
        // Create SES sendTemplatedEmail templateData content
        const templatedata = {
            parameter_name: name
        };
        // console.debug(`sendMail templatedata: ${JSON.stringify(templatedata)}`);

        // Create SES sendTemplatedEmail full message
        const params = {
            Destination: {
                ToAddresses: [ email ]
            },
            Source: '', // your verified sender address
            Template: 'myFanceTerra10EmailTemplate',
            TemplateData: JSON.stringify(templatedata)
        };
        // console.debug(`sendMail param: ${JSON.stringify(params)}`);

        const sesResponse = await ses.sendTemplatedEmail(params).promise();
        console.log(`sendMail requestId: ${sesResponse.$response.requestId} and messageId: ${sesResponse.MessageId}`);
        return 'OK';
    } catch (e) {
        console.error(`sendMail unexpected: ${e.message}`);
        return 'something fancy error handling';
    }
}
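Since the TemplateData must match the {{placeholders}} in the template, it can help to preview the substitution locally before sending. A small hedged sketch (renderTemplate is a hypothetical helper, not part of the AWS SDK, and only mimics simple placeholder substitution):

```typescript
// Hypothetical helper (not part of the AWS SDK): renders SES-style
// {{placeholder}} syntax locally so you can preview what a template
// produces for given TemplateData before actually sending.
function renderTemplate(template: string, data: Record<string, string>): string {
    return template.replace(/\{\{(\w+)\}\}/g, (match: string, key: string) =>
        key in data ? data[key] : match);
}

const preview = renderTemplate('<h1>Sir/Madam {{parameter_name}},</h1>', { parameter_name: 'Jan' });
console.log(preview); // <h1>Sir/Madam Jan,</h1>
```

Unknown placeholders are left untouched, which makes a mismatch between template and TemplateData easy to spot in the preview.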

Here is an example AWS CloudFormation resource for the AWS SES email template. You can use the always handy Fn::Sub intrinsic function to prevent complex character escaping or unreadable one-line HTML. I love it for EC2 UserData and for stuff like this:

  SESEmailTemplate:
    Type: AWS::SES::Template
    Properties:
      Template:
        TemplateName: myFanceTerra10EmailTemplate
        SubjectPart: My Subject
        # TextPart: "Nobody uses this anymore right ???"
        HtmlPart:
          Fn::Sub: |
            <img src="">
            <h1>Sir/Madam {{parameter_name}},</h1>
            <p>Ho ya doin ?</p>
            <strong>the T10 crew</strong>

In case you use the Serverless Framework (you should) for serverless deployments, the following snippet is necessary in your serverless.yml. It allows your Lambda function to use the email template at runtime. In our case the domain is hosted on AWS Route53 as well, which saves you some problems.

- Effect: Allow
  Action:
    - ses:SendTemplatedEmail
  Resource:
    - "arn:aws:ses:eu-west-1:*:identity/"

Hope it helps!



Posted by on 26-11-2018 in Uncategorized



How to setup unit testing for AWS Lambda serverless functions (on NodeJS) ?

We use AWS Lambda serverless functions combined with TypeScript and NodeJS, which results in an extremely powerful developer toolset. Because functions contain isolated logic, they are ideal for automated unit testing in CI/CD pipelines. After looking at our options we decided to combine the features of mocha, chai and nock. This resulted in a very easy and powerful solution for unit testing.

I’m sharing this after a chat at a meetup where the use of EKS instead of Lambda, even for really simple functions, was advocated because serverless was supposedly hard to isolate (run locally) and hard to set up unit testing for. I beg to differ.

So let’s go …

Our example function

is a simple function for retrieving a single record from an AWS DynamoDB table

import { APIGatewayProxyEvent, Callback, Context } from 'aws-lambda';
import * as AWS from 'aws-sdk';

const documentClient = new AWS.DynamoDB.DocumentClient();
const headers = { 'Content-Type': 'application/json' };

// Handler for serverless framework
export async function handler(event: APIGatewayProxyEvent, _context: Context, callback: Callback) {
    try {
        callback(undefined, await getRecord(event));
    } catch (err) {
        callback(err);
    }
}

// Main logic
export async function getRecord(event: APIGatewayProxyEvent) {
  const id = '1'; // pointless, but good enough for this example
  const queryParams = { TableName: process.env.dynamotable, Key: { id } };
  const result = await documentClient.get(queryParams).promise();
  if (result.Item) {
    return { statusCode: 200, headers, body: JSON.stringify(result.Item) };
  } else {
    return { statusCode: 404, headers, body: undefined };
  }
}

So what happens here ?

  • We deliberately split the logic between handler (for the serverless framework) and the main logic
  • We need to export the main logic for use in our unit tests (and local development)
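The split can be sketched in isolation (hypothetical names, no AWS involved), which is exactly what makes the logic callable from a unit test or a local script:

```typescript
// Minimal sketch of the handler/logic split, with hypothetical names and no
// AWS calls: the logic returns a plain response object, the handler only
// adapts it to the Lambda callback contract.
type Response = { statusCode: number; body?: string };

export async function logic(id?: string): Promise<Response> {
    return id
        ? { statusCode: 200, body: JSON.stringify({ id }) }
        : { statusCode: 404 };
}

// Handler: only runtime plumbing, nothing worth unit testing here.
export async function handler(
        event: { id?: string },
        callback: (err?: Error, res?: Response) => void) {
    try {
        callback(undefined, await logic(event.id));
    } catch (err) {
        callback(err as Error);
    }
}

// A unit test (or local run) calls logic() directly:
logic('42').then(r => console.log(r.statusCode)); // 200
```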

Using Mocha & Nock

Since we are running Node we can use both mochaJS and nock for our unit testing. Setting up the specification file (.spec) for our simple function, we first run the test with the nock recorder enabled.

import { APIGatewayProxyEvent } from 'aws-lambda';
import { expect } from 'chai';
import * as nock from 'nock';
import { getRecord } from './getRecord';

process.env.dynamotable = 'myTable';
nock.recorder.rec();

describe('getRecord', () => {

    it('UT001 - getRecord with valid response', async () => {
        const event: APIGatewayProxyEvent = {
            body: '',
            headers: {},
            httpMethod: 'GET',
            isBase64Encoded: false,
            path: '',
            pathParameters: {},
            queryStringParameters: undefined,
            stageVariables: {},
            requestContext: {},
            resource: '' } as any;

        const response = await getRecord(event);
    });
});


So what happened ?

  • We set the environment variables (like the DynamoDB table) which normally is done by AWS Lambda
  • We configure nock.recorder for auditing the upcoming execution
  • We define a dummy APIGatewayProxyEvent which has some mandatory elements which we leave mostly empty or undefined
  • We define the call to our AWS Lambda function and since we isolated the main logic we can call this directly
  • If your AWS profile used on your dev machine has enough IAM grants the code can execute against AWS DynamoDB (we use a special user for this to keep it clean)

By running mocha with the nock recorder enabled we can see the actual callout to AWS DynamoDB from our developer machine:

<-- cut here -->

nock('', {"encodedQueryParams":true})
.get('/', {"TableName":"myTable","Key":{"id":{"S":"1"}}})
.reply(200, {Item: {name: {S: 'myName'}, id: {S: '1'}}});
......... (much stuff)

So with nock we actually recorded the https call to DynamoDB. Which now, we can easily use with nock to mock the response during unit testing. So next change the code in our spec file with the info from nock recorder:

// nock.recorder.rec();
nock('')
    .get('/')
    .reply(200, {Item: {name: {S: 'myName'}, id: {S: '1'}}});

So what happened ?

  • Disabled the recorder, we don’t need it anymore
  • Setup nock to catch the HTTPS GET call to the dynamoDB endpoint
  • Configured nock to reply with a 200 and the specified Item record (you can also use reply with files)
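Note that the recorded reply is in DynamoDB's attribute-value wire format ({S: '1'}), while DocumentClient hands your code plain objects. A rough sketch of the unmarshalling the SDK performs internally (simplified, handles only a few types, and `unmarshall` here is a hypothetical helper):

```typescript
// Simplified sketch of DynamoDB attribute-value unmarshalling; the real work
// happens inside AWS.DynamoDB.DocumentClient. Handles S, N, BOOL and M only.
type AttributeValue = {
    S?: string;
    N?: string;
    BOOL?: boolean;
    M?: { [key: string]: AttributeValue };
};

function unmarshall(av: AttributeValue): any {
    if (av.S !== undefined) return av.S;
    if (av.N !== undefined) return Number(av.N);
    if (av.BOOL !== undefined) return av.BOOL;
    if (av.M !== undefined) {
        const out: { [key: string]: any } = {};
        for (const key of Object.keys(av.M)) out[key] = unmarshall(av.M[key]);
        return out;
    }
    return undefined;
}

// The Item from the recorded nock reply becomes the plain object the tests expect:
console.log(unmarshall({ M: { name: { S: 'myName' }, id: { S: '1' } } }));
// { name: 'myName', id: '1' }
```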

Using Chai

With this basic setup we can execute unit tests in our pipeline with Mocha, where nock handles the mocking of the endpoints. With a little Chai magic we can define expectations in our specification file to make sure all message logic of our function is handled properly and the HTTP reply is as expected.

expect(response.headers).to.deep.include({ 'Content-Type': 'application/json' });
expect(JSON.parse(response.body)).to.deep.equal({ name: 'myName', id: '1'});

And there is more

With this it’s easy to catch all outbound HTTPS requests and mock different responses (0 records, multiple records, etc.) for extensive unit testing. The possibilities are endless, so hope it helps …
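The scenarios only differ in the body you hand to nock's .reply(). A small hedged sketch of a reply factory (`dynamoReply` is a hypothetical helper; the single-record shape mirrors the recorded GetItem output, the multi-record Query-style shape is an assumption):

```typescript
// Hypothetical factory (not a nock or AWS API) for building the reply bodies
// nock returns in different scenarios: no record, one record, multiple records.
function dynamoReply(items: Array<{ id: string; name: string }>): Record<string, any> {
    if (items.length === 0) return {}; // 0 records: no Item key, getRecord maps this to 404
    const toAv = (i: { id: string; name: string }) =>
        ({ name: { S: i.name }, id: { S: i.id } });
    return items.length === 1
        ? { Item: toAv(items[0]) }                          // GetItem-style reply
        : { Items: items.map(toAv), Count: items.length };  // Query-style reply
}

console.log(dynamoReply([]));                            // {}
console.log(dynamoReply([{ id: '1', name: 'myName' }])); // { Item: { name: { S: 'myName' }, id: { S: '1' } } }
```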


Posted by on 26-10-2018 in Uncategorized



How to determine your AWS Lambda@Edge regions and find your CloudWatch logs

You must review AWS CloudWatch log files in the correct region to see the logs created when CloudFront executed your Lambda function. I found a very useful AWS CLI based bash command that determines the list of regions in which your Lambda@Edge function has received traffic, so I am storing it here for future (personal) reference.

for region in $(aws --output text ec2 describe-regions | cut -f 3); do
    for loggroup in $(aws --output text logs describe-log-groups --log-group-name-prefix "/aws/lambda/us-east-1.$FUNCTION_NAME" --region $region --query 'logGroups[].logGroupName'); do
        echo $region $loggroup
    done
done

You can just leave FUNCTION_NAME empty to get a list of all functions.



Posted by on 25-10-2018 in Uncategorized



How to fix Git error ‘invalid active developer path’ after MacOS update

After upgrading to macOS Mojave, I tried running Git from the Terminal or an IDE, but it kept giving the following error:

xcrun: error: invalid active developer path (/Library/Developer/CommandLineTools), missing xcrun at: /Library/Developer/CommandLineTools/usr/bin/xcrun.

The problem is that the macOS upgrade invalidated the existing Xcode Command Line Tools installation, so the tools have to be (re)installed. Open Terminal and run the following:

xcode-select --install

This will download and install the Xcode developer tools and fix the problem. As a follow-up step, you may need to reset the path to Xcode if you have several versions, or if you want the command line tools to run without Xcode.

xcode-select --switch /Applications/
xcode-select --switch /Library/Developer/CommandLineTools



Posted by on 27-09-2018 in Uncategorized



How to connect to the CEPH Object Gateway S3 API with Java

We use a CEPH storage solution and specifically want to use the Ceph Object Gateway with its S3 API through a Java client. The API is based on the AWS S3 standard but requires some special tweaking to work. It took me some effort to get a working connection, so I'm sharing it here:


We can use either the new AmazonS3ClientBuilder

package nl.rubix.s3;

import com.amazonaws.AmazonClientException;
import com.amazonaws.AmazonServiceException;
import com.amazonaws.ClientConfiguration;
import com.amazonaws.SDKGlobalConfiguration;
import com.amazonaws.auth.AWSStaticCredentialsProvider;
import com.amazonaws.auth.BasicAWSCredentials;
import com.amazonaws.client.builder.AwsClientBuilder;
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.model.Bucket;
import com.amazonaws.services.s3.model.ListObjectsRequest;
import com.amazonaws.services.s3.model.ObjectListing;
import com.amazonaws.services.s3.model.S3ObjectSummary;

public class AmazonS3ClientBuilder {
  public static void main(String[] args) {
    String accessKey = "XXXXX";
    String secretKey = "XXXXX";

    // Our firewall on DEV does some weird stuff so we disable SSL cert check
    if (SDKGlobalConfiguration.isCertCheckingDisabled()) {
      System.out.println("Cert checking is disabled");
    }

    // S3 Client configuration
    ClientConfiguration config = new ClientConfiguration();
    // Not the standard "AWS3SignerType", but explicitly the v2 signer
    config.setSignerOverride("S3SignerType");

    // S3 Credentials
    BasicAWSCredentials credentials = new BasicAWSCredentials(accessKey, secretKey);

    // S3 Endpoint
    AwsClientBuilder.EndpointConfiguration endpointConfiguration = new
      AwsClientBuilder.EndpointConfiguration("", "");

    // Fully qualified to avoid the name clash with this example class
    AmazonS3 s3 = com.amazonaws.services.s3.AmazonS3ClientBuilder.standard()
      .withClientConfiguration(config)
      .withCredentials(new AWSStaticCredentialsProvider(credentials))
      .withEndpointConfiguration(endpointConfiguration)
      .build();

    System.out.println("Connection to the Rubix S3");
    try {
      // List of buckets and objects in our account
      System.out.println("Listing buckets and objects");
      for (Bucket bucket : s3.listBuckets()) {
        System.out.println(" - " + bucket.getName() + " "
          + "(owner = " + bucket.getOwner() + ") "
          + "(creationDate = " + bucket.getCreationDate() + ")");
        ObjectListing objectListing = s3.listObjects(new ListObjectsRequest()
          .withBucketName(bucket.getName()));
        for (S3ObjectSummary objectSummary : objectListing.getObjectSummaries()) {
          System.out.println(" --- " + objectSummary.getKey() + " "
            + "(size = " + objectSummary.getSize() + ") "
            + "(eTag = " + objectSummary.getETag() + ")");
        }
      }
    } catch (AmazonServiceException ase) {
      System.out.println("Caught an AmazonServiceException, which means your request made it to S3, but was rejected with an error response for some reason.");
      System.out.println("Error Message:    " + ase.getMessage());
      System.out.println("HTTP Status Code: " + ase.getStatusCode());
      System.out.println("AWS Error Code: " + ase.getErrorCode());
      System.out.println("Error Type: " + ase.getErrorType());
      System.out.println("Request ID: " + ase.getRequestId());
    } catch (AmazonClientException ace) {
      System.out.println("Caught an AmazonClientException, which means the client encountered "
        + "a serious internal problem while trying to communicate with S3, "
        + "such as not being able to access the network.");
      System.out.println("Error Message: " + ace.getMessage());
    }
  }
}

or make it work with the older and deprecated AmazonS3Client:

package nl.rubix.s3;

import com.amazonaws.ClientConfiguration;
import com.amazonaws.SDKGlobalConfiguration;
import com.amazonaws.auth.AWSCredentials;
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3Client;
import com.amazonaws.services.s3.model.Bucket;

public class BasicAWSCredentials {
    public static void main(String[] args) {
        String accessKey = "XXXXXXX";
        String secretKey = "XXXXXXX";

        if (SDKGlobalConfiguration.isCertCheckingDisabled()) {
            System.out.println("Cert checking is disabled");
        }

        // Fully qualified to avoid the name clash with this example class
        AWSCredentials credentials = new com.amazonaws.auth.BasicAWSCredentials(accessKey, secretKey);

        ClientConfiguration clientConfig = new ClientConfiguration();

        AmazonS3 conn = new AmazonS3Client(credentials, clientConfig);
        conn.setEndpoint(""); // point the client at your Ceph RGW endpoint here

        for (Bucket bucket : conn.listBuckets()) {
            System.out.println(" - " + bucket.getName()
                + " "
                + "(owner = " + bucket.getOwner() + ")"
                + " "
                + "(creationDate = " + bucket.getCreationDate() + ")");
        }
    }
}

Hope it helps!


Posted by on 03-07-2018 in Uncategorized



How to dynamically generate XML Request in ReadyAPI / SOAPui ?

We have a functional test where we use a SOAP request to start the processing of a couple of files based on a URL in the request. For a negative test (all corrupt files) we got a batch of 500 files. To prevent a lot of copy/paste work in my SOAP request I wanted to generate the request dynamically. I had done this before, but couldn't find/remember my own example, so when I eventually got it working I decided to share and store it here.

First some housekeeping

I always do some housekeeping in the init Groovy step of my tests to generate a unique id (for correlation, etc.) and more.

//Generate unique id - sequence
def v_sequence = new Date().time.toString()
// testRunner.testCase.setPropertyValue("sequence", v_sequence)
testRunner.testCase.testSuite.project.setPropertyValue("v_sequence", v_sequence)
// empty some variables
testRunner.testCase.testSuite.project.setPropertyValue("XML", "")
testRunner.testCase.testSuite.project.setPropertyValue("teller", "1")

Then the basic dataloop

Using an external datasource is the way to go if we want to load data. In this example I only use one field (url) in a text file with 500 lines.

The DataSource Loop step makes it possible to loop back to the Groovy script “Generate XML Request” and build the request line by line.

The Groovy magic

Here is the Groovy script that holds the logic. For each url in the dataloop we create a Document XML complex element, which we append to the growing list of documents.

import groovy.xml.StreamingMarkupBuilder
import groovy.xml.XmlUtil
import groovy.util.XmlSlurper

def sequence = context.expand( '${#Project#v_sequence}' )
def url = context.expand( '${DataSource#url}' )
def datumtijd = new Date().format("yyyy-MM-dd'T'HH:mm:ss")
def teller = context.expand( '${#Project#teller}' )
log.info('sequence = ' + sequence)
log.info('url = ' + url)

def filename = url.split('/').last()
log.info('filename = ' + filename)

// Define all your namespaces here
// def nameSpacesMap = [soapenv: '', ns: 'nl.rubix.ohmy']
def builder = new StreamingMarkupBuilder()
builder.encoding = 'utf-8'
def xmlDocument = builder.bind {
//  namespaces << nameSpacesMap
    // use it like ns.element
    Document {
        Id('DOC.' + sequence + '.' + teller)
    }
}

// XML buildup to get rid of irritating XML version string which probably can be done much nicer
def v_document = XmlUtil.serialize(xmlDocument)
v_document = v_document.substring(39)
log.info("XML = " + v_document)

def origineelXML = context.expand( '${#Project#XML}' )
origineelXML = origineelXML + v_document
testRunner.testCase.testSuite.project.setPropertyValue("XML", origineelXML)

// increase counter
teller = teller.toInteger() + 1
testRunner.testCase.testSuite.project.setPropertyValue("teller", teller.toString())

The SOAP Request

Now we have a variable on project level which stores the complete list of documents, and which we can just use as normal in the request body via the property expansion ${#Project#XML}.


Posted by on 21-03-2018 in Uncategorized



Problem with Spring Boot Starter Web and FasterXML Jackson dependency

While working with Spring Boot and developing a combined REST/JSON & SOAP/XML (not sexy, I know) API, I was able to build & compile, but at runtime I got this error:

Error starting ApplicationContext. To display the auto-configuration report re-run your application with 'debug' enabled.
ERROR 2145 --- [ main] o.s.boot.SpringApplication : Application startup failed
org.springframework.context.ApplicationContextException: Unable to start embedded container; nested exception is org.springframework.boot.context.embedded.EmbeddedServletContainerException: Unable to start embedded Tomcat
at org.springframework.boot.context.embedded.EmbeddedWebApplicationContext.onRefresh( ...........
Caused by: java.lang.NoClassDefFoundError: com/fasterxml/jackson/databind/exc/InvalidDefinitionException

So Spring uses Jackson, and the Jackson library is composed of three components: Jackson Databind, Core, and Annotations. I did not add anything specific to my Maven pom.xml for Jackson, so the dependency got inherited somewhere. After some googling I figured out that spring-boot-starter-parent pulls in some older FasterXML/Jackson libraries, which seem to screw things up.

jvzoggel$ mvn dependency:tree -Dincludes=com.fasterxml.jackson.*

[INFO] --- maven-dependency-plugin:2.10:tree (default-cli) @ springboot ---
[INFO] nl.rubix.api:springboot:jar:0.0.1-SNAPSHOT
[INFO] \- org.springframework.boot:spring-boot-starter-web:jar:1.5.10.RELEASE:compile
[INFO]    \- com.fasterxml.jackson.core:jackson-databind:jar:2.8.10:compile
[INFO]       +- com.fasterxml.jackson.core:jackson-annotations:jar:2.8.0:compile
[INFO]       \- com.fasterxml.jackson.core:jackson-core:jar:2.8.10:compile

So by overriding the dependency in my pom.xml I could make sure a newer version of Jackson was used:

<!-- Jackson due to SpringBootStarterParent dependency problems -->
<dependency>
    <groupId>com.fasterxml.jackson.core</groupId>
    <artifactId>jackson-databind</artifactId>
    <!-- InvalidDefinitionException exists since Jackson 2.9, any 2.9.x works -->
    <version>2.9.4</version>
</dependency>

Problem solved.





Posted by on 04-02-2018 in Uncategorized

