
Tag Archives: Maven

Getting started with Apache Kafka

Apache Kafka is a publish-subscribe messaging solution rethought as a distributed commit log.


The original use case for Kafka was to be able to rebuild a user activity tracking pipeline as a set of real-time publish-subscribe feeds. This means site activity (page views, searches, or other actions users may take) is published to central topics with one topic per activity type. These feeds are available for subscription for a range of use cases including real-time processing, real-time monitoring, and loading into Hadoop or offline data warehousing systems for offline processing and reporting.

Typical use cases for Kafka include stream processing, event sourcing, metrics, and any other scenario where large volumes of data flow from a publisher to one or more subscribers. A single Kafka broker can handle hundreds of megabytes of reads and writes per second from thousands of clients, making it an efficient and easily scalable high-volume messaging solution.

Kafka is therefore a good alternative to more traditional (JMS/MQ) message brokers. Message brokers are used for a variety of reasons (to decouple processing from data producers, to buffer unprocessed messages, etc.). Compared to most messaging systems, Kafka offers better throughput, built-in partitioning, replication, and fault tolerance, which makes it a good solution for large-scale message processing applications. And all of this is free.

Getting Started

The Kafka website has an excellent quickstart tutorial. Download the latest version and work through the tutorial to send and receive your first messages from the console.

Playing around with Java

First we create a test topic.

bin/kafka-topics.sh --create --zookeeper localhost:2181 --replication-factor 1 --partitions 1 --topic testIteration
Created topic "testIteration".

Earlier versions of Kafka shipped with a default serializer, which caused a lot of confusion. Since 0.8.2 you need to pick a serializer yourself, either one that comes with the API (such as StringSerializer or ByteArraySerializer) or one you build yourself. Since both the key and the value in our example are strings, we use the StringSerializer.
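To see what the StringSerializer actually does, here is a rough plain-Java sketch (an illustration, not Kafka's actual source): it simply encodes the String to bytes using a charset, UTF-8 by default.

```java
import java.nio.charset.StandardCharsets;

public class StringSerializerSketch {
    // Roughly what Kafka's StringSerializer does: encode the String
    // to its UTF-8 byte representation (null stays null).
    public static byte[] serialize(String data) {
        return data == null ? null : data.getBytes(StandardCharsets.UTF_8);
    }

    public static void main(String[] args) {
        byte[] bytes = serialize("testIteration");
        System.out.println(bytes.length); // prints 13: one byte per ASCII character
    }
}
```

This is also why the console consumer later prints the keys and values back as readable text: the matching StringDeserializer just reverses this encoding.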

Use the following Apache Kafka library as a Maven dependency (pom.xml).

<dependencies>
  <dependency>
    <groupId>org.apache.kafka</groupId>
    <artifactId>kafka_2.11</artifactId>
    <version>0.10.0.1</version>
  </dependency>
</dependencies>

The following code produces/publishes 10 messages on the Kafka topic.


import java.util.Properties;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.Producer;
import org.apache.kafka.clients.producer.ProducerRecord;

public void produceIteration()
{
    int amountMessages = 10; // 10 is enough for the demo

    Properties props = new Properties();
    props.put("bootstrap.servers", "localhost:9092");
    props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
    props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");

    Producer<String, String> producer = new KafkaProducer<String, String>(props);

    for (int i = 1; i <= amountMessages; i++)
    {
        ProducerRecord<String, String> data = new ProducerRecord<String, String>("testIteration", Integer.toString(i), Integer.toString(i));
        System.out.println("Publish message " + i + " - " + data);
        producer.send(data);
    }

    producer.close();
}

The messages can be received from the topic:

jvzoggel$ bin/kafka-console-consumer.sh --zookeeper localhost:2181 --topic testIteration --property print.key=true --property print.timestamp=true

CreateTime:1474354960268        1       1
CreateTime:1474354960284        2       2
CreateTime:1474354960284        3       3
CreateTime:1474354960285        4       4
CreateTime:1474354960285        5       5
CreateTime:1474354960285        6       6
CreateTime:1474354960285        7       7
CreateTime:1474354960285        8       8
CreateTime:1474354960285        9       9
CreateTime:1474354960285        10      10
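The console consumer above can also be replaced by a Java consumer. The following is a sketch (not from the original post), assuming the same 0.10.x kafka-clients API and a broker on localhost:9092; the group id testGroup is an arbitrary example name.

```java
import java.util.Arrays;
import java.util.Properties;

import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

public void consumeIteration()
{
    Properties props = new Properties();
    props.put("bootstrap.servers", "localhost:9092");
    props.put("group.id", "testGroup"); // example consumer group name
    props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
    props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
    props.put("auto.offset.reset", "earliest"); // start at the beginning of the topic

    KafkaConsumer<String, String> consumer = new KafkaConsumer<String, String>(props);
    consumer.subscribe(Arrays.asList("testIteration"));

    // Poll a few times; each poll returns the records published since the last one
    for (int i = 0; i < 10; i++)
    {
        ConsumerRecords<String, String> records = consumer.poll(1000);
        for (ConsumerRecord<String, String> record : records)
        {
            System.out.println("Received message " + record.key() + " - " + record.value());
        }
    }

    consumer.close();
}
```

Note the deserializers mirror the serializers used on the producer side; with mismatched (de)serializers the consumer would fail or print garbage.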

 


Posted by on 20-09-2016 in Uncategorized

 


Solving a TIBCO buildear bug for use in Continuous Integration

Within Rubix we have a build server for our Oracle/Java developers. The server runs on Ubuntu in Amazon EC2 and is configured with Tomcat, Jenkins and Nexus. For standardisation, and because we are moving to the Amazon EC2 cloud, we wanted to move the TIBCO builds to the same server.

So after installing the necessary TIBCO BusinessWorks 5 software on the server and moving all the source over to Linux, we tried running our first TIBCO build using Jenkins and the exec-maven-plugin to execute buildear.sh. Sadly it resulted in this error message:


Starting up...
java.lang.InternalError: Can't connect to X11 window server using '10.0.0.51:0.0' as the value of the DISPLAY variable.
at sun.awt.X11GraphicsEnvironment.initDisplay(Native Method)
at sun.awt.X11GraphicsEnvironment.access$200(Unknown Source)

There seems to be a (known) bug in the TIBCO BusinessWorks buildear which "sometimes" requires a graphical interface when run from the shell. In our situation this only seemed to occur when projects had XSD imports in used libraries.

The solution is to configure Jenkins to use the Xvfb plugin, which emulates a "dumb" framebuffer in virtual memory, so buildear gets the display it asks for without any real graphics hardware.
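Outside Jenkins, the same workaround can be reproduced from a plain shell with xvfb-run (a sketch; the screen geometry is an arbitrary example):

```shell
# Run buildear.sh against an in-memory X server instead of a real display;
# --auto-servernum picks a free display number automatically
xvfb-run --auto-servernum --server-args='-screen 0 1024x768x24' ./buildear.sh
```

This is handy for verifying the fix on the build server itself before wiring the Xvfb plugin into the Jenkins job.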


Thanks to colleagues Roel, Maarten, Rubén and Anthony for resolving this issue.

 

References:

https://wiki.jenkins-ci.org/display/JENKINS/Xvfb+Plugin

 

 

Posted by on 05-07-2014 in Tibco

 


A good explanation with the necessary command examples from Mark Nelson on his blog RedStack on how to get Oracle artifacts into a Maven repository and refer to these artifacts as dependencies during your builds.

RedStack

If you want to build an application to run on the Fusion Middleware platform, you may find that you want to use some Oracle provided JAR files in your build process.  These may be things like API’s (interfaces) that are needed to compile, or client libraries that need to be bundled into the deployable application, or perhaps tools that are needed during build time, e.g. WLST or appc.

If you are using Maven to manage your build, it would be ideal to be able to use Maven coordinates to reference these Oracle artifacts, just like you do for any other kind of dependency.  But Oracle does not provide POM files for these artifacts – so what can we do?

Maven provides the ability to load any artifact into a Maven repository and to give it a set of coordinates.  This means that you can easily add the Oracle provided artifacts…

View original post 316 more words
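The mechanism RedStack describes is Maven's install:install-file goal. As a sketch (the file name and coordinates below are hypothetical examples, not values from the original post), loading an Oracle-provided JAR into the local repository looks like this:

```shell
# Give an Oracle-provided JAR a set of Maven coordinates in the local repository
# (file, groupId, artifactId and version are illustrative placeholders)
mvn install:install-file \
    -Dfile=wlfullclient.jar \
    -DgroupId=com.oracle.weblogic \
    -DartifactId=wlfullclient \
    -Dversion=10.3.6 \
    -Dpackaging=jar
```

Once installed, the JAR can be referenced in a pom.xml dependency using those same coordinates, just like any other artifact.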

 

Posted by on 07-08-2012 in Maven

 


Maven build using SOA/BPM resources from Oracle Service Registry

A very nice pom.xml example by Phani Khrisna on Mark Nelson's RedStack blog, showing how to use SOA/BPM resources from the Oracle Service Registry during a Maven build.

RedStack

Today, I feel very privileged to share with you a post from one of my readers, Phani Khrisna, who was kind enough to allow me to post this updated Maven POM which allows you to use resources in the Oracle Service Registry during the build.

Phani has also tidied up a small omission from earlier POMs, which a number of you have commented on.  This POM copies the SAR file into the target directory so that it will be published into the Maven repository as part of the build, rather than an almost empty file with the POM in it.  While we may not be able to use it as a dependency in another composite using just the Maven coordinates (like we can with Java artifacts for example), it at least makes it available to other developers.

Thank you Phani.  Enjoy!

View original post

 

Posted by on 22-05-2012 in BPM, Oracle, SOA Suite

 
