Using Xapix with telemetry data from Kafka Event Streams and pushing notifications using REST APIs

August 18, 2020
Glenn Lea

Today, cars and trucks generate massive amounts of data. In most cases, this data can be made available to remote applications through telemetry.

Telemetry is the automatic collection of measurements from remote devices. For example, car-sharing vehicles or delivery trucks push a constant stream of multiple types of data, such as location, speed, fuel level, forward or reverse motion and much more, depending on the information required. Suppose you want to react quickly and only receive data that matches certain parameters, such as the location or speed of the vehicle.

In this tutorial, we will build a project in Xapix that reads two data points - vehicle location and speed - from a Kafka event stream. Then, based on a condition - is the vehicle travelling faster than 60 miles per hour (mph)? - it sends a text notification via a Webhook to a browser.

Let’s get started.

What we are going to build

Before starting a trip, it’s a good idea to know what to expect along the way. What are the traffic conditions on the route we are planning to take? What is the average speed of the traffic? Are cars exceeding the posted speed limits on the roads? Not that we plan on exceeding the speed limit ourselves, but if drivers are frequently doing so, it suggests the traffic on our route is light.

In this simplified project, we are going to receive a text notification if any of the “virtual cars” exceed the posted limit in a 60 mph speed limit zone.

To achieve this we need data. Lots of data. In fact, we need a stream of data from the virtual vehicles so that we know each vehicle’s speed in the 60 mph zone at every point in time. For this we use a Kafka Event Stream.

We also need a data source to which we send a request for a text notification. For this we use a Push REST API. 

After getting our data sources, we build a pipeline to handle the incoming Kafka data stream and then automatically send out text notifications based on our set criteria. 

Finally, we publish our project to make it available for testing using a Webhook to receive Push notifications in a browser.

Creating a Xapix account

First we need to create a Xapix account. When we do this, Xapix automatically creates an Organization in which we will create a Project. Within this project we onboard our data sources and build our pipeline. 

So, let’s start by going to xapix.io and clicking the Sign Up button.

Xapix Sign Up

On the Sign up for Xapix Cloud page that appears, provide some login information then click Sign up

Now, check the inbox of the email address you provided for an email from Xapix. In that email, click the link Confirm my account.

You should now see an Organization overview page belonging to your new account.

Organization Overview page

Now we want to create a Project. We start with a blank project, but you might want to spend a few minutes taking a look at some demo projects by clicking Preview under the two demos. 

So, let’s go back to the Organization overview page. You can do that simply by going back in the browser. 

Now, click Blank Project to start a new project. Give the project a name and provide a description.  

Now that we have a project, we need to onboard data sources. These include a Kafka Event Stream, used to create a pipeline, and a REST data source for text notifications.

Adding a REST Data Source

Let’s first add a REST data source. We use this data source to push notifications of vehicles exceeding the speed limits to a specific URL. 

A couple of things to take note of here. We use POST to send a text string to the address. The address is a unique Webhook URL from Xapix. To let Xapix know what to send, we need to set up a Name/Data Sample value. The name is fixed, but the data sample is just a placeholder used within Xapix; the actual value will come from the results of processing the data in the pipeline. Remember to replace the default Name of the data source with the one provided.

To add this data source, let’s first click Data Sources from the Home menu of our project. Then from the Add Data Source dropdown, select REST Data Source

On the Data Sources page click Setup Data Source Manually to open the New Data Source page. 

We are going to set up our data source here. We use POST, a webhook and a text string (literal) in the body parameter. We also add HTTP status codes.

For HTTP Method, select post. 

For Address, you need a unique webhook URL from Xapix. To generate this URL, simply visit this page: https://mobility.xapix.io/webhooks/ 

  • After clicking this URL, a Xapix Webhook page opens. On this page, a unique URL has been generated for you.
  • Copy the URL located under “Here's your unique URL that was created just now”.
  • Back in your Xapix project, paste this URL into the Address field of the data source.

For Name, enter Text Notification via HTTP POST. 

For the body parameter, follow these steps:

  • Expand the body area under Required Parameters.
  • Click New Property
  • Enter the name of the property, which is text.
  • We can now choose between an Object, Array or Literal. As we want to send a text string to the Webhook, select Literal.
  • Click Create
  • We want to add some sample data for the pipeline to work with initially. Click Set Data Sample. This opens an editor for entering formulas, text strings and so on. As we want to add a text string, enter the following into the editor, including the quotes:
"Vehicle car-1234 is speeding at 123.45 mph."

Click Save to add this value to the text property.

The new REST data source should look like this (with your unique URL as Address).

Properties of the new data source

One last thing we need to do for this data source is add acceptable HTTP status codes. 

  • Expand Enable Authentication, Caching, Proxy or Insecure Access
  • In the Acceptable HTTP Status Code field, enter 200-500. 

That’s it. Now click Preview Data Source. A response shows up in the Response sidebar as JSON. This means the new REST data source has been configured properly. We can now go ahead and click Save Data Source to add the data source to the project.
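For orientation, it can help to see what this data source amounts to: an HTTP POST with a single text property in the body. The following is a minimal Python sketch of such a request, assuming the requests library; the webhook URL shown is only a hypothetical placeholder for the unique one generated for you, and the exact request Xapix sends may differ in its details.

import requests

# Hypothetical placeholder - use the unique webhook URL generated for you
WEBHOOK_URL = "https://mobility.xapix.io/webhooks/<your-unique-id>"

# Mirrors the body we defined: an object with a single "text" literal property
payload = {"text": "Vehicle car-1234 is speeding at 123.45 mph."}

response = requests.post(WEBHOOK_URL, json=payload)
print(response.status_code)  # the data source was configured to accept codes in the 200-500 range

When the pipeline runs, Xapix performs this call for us; the sketch is only meant to show the shape of the request and the payload.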

Adding a new Kafka Event Stream

We are now ready to add our Kafka Event Stream. This will also generate our basic pipeline.

First, from the Project Dashboard (click Home to go there), click Create your first Endpoint. We want to create a Kafka Event Stream so go ahead and click the appropriate box.

Create your first endpoint

In the New Kafka Event Stream page that appears, enter the information exactly as shown below:

Topic: xapix_demo-simulated_vehicles

Consumer Group: xapix_demo-simulated_vehicles

Initial Position: Latest.

For Kafka Server, click the Plus sign at the end of the row to add a server by entering the following values for Name and Boot Servers:

Name: XapixDemoKafkaCluster

Boot Servers:  b-3.gen3-demo-stable.a74654.c3.kafka.eu-west-1.amazonaws.com:9092,b-1.gen3-demo-stable.a74654.c3.kafka.eu-west-1.amazonaws.com:9092,b-2.gen3-demo-stable.a74654.c3.kafka.eu-west-1.amazonaws.com:9092

The new Kafka Event Stream form should look like this.

New Kafka event stream

Click Save and then, once the information has been saved, click Create Stream
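If you are curious how these settings map onto a plain Kafka consumer outside of Xapix, here is a minimal sketch assuming the kafka-python client. Xapix handles all of this for you; the sketch is only for orientation.

import json
from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "xapix_demo-simulated_vehicles",              # Topic
    group_id="xapix_demo-simulated_vehicles",     # Consumer Group
    auto_offset_reset="latest",                   # Initial Position: Latest
    bootstrap_servers=[
        "b-3.gen3-demo-stable.a74654.c3.kafka.eu-west-1.amazonaws.com:9092",
        "b-1.gen3-demo-stable.a74654.c3.kafka.eu-west-1.amazonaws.com:9092",
        "b-2.gen3-demo-stable.a74654.c3.kafka.eu-west-1.amazonaws.com:9092",
    ],
    value_deserializer=lambda raw: json.loads(raw),  # each event is a JSON document
)

for event in consumer:
    print(event.value)  # one telemetry record per vehicle update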

The new Kafka Event Stream will appear in the Pipeline Dashboard. In this dashboard we will orchestrate the data sources, test our project and start running it “live” with the data stream from the vehicles.

We have now set up our Kafka Event Stream and a REST data source. We now need to build a Kafka Event Stream pipeline.  

Build a Kafka Event Stream pipeline

Let’s take a look at the default pipeline. Notice a few things here. First, we have two elements called Event and Sink. These serve as the two ends of the pipeline. The Event represents the incoming request (here, an event from the Kafka stream), while the Sink closes off the pipeline.

Default Kafka event stream pipeline

Notice the Library. This is where we can find all the data sources we onboarded into the project. It also contains a number of special logical elements called Flow Units that we can add to the pipeline. These units allow us to make decisions, transform data, merge data and so on. In our pipeline we use the Decision unit to pick out any vehicles travelling faster than the speed limit.

So, let’s begin to build our Kafka Event Stream pipeline.

Adding data to a Kafka Event Stream

To build the pipeline, the first thing we need to do is add some sample data to the Event unit. Remember, this unit represents the Kafka Event Stream, so the sample data we are adding is an example of what one of these events can look like. On the pipeline, click the Event unit to open the Mapping Editor. This is where you will add JSON for the Kafka Event Stream.

Copy the following JSON and paste it into the Data Sample field in the Mapping Editor, replacing any existing data. Then click Update Data Sample.

{
  "accelerator_pedal_position": 0,
  "engine_speed": 780,
  "vehicle_speed": 61,
  "torque_at_transmission": 15,
  "fuel_consumed_since_restart": 9.147213,
  "odometer": 13974.579102,
  "fuel_level": 81.304611,
  "steering_wheel_angle": 2.300049,
  "latitude": 40.759171,
  "longitude": -73.988205,
  "brake_pedal_status": true,
  "transmission_gear_position": "first",
  "door_status": "rear_left",
  "button_state": "pressed",
  "headlamp_status": false,
  "windshield_wiper_status": null,
  "ignition_status": null,
  "timestamp": 1364325734.999,
  "id": "car-165r407tc",
  "brand": "Xapix Motors"
}


The Mapping Editor should now look like this.

Kafka Event Stream added

Click Done to save the changes and close the Mapping Editor

Adding a REST data source to the pipeline

Adding a data source to a pipeline is easy. It’s a simple drag and drop action. Remember, we now have our Event unit containing data for a Kafka Event Stream. We also have our REST data source included in the Library

Let’s drag the data source Text Notification via HTTP POST to the Pipeline Dashboard (alternatively, click on the data source once to add it to the pipeline). Don’t worry where it lands on the dashboard. We will connect it to the other units later.

REST data source added to pipeline

Now drag a line from the REST data source to the Sink

Note: To drag a line on the Pipeline Dashboard, put your cursor over the data source unit. Notice that it changes to a hand cursor. Click on the unit and then, keeping the mouse button pressed, drag towards the next unit, then release. Xapix automatically connects the two units.

Connecting units using drag and drop function

Adding a Decision Flow Unit to the Pipeline 

All the data sources have been added to the pipeline (a Kafka Event Stream and a REST data source) and the REST data source is connected to the Sink. Now we need to add some logic.

We need to set up a decision point where any vehicle exceeding the posted 60 mph speed limit triggers a text message. We do this through a Decision unit. So, let’s go ahead and add this to the pipeline. It’s quite simple to do. 

As we did with the REST data source, let’s go to the Library, scroll down until we see the Decision unit, then drag this to the Pipeline Dashboard (or, alternatively, click on the Decision unit once to add it). Again, don’t worry where it lands. 

Now, drag a line from the Event unit to the Decision unit. These two units are now connected. 

Let’s set up some logic. Click the Decision unit to open the Mapping Editor. Notice on the left is the data from the Event unit, which is the Kafka Event Stream. On the right is where we set up the logic. 

This is the logic we will build.

Decision logic

Click Create new branch, then enter the following:

  • Branch name: is-speeding
  • if field: event.body.vehicle_speed > 60
    Note: Make sure the field is empty first by deleting the default True value. 

Click Save, then click Done

The is-speeding branch appears on the Pipeline Dashboard.

Now that the logic for the Decision unit has been set up, let’s connect it to the REST Data source by dragging a line from the is-speeding branch to the REST data source.

Completed pipeline

Now, all we need to do is take a look at the Decision unit again. The logic should be:

If event.body.vehicle_speed > 60 then call the REST Data Source API Text Notification via HTTP POST
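Expressed outside of Xapix, this branch is just a boolean check. Here is a minimal Python sketch, assuming events shaped like the data sample we added to the Event unit:

def is_speeding(event_body, speed_limit=60):
    # The "is-speeding" branch: True only when the vehicle exceeds the limit
    return event_body["vehicle_speed"] > speed_limit

# With the data sample above (vehicle_speed = 61) this returns True,
# so the pipeline would go on to call Text Notification via HTTP POST.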


Let’s see the changes in the Mapping Editor. Click the Decision unit to show the logic.

Completed decision logic

Notice that now, after we have made a connection to the REST data source, a call is made to the data source Text Notification via HTTP POST if the vehicle speed is greater than 60 mph.

Click Done to close this dialog. 

Defining a formula for the text notification message

We now need to provide a message to be sent to the Webhook through the REST data source. We do that using a formula editor. 

Open the Text Notification via HTTP POST unit to reveal the Mapping Editor. We need to provide a formula for the text parameter. Hover over the green bar under the text parameter and click Edit Formula

In the Formula editor that appears, enter the following:

'Vehicle ' & event.body.id & ' is speeding at ' & event.body.vehicle_speed & ' mph '

The formula editor should look like this.

Formula editor for the REST data source

Click Save to save your changes, then click Done. The Mapping Editor should now look like this. 

REST data source message

Now, when a text message is pushed to the Webhook site, the message defined here will be sent. 
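Outside of Xapix, the same message could be assembled with ordinary string formatting. A minimal Python sketch, assuming the id and vehicle_speed fields from the event body in our data sample:

def build_message(event_body):
    # Mirrors the formula: 'Vehicle ' & event.body.id & ' is speeding at ' & event.body.vehicle_speed & ' mph '
    return f"Vehicle {event_body['id']} is speeding at {event_body['vehicle_speed']} mph"

# build_message({"id": "car-165r407tc", "vehicle_speed": 61})
# returns "Vehicle car-165r407tc is speeding at 61 mph"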


Publishing the completed project

Now that we have completed our Kafka Event Stream pipeline, we can go ahead and publish the project. This means that the project is now available externally.

Click Publish Project, which is below the pipeline on the Pipeline Dashboard.

Viewing the results on mobility.xapix.io

We need to take a look at two URLs. The first URL opens a map that visualizes the simulated (virtual) vehicles. The second opens the text notification receiver, which is a Webhook.

Let’s take a look at the vehicle visualization map. This is the source of our Kafka Event Stream.

Click the following URL to open the map:

https://mobility.xapix.io/trace-player/map/index.htm?topic=xapix_demo-simulated_vehicles

To start the stream of data to our project, we choose a trace file and a vehicle type, then click Start vehicle

In the example shown below, the trace type chosen is New York City uptown-west, in other words, the uptown west side of Manhattan - the area along the Hudson River north of the lower part of the island. Several vehicles can be seen in this area at the time of this screen capture.

Kafka event stream for project

Next, let’s take a look at the results of our pipeline logic in a browser. This is the address we entered when we set up the REST data source.

Click the following URL to open the Webhooks site for your project.

https://mobility.xapix.io/webhooks/


Webhook results

Any vehicles exceeding the 60 mph speed limit appear on this Webhook site. Currently, all vehicles in Manhattan are travelling slowly, which probably means heavy traffic on the roads.

And that’s it. 

Congratulations on building a Project in Xapix that transforms and controls event streams and sends text notifications based on logic defined in a Decision unit.

Summary

This post showed you how to create a Xapix cloud account with a default organization, set up a project and add two types of data sources - a Kafka Event Stream and a REST data source. It also showed you how to build a pipeline that obtains data from the event stream and how to send messages to a Webhook if the condition you defined is True. The results are then shown in a browser.

Copyright © 2020 Xapix, Inc. All rights reserved.