If you work in IT or on a developer team, you may find it hard to explain the benefits of data-heavy event streaming or industrial internet of things (IIoT) projects to external stakeholders. This is when a model or simulator comes in handy.
Creating a model of the IIoT solution you're proposing can help you demonstrate how it works and why it's valuable. Enabling stakeholders to see your solution in action helps everyone better understand the technical solution — this is key to getting their support. The model needs to be functional at a moment’s notice — you never know when people will stop by! It also needs to be easily adjusted and look professional.
In this post, we start building such a factory IIoT model, beginning with a product-reader sensor that records data from various scenarios, such as a normal constant product flow, start-and-stop modes and different types of incidents.
We want to send the warmest thank you to the etventure tech guild team for their great support in building the model’s hardware and sensors! This wouldn’t have been possible without your help.
We also share concepts and tips for developing simulator software that replays the data streams at any point in time, so programmers can use it to build their own prototypes. We already operate such a simulator for vehicles – you can learn more in our ride hail blog post.
Factories have different types of product flows going in or out of their machines. To keep the factory model abstract and flexible enough to cover as many use cases as possible, we use a paper strip with QR codes. Each QR code encodes JSON data in a specified format, but the product data content is fully flexible. This way, when reading the QR codes, we can simulate different types of products and incidents simply by encoding them directly into the QR codes.
For the product data sensor we use a Raspberry Pi 4 with a camera module and a Python script that scans the QR codes. Additionally, the Raspberry Pi runs software very similar to our open source IIoT Server. The changes we made allow it to record locally and also stream all sensor events and API command logs via VPN to a Kafka cluster we operate.
We arrange our camera and Raspberry Pi so we can drag the paper strip with QR codes along underneath the setup. The factory software records the events and at the same time streams them to our Kafka cluster in the cloud. We store the recorded events in a “trace file” to be replayed later by our factory simulator software.
We are using the following Ruby script to make HTML pages with SVG QR codes. Store the file locally on your computer as “./make_qr_strip.rb”.
Create a directory “./qr_data/” to store your different JSON files containing product data for your various scenarios. In our case it is named “factory_intro_1.json”. The format for each file is as follows:
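The exact schema is up to you since the QR content is fully flexible; a file such as “factory_intro_1.json” could, for example, hold an array of product objects like this (all field names here are illustrative):

```json
[
  { "product_id": "A-0001", "type": "standard", "status": "ok" },
  { "product_id": "A-0002", "type": "standard", "status": "ok" },
  { "product_id": "A-0003", "type": "standard", "status": "defect_detected" }
]
```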
Create a directory “./qr_strips/” where the script will store your strips. Finally, run `ruby ./make_qr_strip.rb <your_qr_data_filename>` and find your strip as an HTML file in the “./qr_strips/” folder. Open the strip file in a browser and print it out.
Turn off your Raspberry Pi 4 and plug your camera module into its camera slot. Start the Raspberry Pi, open the “Raspberry Pi Configuration” menu, and enable the camera in the “Interfaces” tab.
To install our QR code reader script you need Python 3 and pip3 installed. Next, install the following software libraries for video streaming, image processing and QR code reading via `pip3 install opencv-python imutils pyzbar`.
Store the following script locally, e.g. in a file “./product_reader.py” and start the QR code scanning product reader in a terminal with command `python3 ./product_reader.py`.
Put the QR code in front of your camera and your running script should output the data encoded in it. On our model we use this script to send the scanned product data to our IIoT factory server software, which then records the event and sends the data to our Kafka cluster.
As we do this we record a trace file that we can replay in the simulator later. It looks like this:
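The exact trace format is not reproduced here; conceptually it is a sequence of timestamped sensor events, one JSON object per line, along these lines (field names are illustrative):

```json
{"timestamp": 1612345678.21, "sensor": "product_reader_1", "data": {"product_id": "A-0001", "type": "standard", "status": "ok"}}
{"timestamp": 1612345679.25, "sensor": "product_reader_1", "data": {"product_id": "A-0002", "type": "standard", "status": "ok"}}
```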
Next, you could either create a Xapix account and follow the final steps of this article, or you could build a simulator script that reads the trace file and all its events. For example, you could group all events of a certain time interval, then iterate over these event groups, stream them to your own Kafka cluster and pause for the length of the interval in between.
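The grouping-and-pausing idea can be sketched as follows. The trace format, the one-second interval and the `send` callback (which would wrap, say, a Kafka producer) are assumptions, not the Xapix implementation:

```python
# Sketch of a trace replay loop: bucket events into fixed time windows,
# stream each bucket, then pause for the interval to keep the original pacing.
import json
import time
from itertools import groupby


def load_events(trace_path):
    """One JSON event per line, each with a numeric "timestamp" field (assumed)."""
    with open(trace_path) as f:
        return [json.loads(line) for line in f if line.strip()]


def group_by_interval(events, interval=1.0):
    """Bucket consecutive events into windows of `interval` seconds."""
    bucket = lambda e: int(e["timestamp"] // interval)
    return [list(group) for _, group in groupby(events, key=bucket)]


def replay(events, send, interval=1.0):
    """Send each bucket of events, then sleep for the interval length."""
    for group in group_by_interval(events, interval):
        for event in group:
            send(event)  # e.g. wrap a kafka producer's produce() call here
        time.sleep(interval)
```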
If you are interested in using the Xapix simulator and a graphical tool to create data pipelines for event streaming use cases, we recommend trying our free Xapix Community Edition. It gives you access to our Kafka demo cluster that the simulator streams to. This way you can get started by building and deploying a simple text notification pipeline that sends product data to a webhook. Xapix can integrate many more APIs and services, from both the cloud and your company’s intranet, and you can build out many more complex pipelines from here.
To get started, create a Xapix account if you haven’t already. Set up a webhook test site that receives and displays the product text notifications by clicking this link and waiting 5 seconds to get redirected. Keep the URL from the webhook test site around and create a REST Data Source in Xapix Community Edition by clicking “Data Sources” on the left sidebar, then “Add Data Source”, and then “Setup Data Source Manually”. Use the webhook URL as the address, set the HTTP method to “post” and make the body parameters match this screenshot. Finally, click “Save Data Source”.
When you click the “Preview Data Source” button, your webhook test site should receive a request and display its content. Once you have set this pipeline up successfully, the product data coming from the simulator event stream will be displayed the same way on the webhook test site.
Next, make up a random identifier of 6 digits and letters for your event stream and use it in place of <random_id> going forward. On the left sidebar below “Kafka Event Streams”, click “Add New” and enter the topic “demo_iot_factory-product_simulator_<random_id>”, the consumer group “iiotFactorySimulatorIntro” and the initial position “Latest”.
In the same menu create a new Kafka server named “XapixDemoKafkaCluster” with boot servers “b-3.gen3-demo-stable.a74654.c3.kafka.eu-west-1.amazonaws.com:9092,b-1.gen3-demo-stable.a74654.c3.kafka.eu-west-1.amazonaws.com:9092,b-2.gen3-demo-stable.a74654.c3.kafka.eu-west-1.amazonaws.com:9092”. Finally, click “Create Stream”.
Next, on the pipeline dashboard click on the “Event” unit and paste in the following data sample:
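The original sample is not reproduced here; any JSON matching the product events your simulator streams will work, for example (fields are illustrative):

```json
{"timestamp": 1612345678.21, "sensor": "product_reader_1", "data": {"product_id": "A-0001", "type": "standard", "status": "ok"}}
```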
Connect the event unit with the webhook text notification unit, and the webhook unit with the sink. Finally, click on the webhook unit and map all attributes from the Kafka event into the webhook parameters.
Use the Trace File Player software to replay the factory trace files. It is not open source software, but at its core it is a REST API web server receiving job descriptions for a queueing system. Jobs get picked up by worker instances of the queueing system. Each job picks up its trace file to replay and streams its recorded events into a Xapix demo Kafka cluster. This means many simulated machines can run at the same time. The Trace File Player has some additional features that we will explain in a later post.
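The worker-and-queue core described above can be sketched with the Python standard library alone. The job schema (a dict naming a trace file) and all names are illustrative; the real Trace File Player is not open source and this is only the concept:

```python
# Stdlib-only sketch of the Trace File Player's job-queue core:
# a shared queue of job descriptions, consumed by worker threads that
# replay each job's trace file through a send callback.
import json
import queue
import threading

jobs = queue.Queue()  # filled by the (not shown) REST API layer


def worker(send):
    """Pick up jobs and replay each job's trace file through `send`."""
    while True:
        job = jobs.get()
        if job is None:  # poison pill shuts this worker down
            jobs.task_done()
            break
        with open(job["trace_file"]) as f:
            for line in f:
                send(json.loads(line))  # in production: produce to Kafka
        jobs.task_done()


def run_workers(send, count=4):
    """Start `count` worker threads, so many simulated machines run at once."""
    threads = [threading.Thread(target=worker, args=(send,), daemon=True)
               for _ in range(count)]
    for t in threads:
        t.start()
    return threads
```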
Got any questions? We are working on numerous follow-ups to this blog series and will post updates regularly in our Discord community to help inspire you to build IIoT models and simulators. We would love to hear your ideas too and collaborate on this fun project.
Please contact us on our Discord community channel if you would like to discuss this tutorial or if you have questions or feedback about the Xapix Community Edition. Or really anything else as well. We look forward to hearing from you.
Oliver is a senior software developer and an API and data transformation enthusiast. He’s a co-founder of Xapix, writer and conference speaker on all things API. Most recently, he’s bringing his perspective and experience to the world of Industrial IoT.