
Building a timely and reliable data pipeline for a time-series IoT data platform

KUGU Home is a market leader in digital building management. At their growing scale, they strive to solve one of the biggest challenges in the IoT world: tracking and analyzing data in real time. To that end, KUGU wanted to improve and harden their data pipeline to ensure a reliable and engaging experience for their customers.


We helped KUGU refactor, reengineer, and monitor the existing data pipeline, reducing message loss by more than 99%.


Technologies: Telegraf, InfluxDB, Azure, Python, Ansible, MQTT, Grafana

What we did

The existing platform's most vulnerable point was the hand-off of data from the MQTT broker to the InfluxDB database. In response, we reengineered this stage of the pipeline, rebuilding it around proven, up-to-date tooling and engineering practices.

Challenge

Messages from numerous sources, arriving at varying rates, converge at the broker. During peak hours the pipeline receives up to 1K messages per second, a number that keeps growing as the customer base expands. The objective was to build a data pipeline that processes this substantial volume of data reliably, accurately, and on time, while leaving room for future capacity growth.

Solution

We introduced Telegraf, a powerful and adaptable tool widely used in the IoT industry, designed specifically to receive, transform, and forward sensor data to its destination.
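
For illustration, such a Telegraf route is expressed as a short configuration rather than custom code. The sketch below is a minimal example, not KUGU's production setup; the broker address, topic pattern, token, organization, and bucket names are all placeholders.

# Minimal sketch of a Telegraf route from an MQTT broker to InfluxDB.
# All endpoints, credentials, and names below are placeholders.
[[inputs.mqtt_consumer]]
  servers = ["tcp://broker.example.com:1883"]   # placeholder MQTT broker endpoint
  topics = ["sensors/#"]                        # placeholder topic pattern for device data
  qos = 1                                       # at-least-once delivery from the broker
  data_format = "json"                          # parse incoming payloads as JSON

[[outputs.influxdb_v2]]
  urls = ["https://influxdb.example.com:8086"]  # placeholder InfluxDB endpoint
  token = "$INFLUX_TOKEN"                       # write token read from the environment
  organization = "example-org"                  # placeholder organization
  bucket = "iot-sensor-data"                    # placeholder bucket

Because both plugins ship with Telegraf, the entire hand-off from broker to database is driven by configuration instead of bespoke code, which makes the pipeline easier to harden and monitor.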

 

Furthermore, to provide comprehensive insights into the pipeline's status at any given time, we developed a monitoring cockpit. This interface effectively displays the data reception rate and promptly alerts the team to any issues that may arise throughout the data pipeline.
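
One way to feed a cockpit like this, sketched here on the assumption that Grafana reads from the same InfluxDB instance, is to let Telegraf report its own throughput through its internal input plugin; the dashboards and alert rules themselves are not shown.

# Sketch: have Telegraf publish metrics about its own operation
# (metrics gathered, written, and dropped) alongside the sensor data,
# so Grafana can chart the reception rate and alert on anomalies.
[[inputs.internal]]
  collect_memstats = true   # also report Telegraf's own memory usage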

Result

Our work reduced message loss by more than 99%. Additionally, the monitoring cockpit quickly became a trusted reference point for the team, providing valuable feedback on the data pipeline's health.

 

In this project, we introduced Telegraf to handle data transformations and micro-batching. Its seamless integration with the other pipeline components, in particular the InfluxDB time-series database, improved the pipeline's overall performance.
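
Micro-batching of this kind is typically tuned through Telegraf's agent settings; the values below are illustrative rather than the ones used in production.

# Illustrative agent settings for micro-batched writes to InfluxDB.
[agent]
  flush_interval = "10s"        # how often buffered metrics are flushed to the output
  metric_batch_size = 1000      # maximum metrics written to InfluxDB per request
  metric_buffer_limit = 100000  # metrics kept in memory if the output is briefly unavailable

Batching writes this way keeps the number of requests against InfluxDB low even at 1K messages per second, while the buffer absorbs short outages instead of dropping messages.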

 

Consequently, both the data team and KUGU's customers gained confidence in the pipeline, leading to a significantly improved user experience.

 

As a direct outcome of our efforts, KUGU can now offer their customers a more dependable and precise view of their assets.

Scott Williams, CPTO


Data Max demonstrated remarkable expertise in reengineering and monitoring our data pipeline, resulting in a significant improvement in performance and reliability. Their proactive approach, deep understanding of our needs, and exceptional collaboration made them a highly recommended partner for any organization seeking help with data engineering in the cloud.


By working with Data Max we were able to blend our engineering efforts to create a seamless and highly impactful piece of work. We hope to work with them more in the future.

Discover how our data and AI experts can transform your business. Reach out to us today to explore your potential!
