A key part of building a digital twin - or in our case, a digital twin platform - is the ability to model a slice of reality. This means, among many other challenges, being able to capture, ingest and clean data in real time so our digital twin is a true twin of the real-world asset, system or venue. And because we want to connect to a lot of different data sources, from streaming IoT sensors to core business applications, we needed a flexible and robust data pipeline. After trialling a number of options we settled on Kafka, as its event-driven model is a natural fit for our needs. But we're still a tiny engineering team, and managing a Kafka instance was taking time away from our focus on building customer solutions. So at the very end of 2023 we migrated onto Confluent Cloud and got on with everything else.
When we started talking to Confluent, they told us about the Confluent Data Streaming Startup Challenge, which sounded interesting, apart from the closing date being only 3 days away. We scurried and hustled and got our entry organised. 3 weeks later we found we'd made the shortlist of 10, from over a hundred entrants. Amazing!
And today, the announcement: we've made it to the final 3. As an early-stage startup this is huge for us. Along with the feedback we're getting from our first clients, it's great validation that we're building something really exciting here. We've been privileged to work with some amazing clients who are using our platform to solve business-critical problems around safety, asset utilisation and investment.
If you'd like to find out more or join us on the journey then sign up here.