Data Engineer - IrisX Analytics Product
Mission: Make IrisX Analytics a world-class customer product that drives consumption.
We're looking for a Data Engineer who builds data products, not just data pipelines. Our Data Foundation team at Trackunit builds and operates IrisX Analytics, a customer-facing analytics platform powered by Databricks and part of IrisX, our platform-as-a-service for the construction industry. Millions of machines stream telemetry in real time, and our job is to turn that into something customers use every day to run their business.
The challenge: we have one of the richest IoT datasets in the industry and a platform that customers depend on - but there's so much more to build. We need to make data sharing effortless, deliver real-time insights, and build the data products that make IrisX Analytics worth coming back to. You'll work closely with our data scientist, customer-facing teams, pilot customers, and Databricks to make that happen.
You'll be attached to one of our hubs in Aarhus, Aalborg, Copenhagen, or Kolding. We work remote-first, but the data team is mainly in Aarhus and Aalborg - so expect to join us there from time to time.
If you care about how customers experience data - not just how it flows - this is your chance to build something meaningful in one of the world's most essential industries!
What's in it for you?
Trackunit connects millions of machines across construction, and our data platform is at the center of it. We have big ambitions - recently recognized by Databricks as Industry Leader of the Year - and we're just getting started.
We invest in your growth. We take impact conversations, personal learning paths, and regular retrospectives seriously - not as a ritual, but because we genuinely want people to grow and teams to keep getting better at working together.
We invest heavily in AI - both in our product and in how we work. Our platform enables customers to run AI and ML on their data, and internally we make all the AI tooling you need available to be more effective in your daily work.
Across the organization - from HR to engineering - we're well supported by modern tools, and we continuously invest in keeping it that way.
Flexibility is real here. We work largely remote and trust you to structure your own work. When we need to come together, we do - but it's driven by purpose, not a policy. Let us know what you need to do your best work: we've got you covered.
Turning the Tide. At Trackunit, we aim to create and contribute to an inclusive workplace. And this priority starts with our hiring - we do not discriminate on the basis of gender identity, sexual orientation, personal expression, ethnicity, religious belief, or disability status. We only assess candidates on their qualifications and merit. Learn more about TIDE and other DE&I initiatives here.
Who are you ideally?
Databricks is your home turf. It powers everything we deliver to customers, and you should feel at home across the stack:
Storage & compute: Delta Lake, Spark, Databricks SQL
Governance & security: Unity Catalog, fine-grained access control
Data architecture: Medallion architecture (bronze/silver/gold), data modeling for different consumption patterns
Data sharing: Delta Sharing or similar cross-platform mechanisms
Languages: Python and SQL
You've built pipelines end-to-end - from ingestion through transformation to the layer customers actually query. You understand that great data isn't just correct - it's well-modeled, documented, fast to query, and delivered with low latency.
You think about scale from the start. When something works for one customer, you design it to work for hundreds - with multi-tenancy, self-service onboarding, and monitoring built in.
You're excited about AI-powered analytics. Genie, AI/BI, Databricks Agents, MCP - these aren't just announcements to you, they're opportunities. We want someone who sees new platform capabilities and immediately thinks about how customers can benefit.
Don't meet every single requirement? No worries - this is our ideal profile. If you're drawn to the role but your past experience doesn't align perfectly with every qualification, we encourage you to apply anyway.
Primary responsibilities:
Build and maintain end-to-end data pipelines that power IrisX Analytics - from consuming real-time telemetry data ingested via Kafka, through PySpark transformations, to the Gold-layer models and views that customers query every day. You'll work with Delta Lake, Databricks SQL, and Unity Catalog to keep latency low and data trustworthy - a continuous challenge as data volumes grow.
Stay on top of Databricks features and evaluate how new capabilities can improve the product we deliver to customers. You'll ensure new features are adopted in a way that works across hundreds of customer workspaces - not just in a demo.
Define golden paths for feature teams inside Trackunit to bring their data into IrisX Analytics. You'll make it easy for other teams to contribute data while ensuring the end result is consistent, well-governed, and delivers a great experience for customers.
Make it easy for customers to integrate IrisX data into their own platforms - from metadata to time-series telemetry. The goal is scalable, self-service integration - not custom one-offs.
Monitor data quality, pipeline health, and consumption across the platform. You'll build the observability that ensures customers can trust their data, and the metering that helps turn features into pay-as-you-go products.
Work closely with our data scientists, customer-facing teams, pilot customers, and Databricks to understand what customers need and translate that into data solutions that scale.
Our hiring process
Don't waste your time writing the best possible cover letter for the job. We want you to create an impact that matters, and that doesn't happen in a cover letter.
Initial phone conversation with a colleague from our P&T team.
Interview with your potential new leader and relevant people from the team, so you'll get to know them and the team better.
Assignment-specific interview. We want you to gain insight into some of the concrete tasks and projects related to the role. You will be given a case to prepare before the interview, where you will present it to relevant colleagues from across Trackunit - people you'll also work closely with in the job.
If needed, we will obtain references from former employers during the process and, depending on the level of the role, run background checks - unless you have provided these yourself.
Offer presentation and walk-through. We're lucky to have you!
Coming Together To Connect Construction
We're committed to construction - one of the largest industries in the world. Over the past two decades, Trackunit has been pioneering technological progress within construction. Today, we are not only a leading IoT provider but a thought leader, supporting and shaping the agenda for an entire industry on a global scale.
We believe in taking a people approach in everything we do. Being human-centric is not restricted to our products - it's a way of life at Trackunit. We're proud to be a truly global team. Our colleagues get together in hubs spread across the globe, but we embrace the idea of working remotely and in environments that inspire you. Everything we do, we do to eliminate downtime and build the most useful industry for the world.
The question is: Are you in?
- Department: Engineering
- Locations: Aarhus, Aalborg, Kolding, Copenhagen (Denmark)
- Remote status: Hybrid
- Employment type: Full-time
- Application deadline: May 31, 2026
- Positions: 1
About Trackunit
Trackunit is the leading operating data platform connecting people, assets, and processes in construction - a global technology company that orchestrates data and infrastructure to build the most useful industry for the world.
We connect construction through one platform to create a living, evolving ecosystem that delivers data and insights to the off-highway sector. With more than 2 million assets connected and more than 2 billion data points collected each day, we use technology to eliminate downtime, improve safety, and help customers improve the bottom line in a sustainable, cost-effective way.