Ensure real-time time-series data processing in Databricks (Scala) environments is robust and scalable.
Empower other departments by making data accessible and usable, contributing to Eneco’s digital innovations.
Lead technical decisions on cloud-based data solutions and advise the product team.
The One Planet strategy we are committed to here at Eneco sets an ambitious goal to be climate neutral by 2035. We want to achieve that goal for both us and our clients. To make it happen, we are dedicated to offering our customers innovative digital capabilities and smart solutions. Our Digital Core team is working towards creating an exceptional online customer experience, through modernizing the Eneco chat, app and web environments. We are striving to deliver a superior digital customer experience that will stimulate and make it easier for our customers to become greener, every day.
As a Data Engineer, you will play a crucial role in setting up and leading technical decisions for our cloud-based data platform. We are specifically looking for someone who will contribute to a combination of cloud infrastructure setup, API server maintenance, and streaming/batch data processing pipeline development. You will be working on exciting IoT products (smart thermostat, energy insight, smart charging) for our consumers.
Must Have:
Previous experience with REST API development (e.g. Spring or FastAPI).
Understanding of streaming data ingestion and processing.
Previous experience working with MPP data platforms such as Spark. Working experience with Databricks and Unity Catalog is a plus.
Proficiency in programming languages (Java, Scala, and Python).
Knowledge of software engineering best practices: code reviews, version control, testing, and CI/CD.
Genuine interest in DevOps/SRE principles for production deployment.
Nice to Have:
Working experience with high volume time series data.
Knowledge of data modelling and architecture patterns.
Experience deploying applications to Kubernetes, with skills in monitoring (Grafana) and debugging.
Knowledge of cloud providers (e.g. AWS); infrastructure as code (IaC) is a plus.
Experience with NoSQL databases (e.g. DynamoDB) and RDBMS (e.g. Postgres).
Proficiency in SQL and dbt (Data Build Tool) with Snowflake.
Familiarity with or interest in MLOps and data science techniques.
Setting up projects and leading technical decisions involving real-time time-series data in Databricks (Scala) environments.
Empowering other departments by making data accessible and usable, driving Eneco’s digital innovations forward.
Design and implement cloud solutions to handle product requirements.
Shaping the product by providing technical advice to the product manager and other teams.
Ensuring our solutions are robust, scalable, and ready to meet future challenges.
You will be working together with other Data Engineers, Machine Learning Engineers, Data Scientists and Data Analysts. Together, you will shape IoT products that will transform how our consumers use their energy. Within the team, we encourage learning, actively seek out collaboration, celebrate successes, and learn from failures.
Please reach out to our recruiter.