Do you have a knack for statistics and data science, and are you looking for a challenging environment where you can put your skills into practice? Do you want to take our product to the next level by optimising our data pipeline and be part of a rapidly growing startup? Check out this opportunity 👇
Who we are
Rock.estate is a data science company specialised in the remote, real-time valuation of buildings on a country-wide scale. We combine various forms of geodata with the latest AI and machine learning techniques.
Multiple insurers and banks already use our solution to understand the value and the risks of their customers’ houses. This enables them to digitise their product offering with frictionless customer journeys without compromising on risk management.
What you can expect
You will be joining our data science team where your responsibilities will include:
- Continuous improvement of our geodata lake (ETL pipelines, workflow management)
- Expansion of our machine learning pipelines (feature engineering, valuation models, ...)
- Deployment and maintenance of our solutions in a production environment
- R&D activities related to new data sources and features (algorithmic 3D modelling, image recognition, ...)
Our engineering team is rapidly evolving and besides your key responsibilities, you will also get the opportunity to interact with and actively contribute to other technical areas such as DevOps and backend development.
What we are offering you
- An opportunity to join a dynamic team on an exciting journey of building a company together in a space where FinTech, InsurTech and PropTech meet
- An environment where you are empowered to quickly take responsibility and make things happen
- A place in an experienced team striving for technical excellence
- A company culture where we put our employees first
- A flexible work schedule with room for part-time remote working
- A competitive salary & benefits
Required skills & experience
You could be the one we are looking for if:
- You have a solid understanding of statistical & machine learning models
- You have experience in Python, Pandas and Sklearn
- You have worked with collaborative development workflows (Git, issue tracking, branching, pull requests, CI/CD pipelines)
- You have a knack for being creative with data
- You are a team player and a good communicator
If you have experience with some or all of the following technologies, we’ll put you at the top of our list:
- Geodata processing and analysis (PostGIS, GDAL, GeoPandas, QGIS, ...)
- Workflow management platforms (Airflow, Prefect)
- Distributed computing (Spark, Dask)
- Operating workloads on cloud infrastructure
To apply, please send us your resume and anything else that could help convince us that you are the right candidate for the job!