Data Engineer
Dwelly
Software Engineering, Data Science
Serbia
Posted on Feb 20, 2026
About Dwelly
Dwelly is a UK-based, AI-enabled lettings and property management platform growing through a roll-up strategy of acquiring estate agencies. The company operates on two fronts: i) acquiring existing letting agencies, effectively buying their highly sticky, recurring-revenue landlord portfolios, and ii) building top-notch technology to automate tenant management, payments, and post-rental property maintenance. Dwelly integrates AI services to automate business processes within brick-and-mortar estate agencies, migrating them onto a tech-enabled digital lettings platform within two months to radically improve the user experience and increase operational efficiency.
We’re a fast-growing, product-focused company, backed by top-tier investors and led by a team with deep experience in real estate, technology, and operations.
Position Summary
We’re looking for a Data Engineer to build our data engineering function. You will be the first hire on the team. The team is small, so this is a hands-on role. Most of your time will be spent building and maintaining production data pipelines.
Key Responsibilities:
- Data Architecture, Optimization & Compliance
  - Design and maintain a unified data architecture: database schemas, data models, and micro-architecture solutions to ensure scalability and reliability.
  - Optimize database performance at all levels: indexing, partitioning, clustering, and tuning configuration parameters.
  - Ensure full compliance with GDPR, the UK Data Protection Act, and other relevant regulations: data masking, consent management, retention policies, and privacy impact assessments.
- Data Quality & Performance
  - Optimize queries, schemas, and indexes where needed.
  - Set up basic data quality checks.
  - Support GDPR and UK data protection requirements, including data masking, access control, and retention policies.
- Production Data Pipelines
  - Take data notebooks and calculation logic and turn them into reliable, production-ready pipelines.
  - Ensure scalability, reliability, and reproducibility.
  - Write clean, readable, maintainable code.
Requirements:
- Have real experience supporting data pipelines in production
- Have worked with a data warehouse (BigQuery or similar)
- Have strong experience in GCP
- Understand orchestration, monitoring, and performance tuning
- Can make practical engineering decisions independently
- Strong communication skills and fluency in English.
- Startup mentality: resilience, adaptability, and ability to thrive in a fast-paced environment.
- Customer-centric mindset: focus on delivering value to end-users or clients.
- Strong problem-solving skills – ability to approach challenges logically and propose practical solutions.
Nice to Have:
- Experience with AWS or Azure
- Experience with message queues or distributed systems
- Basic CI/CD for data pipelines
What We Offer:
- The role is fully remote, providing flexibility and enabling seamless collaboration with our geographically distributed team.
- Competitive salary with the potential for equity options based on performance, recognising exceptional contributions to our integration success.
Feel free to check out the Dwelly Core Principles below; they describe what we believe in and how we operate and make decisions.
What we offer is not a fancy office or a static workplace. Instead, this is solving one of the world's most complex problems in its largest consumer industry (residential rentals), improving the experience for the >30% of households (>5M in the UK, and >100M including the EU and US) that live in rental homes.
This is about disrupting the largest, most antiquated industry in the world, with one of the strongest operational and technical teams in the UK and the EU. We work hard, and we shoot for extremely ambitious results. But we want people to be proud of what they’ve built and to be able to look back one day and say, “hell yeah, that was me that did it all”.
- Customer obsession rather than competitive focus
- Passion for invention
- Operational excellence
- Long-term thinking