Data Engineer
We’re strengthening our data platform core team with another data engineer. We have a mature, in-house data platform that we’re proud of: we’ve long since outgrown the “simple ETL” stage and are building the data platform as a product.
Our main differentiator is platform engineering rather than routine pipeline work. We don’t build one-off pipelines—we build and evolve the infrastructure and services that let teams across the company work with data using standardized tools.
We follow the Data Mesh principle: the platform team provides user-friendly tools and the base data layer, on which domain teams and analysts build their own data marts and reports.
Beyond storage, we have plenty of engineering work at the intersection of data and software development, which is best reflected in our architecture:
- Orchestration: 8,500+ tasks.
- BigQuery: 800+ TB of data, 10,000+ tables.
- In-house product for task management in Airflow.
- Real-time data export using CDC.
- High-load, horizontally scalable streaming service.
- Qdrant: a vector database powering RAG AI agents and semantic search.
- RAG AI workflow for information search and synthesis, built on a local LLM.
- DataHub for documentation and metadata.
- Service for sending events to marketing systems.
- Services for managing Tableau report updates.
- Monitoring and alerting services.
- And many more components and services that make up the DE platform.
Future challenges
- Developing the company’s data platform.
- Creating new tools for analysts.
- Supporting ETL processes.
- Setting up data integration with various services.
- Sending events to various analytics and marketing systems.
- Maintaining data quality and reliability.
- Supporting and creating analytical microservices.
- Implementing AI solutions in DE.
- Implementing and disseminating common approaches and practices in DE.
We expect
- At least 3 years of experience as a data engineer or in a similar position.
- Knowledge of SQL.
- Experience with Docker.
- Proficiency in Python for data processing.
- Experience supporting backend analytics services.
- Experience in developing ETL processes.
- Experience with relational and columnar databases (design, queries, optimization).
- Knowledge of data orchestration tools.
- Excellent problem-solving skills and the ability to work both independently and in a team.
- Strong communication skills and the ability to explain technical concepts in simple terms.
- Experience using AI tools to improve productivity.
Advantages
- Experience in mentoring and training.
- Working with cloud data processing services.
- Experience with Kubernetes, Ansible.
- Working with Appsflyer, Firebase, Amplitude.
- Working with RabbitMQ, Kafka.
We offer
- Fully legal employment with a competitive salary
- Flexible working hours
- Hybrid format — 2 days/week in the office (Tbilisi), rest remotely
- Minimal bureaucracy — freedom to suggest and implement ideas
- Highly skilled and collaborative team
- Work on a product targeting international markets
- Professional development support — workshops, courses, internal training
- Relocation package (travel and accommodation support)
- Health insurance (including dental) for you and your family
- Transport reimbursement
- Free lunches at the office
- English language courses
- Wellness benefits — partial compensation for sports, yoga, therapy