Must haves
- BSc in Computer Science from a top university, or equivalent;
- 5+ years in data engineering and data pipeline development in high-volume production environments;
- 2+ years of experience with Java;
- 2+ years of experience with monitoring systems (Prometheus, Grafana, Zabbix, Datadog);
- Ability to design, develop, and maintain end-to-end ETL workflows, including data ingestion and transformation logic across different data sources;
- Experience with data-engineering cloud technologies such as Apache Airflow, K8s, ClickHouse, Snowflake, Redis, and caching technologies;
- Experience with relational and non-relational databases; proficiency in SQL and query optimization;
- Experience designing infrastructure to maintain high-availability SLAs;
- Experience with monitoring and managing production environments;
- Upper-intermediate English level.
The benefits of joining us
Professional growth
Accelerate your professional journey with mentorship, TechTalks, and personalized growth roadmaps
Competitive compensation
We match your ever-growing skills, talent, and contributions with competitive USD-based compensation and budgets for education, fitness, and team activities
A selection of exciting projects
Join projects with modern solutions development and top-tier clients that include Fortune 500 enterprises and leading product brands
Flextime
Tailor your schedule for an optimal work-life balance, with the option of working from home or going to the office – whatever makes you the happiest and most productive
Your AgileEngine journey starts here
Test task
We will review your CV and send you a test task via email
Intro Call
Our recruitment team will reach out to you to discuss available opportunities
Technical Interview
You will have an interview with your future team lead