AgileEngine is seeking highly motivated and experienced Data Engineers to join our team. You’ll design, build, and maintain big data systems that extract, transform, and load large volumes of data. The role is part of our Analytics organization and requires the ability to design, iterate, learn, and improve. You’ll work in a highly collaborative organization with a high degree of autonomy.
Arity and its mission
An estimated $3 trillion in wasted spending. One workweek spent in traffic. 40,000 lives lost every single year. Does that seem like a system working as it should?

Technical Experience & Skills
- Experience building ETL pipelines in Java on AWS services (see the sketch after this list)
- Knowledge of AWS services, including AWS Glue and AWS Step Functions
- Experience with workflow orchestration tools such as Apache Airflow is a plus
- Prior experience working on batch processing and end-to-end data processing systems
- Understanding of data transformation specifications and data formats such as JSON, Avro, XML, and CSV, as well as UTF-8 and Base64 encodings and standard HTTP response and error codes
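As a rough illustration of the kind of AWS-based ETL work described above, the sketch below uses the AWS SDK for Java v2 to trigger a Glue job run and a Step Functions execution. The job name, state machine ARN, region, and input payload are hypothetical placeholders, not references to any actual Arity pipeline.

```java
import software.amazon.awssdk.regions.Region;
import software.amazon.awssdk.services.glue.GlueClient;
import software.amazon.awssdk.services.glue.model.StartJobRunRequest;
import software.amazon.awssdk.services.glue.model.StartJobRunResponse;
import software.amazon.awssdk.services.sfn.SfnClient;
import software.amazon.awssdk.services.sfn.model.StartExecutionRequest;
import software.amazon.awssdk.services.sfn.model.StartExecutionResponse;

public class PipelineTrigger {
    public static void main(String[] args) {
        Region region = Region.US_EAST_1; // region chosen arbitrarily for the example
        try (GlueClient glue = GlueClient.builder().region(region).build();
             SfnClient sfn = SfnClient.builder().region(region).build()) {

            // Start a Glue ETL job run ("daily-trips-etl" is a made-up job name).
            StartJobRunResponse jobRun = glue.startJobRun(StartJobRunRequest.builder()
                    .jobName("daily-trips-etl")
                    .build());
            System.out.println("Glue job run id: " + jobRun.jobRunId());

            // Start a Step Functions execution that would orchestrate downstream steps
            // (the state machine ARN and input are placeholders).
            StartExecutionResponse execution = sfn.startExecution(StartExecutionRequest.builder()
                    .stateMachineArn("arn:aws:states:us-east-1:123456789012:stateMachine:example-pipeline")
                    .input("{\"runDate\":\"2024-01-01\"}")
                    .build());
            System.out.println("Step Functions execution: " + execution.executionArn());
        }
    }
}
```

In practice, a scheduler or an upstream event (for example, an S3 object-created notification) would invoke code like this rather than a standalone main method.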
Qualifications
- A college degree or equivalent experience in Computer Science or a similar field, including a solid understanding of Computer Science fundamentals
- 3+ years of experience in developing data engineering solutions with Java technologies
- Strong understanding of computer science concepts and object-oriented design principles
- Experience building RESTful APIs (a minimal example follows this list)
- Strong command of Git and Gitflow-style branching workflows
- Experience with AWS native data storage and processing technologies
- Experience working with stream and batch data processing environments
- Scala knowledge is a great plus
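To make the RESTful API and HTTP expectations above a bit more concrete, here is a minimal, purely illustrative sketch that exposes a JSON health endpoint using only the JDK's built-in com.sun.net.httpserver package; the path, port, and response body are made-up examples, not part of an actual Arity service.

```java
import com.sun.net.httpserver.HttpServer;
import java.io.OutputStream;
import java.net.InetSocketAddress;
import java.nio.charset.StandardCharsets;

public class HealthCheckApi {
    public static void main(String[] args) throws Exception {
        // Bind a small HTTP server on port 8080 (port chosen arbitrarily for the example).
        HttpServer server = HttpServer.create(new InetSocketAddress(8080), 0);

        // GET /health returns a JSON body with a standard 200 response code.
        server.createContext("/health", exchange -> {
            byte[] body = "{\"status\":\"ok\"}".getBytes(StandardCharsets.UTF_8);
            exchange.getResponseHeaders().set("Content-Type", "application/json");
            exchange.sendResponseHeaders(200, body.length);
            try (OutputStream os = exchange.getResponseBody()) {
                os.write(body);
            }
        });

        server.start();
        System.out.println("Listening on http://localhost:8080/health");
    }
}
```

A production service would typically use a framework such as Spring Boot or JAX-RS instead, but the same concerns around HTTP status codes and JSON encoding apply.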
Our geography
- Washington, DC, USA (UTC-5)
- Miami, USA (UTC-5)
- Mexico (UTC-6)
- Colombia (UTC-5)
- Brazil (UTC-3)
- Argentina (UTC-3)
- Ukraine (UTC+2)
- Poland (UTC+1)
- Portugal (UTC+0)