What You Need for this Position:
· Experience developing applications to run in a large-scale environment.
· Experience designing and implementing distributed data processing pipelines using Spark, Hive, Sqoop, Python, Airflow (scheduling tool), and other tools and languages prevalent in the Hadoop ecosystem.
· Strong knowledge of and hands-on experience with Java.
· Experience designing REST APIs and developing RESTful web services.
· Experience with performance/scalability tuning, algorithms, and computational complexity.
· Experience developing object-oriented, multi-tier applications in a complex architectural landscape.
· Experience with (at least familiarity with) data warehousing, dimensional modeling, and ETL development.
· Able to maintain the build and deployment process through the use of build integration tools.
· Experience working in an Agile development environment.
· Ability to build and incorporate automated unit tests and integration tests.
· Proven ability to work with cross-functional teams to deliver appropriate resolutions.
Preferred if you have:
· Experience with AWS components and services, particularly EMR, S3, and Lambda.
· Experience with NoSQL technologies such as DynamoDB.
· Experience with messaging and complex event processing systems such as Kafka and Storm.
· Experience with automated testing and Continuous Integration / Continuous Delivery.