The Expertise You Have
- BS or MS in a technical discipline or equivalent
- 6-8+ years’ experience in enterprise Hadoop-based environments (specifically AWS Elastic MapReduce)
- Highly proficient in the languages, tools, and frameworks relevant to Hadoop implementations, e.g.:
  - Hadoop, HDFS, Sqoop, Spark, Presto, Oozie
  - Notebook platforms
  - Scala, Python, or Java (including the specifics of integration with Hadoop libraries and components)
  - Amazon Web Services / Elastic MapReduce architecture and operation
  - Oracle, SQL, MongoDB
  - RHEL, basic shell scripting, and operating AWS instances and environments
  - ETL (Informatica) and data warehouse design
  - Public/hybrid cloud platforms and practices (ideally AWS)
  - An understanding of client reporting tools (Tableau, Oracle OBIEE)
  - Java EE
  - Web services (REST, JSON, HTTP)
  - CRM/Salesforce
  - DevOps and deployment models
- Good communication skills, both verbal and written
- Proven track record of successfully implementing data projects and solutions
The Skills You Bring
- Your experience as a Hadoop developer
- Your strong time management skills, balancing competing work dependencies
- Your ability to work effectively in teams that include not only peers but also representatives from our application development and operations teams, who are in effect our customers
- Your teamwork skills and your ability to communicate effectively with a mix of technical and non-technical people
- Your initiative as a self-starter who can lead and take ownership
- Your ability to manage multiple complex problems effectively
- Your experience in agile environments and comfort with iterative delivery
- Your flexibility and ability to work across a mix of modern and legacy systems