Role and Responsibilities:
- Build a COE (Center of Excellence) team that provides excellent service to platform users through solution standardization, user enablement, solution tuning, and deployment.
- Communicate daily with key business teams and users to better capture and manage user feedback.
- Improve the delivery efficiency of business cases by providing solution consulting services.
Position Requirements:
- Bachelor's degree or above in a computer-related major, with more than 5 years of experience in the design and development of big data architectures.
- Strong skills in data engineering, data modelling, public cloud solutions, and collaboration.
- Minimum 3 years of experience deploying big data technologies (such as MapReduce, Kafka, and HBase) in complex, large-scale environments.
- Minimum 3 years of experience with at least 3 of the following: Pig, Sqoop, MapReduce, Kafka, Spark, Java.
- At least 3 years of experience writing production code in Java, Scala, or Python.
- At least 2 years of combined experience with the following technologies: Spark, Scala, Akka, Cassandra, Accumulo, HBase, Hadoop, HDFS, Avro, MongoDB, or Mesos.
- Strong communication skills and experience managing customer satisfaction.