Kerry is currently hiring a Big Data Developer / Data Modeller proficient in Cloudera Distributed Hadoop (CDH), Spark, and Informatica. The role requires strong experience across the testing life cycle, as well as excellent verbal and written communication skills to add value in client-facing engagements.
If you have exposure to the core banking / financial services domain and are adept at learning and applying new technologies, we would like to hear from you.
Responsibilities
. Big Data implementation using Cloudera Distributed Hadoop (CDH), Hive, Pig, Sqoop, Flume, Oozie, MapReduce, YARN, Spark, Storm, Kafka
. Development within the retail banking / financial services domain; Hadoop administration
. NoSQL databases (e.g., HBase, Cassandra, MongoDB, Couchbase), SQL and relational database programming
. Data integration using Informatica, Data Stage, Advanced SQL, Data modelling
. Working in an agile development methodology
Skills and experience required:
Minimum a degree in engineering, computer science, or a related technical field, with 5 years of relevant working experience and 2 years' experience on Big Data implementation projects
Experience in advanced SQL, data modelling, and Agile development methodology is advantageous. Certification in big data will be an advantage.
Date Posted: 20/11/2024
Job ID: 100932967