Hadoop Developer Roles and Responsibilities
- Defining job flows
- Managing and reviewing Hadoop log files
- Managing Hadoop jobs using a scheduler
- Supporting MapReduce programs running on the Hadoop cluster
- Writing jobs using Pig, Hive, Impala, Sqoop, and Spark
Skills Required
- Ability to write MapReduce jobs
- Experience writing Pig Latin scripts
- Hands-on experience with HiveQL
- Familiarity with data-loading tools such as Flume and Sqoop
- Knowledge of workflow schedulers such as Oozie
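To give a sense of the core skill listed first, here is a minimal sketch of the map/shuffle/reduce pattern that underlies a MapReduce word-count job. This is plain Python, not actual Hadoop code (a real job would use the Hadoop Java API or Hadoop Streaming); the function names and sample input are illustrative only.

```python
from itertools import groupby
from operator import itemgetter

def map_phase(lines):
    """Map step: emit a (word, 1) pair for every word in the input."""
    for line in lines:
        for word in line.split():
            yield (word.lower(), 1)

def reduce_phase(pairs):
    """Shuffle/sort step (group pairs by key), then reduce step (sum counts)."""
    counts = {}
    for key, group in groupby(sorted(pairs, key=itemgetter(0)), key=itemgetter(0)):
        counts[key] = sum(count for _, count in group)
    return counts

lines = ["the quick brown fox", "the lazy dog", "the quick dog"]
result = reduce_phase(map_phase(lines))
print(result["the"])    # 3
print(result["quick"])  # 2
```

In a real Hadoop cluster the map and reduce functions run on separate nodes and the framework performs the sort-and-shuffle between them; the local simulation above only illustrates the data flow.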
About Capgemini
Capgemini, one of the world's foremost providers of consulting, technology, and outsourcing services, enables its clients to transform and perform through technologies. Capgemini provides its clients with insights and capabilities that boost their freedom to achieve superior results through a unique way of working, the Collaborative Business Experience, and through a global delivery model called Rightshore®, which aims to offer the right resources in the right location at competitive cost. Present in 36 countries, Capgemini reported 2007 global revenues of EUR 8.7 billion and employs over 83,000 people worldwide.