Advanced Analytics - Hadoop/DevOps Engineer

Posted 11 Jul 2019

Bangalore SBS, Karnataka - India

Req Id 194749

Details

A career at our company is an ongoing journey of discovery: our roughly 52,000 people are shaping how the world lives, works and plays through next-generation advancements in healthcare, life science and performance materials. For more than 350 years and across the world we have passionately pursued our curiosity to find novel and vibrant ways of enhancing the lives of others.


Job Details:

 

The candidate is responsible for designing, developing, testing and supporting data ingestion and modelling pipelines that leverage enterprise-level ETL tools such as SAP Data Services or Informatica for the Org IT Enterprise Information Management (EIM) group. Experience developing data catalog pipelines that encompass data ingestion, virtualization, metadata management and governance is an added advantage.

In this role, you will be part of a growing, global team of DevOps engineers, system admins and infrastructure technicians who collaborate to design, build, test and implement solutions across Life Sciences, Finance, Manufacturing and Healthcare. 

 

Skills required:

 

Experience: 4-10 years

 

Hadoop General

Deep knowledge of distributed file system concepts, map-reduce principles and distributed computing. Knowledge of Spark and the differences between Spark and map-reduce. Familiarity with encryption and security in a Hadoop cluster.
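The map-reduce principle mentioned above can be sketched in a few lines of plain Python. This is a local, single-process illustration of the programming model (map, shuffle, reduce), not Hadoop code; the function names are made up for the example:

```python
from collections import defaultdict

def map_phase(line):
    # Map: emit a (word, 1) pair for every word in the input line.
    return [(word, 1) for word in line.split()]

def shuffle(pairs):
    # Shuffle: group values by key, as the framework does between phases.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    # Reduce: sum the counts collected for each word.
    return {word: sum(counts) for word, counts in groups.items()}

lines = ["big data big cluster", "big data"]
pairs = [pair for line in lines for pair in map_phase(line)]
counts = reduce_phase(shuffle(pairs))
print(counts)  # {'big': 3, 'data': 2, 'cluster': 1}
```

In a real cluster the map and reduce phases run in parallel across nodes and the shuffle moves data over the network, but the data flow is the same.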

 

HDFS

HDFS and Hadoop File System Commands

 

Hive

Creating and managing tables; experience building partitioned tables; HQL; controlling YARN queues in Hive operations

 

Sqoop

Full knowledge of Sqoop, including creating and running Sqoop jobs in incremental and full-load modes

 

Oozie

Experience in creating Oozie workflows to control Java, Hive, Spark and shell actions

 

Spark

Experience in launching Spark jobs in client mode and cluster mode. Familiarity with the property settings of Spark jobs and their implications for performance.

 

SCC/Git

Must have leveraged source code control and automated build & deploy tools on large projects, integrating continuous unit testing

 

Docker (good to have)

Experience with or an understanding of Docker is desirable, as is a good understanding of microservices architecture

 

Linux

Must be experienced with the enterprise Linux command line, preferably SUSE Linux

 

Shell Scripting

Ability to write parameterized shell scripts using functions, and familiarity with Unix tools such as sed and awk

 

Programming

Must be proficient in Python, or in at least one other high-level language such as Java, C or Scala. Knowledge of Python virtual environments and Python package creation is a plus.
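The virtual-environment knowledge mentioned above can be illustrated with the standard-library `venv` module, which does programmatically what `python -m venv` does on the command line. A minimal sketch (the directory name is arbitrary):

```python
import tempfile
import venv
from pathlib import Path

# Create an isolated virtual environment in a temporary directory
# using the standard-library venv module.
target = Path(tempfile.mkdtemp()) / "demo-env"
venv.create(target, with_pip=False)  # with_pip=False keeps the example fast

# A pyvenv.cfg file at the environment root marks the directory
# as a virtual environment for the interpreter.
cfg = target / "pyvenv.cfg"
print(cfg.exists())  # True
```

Activating the environment (via `bin/activate` on Linux) then isolates installed packages from the system interpreter.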

 

SQL

Must be proficient in manipulating database tables using SQL. Familiarity with views, functions, stored procedures and exception handling is expected.
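The table and view manipulation described above can be sketched against an in-memory SQLite database (a stand-in for an enterprise RDBMS; SQLite has no stored procedures, so this covers tables and views only, and the table and column names are invented for the example):

```python
import sqlite3

# In-memory SQLite database as a lightweight stand-in for an RDBMS.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO orders (region, amount) VALUES (?, ?)",
    [("EMEA", 120.0), ("EMEA", 80.0), ("APAC", 50.0)],
)

# A view that aggregates the base table by region.
conn.execute(
    "CREATE VIEW region_totals AS "
    "SELECT region, SUM(amount) AS total FROM orders GROUP BY region"
)
rows = conn.execute("SELECT region, total FROM region_totals ORDER BY region").fetchall()
print(rows)  # [('APAC', 50.0), ('EMEA', 200.0)]
```

In a full-featured database the same pattern extends to stored procedures and exception handling around these statements.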

 

IT Process Compliance

SDLC experience and formalized change controls

 

Specific information related to the position:

• Physical presence in the primary work location (Bangalore)

• Flexibility to work CEST and US EST time zones (according to the team rotation plan)


What we offer:  With us, there are always opportunities to break new ground. We empower you to fulfil your ambitions, and our diverse businesses offer various career moves to seek new horizons. We trust you with responsibility early on and support you to draw your own career map that is responsive to your aspirations and priorities in life. Join us and bring your curiosity to life!

Curious? Apply and find more information at https://jobs.vibrantm.com

Apply Now
