
Shiv Shakti
[email protected]
Hadoop Developer at Accenture Pvt. Ltd.
Bangalore | Cell: +91-9036284261

Job Objective:

Seeking a role on Hadoop ecosystem and automation projects where my skills and work experience can be utilized to the fullest, working with committed and dedicated people and growing with the organization.

Summary:

- 4 years and 8 months of overall experience in Hadoop development and automation projects.
- 2 years and 3 months of relevant experience as a Big Data professional, with expertise in handling structured and unstructured data using Python, HiveQL, Sqoop, Oozie, Pig Latin, Impala, HDFS and other Hadoop ecosystem components.
- Sound experience with HDFS, MapReduce, Hadoop ecosystem components such as Hive, Pig, Sqoop, Oozie, Flume and Impala, and MySQL databases.
- Knowledge of automation projects using shell script, Python, MySQL and Core Java.
- Ability to play a key role in the team, in technical implementation as well as cross-team communication.
- Provided training to IT employees on Big Data, Hadoop and the Hadoop ecosystem through an external vendor (21st Century Software Solutions).

Technical Proficiency:

Framework: Hadoop and its components (Hive, Pig, Oozie, Sqoop, Flume, HBase, Impala)
Languages: Shell script, Python, SQL, HQL, Pig Latin, Core Java, basics of HTML
Databases: MySQL
Platforms: Unix/Linux, Windows

Professional Experience:

Hadoop Developer | Accenture | Mar '14 – Present (2 years and 3 months)

Project #1
Project Name: R&F IT Platform Service Big Data
Client: Credit Suisse
Domain: Investment Banking
Duration: March 2014 – till date
Environment & Tools: Hadoop, Hive, Pig, Oozie, Sqoop, Python, Impala, shell script, HDFS
Team Size: Four

Roles & Responsibilities:

- Gathering logs from several systems and storing (ingesting) them in HDFS.
- Importing data from external databases into HDFS using Sqoop (a sketch of this pipeline follows the list).
- Using Python and shell scripts to pre-process data.
- Writing MapReduce functions to derive different outcomes from the logs.
- Creating Hive partitioned tables to store the processed results in a tabular (structured) format.
- Writing HQL queries to analyze and process data stored in Hive tables.
- Writing Impala queries to get insight from data stored in Hive tables.
- Writing various Apache Pig scripts to process data stored in HDFS.
- Developing Sqoop scripts and jobs to import/export data between Hadoop (HDFS) and MySQL databases.
- Creating Oozie workflows to automate several Hadoop processes (Hive, Sqoop, MapReduce, Pig, sending emails).
- Building mechanisms to handle errors in Hadoop job workflows using Oozie.
- Using MapReduce (Python) to develop Hadoop applications and jobs.
- Setting up pseudo/multi-node Hadoop clusters; installing and configuring the Hadoop ecosystem: HDFS, Pig, Hive, Sqoop, HBase and Impala.
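A minimal sketch of the ingest, MapReduce and Hive steps above, assuming placeholder hosts, databases, tables and paths (none of these names come from the actual project):

    #!/bin/bash
    # Hypothetical pipeline: Sqoop import -> Python streaming job -> partitioned Hive table.

    # 1. Import one day's rows from a MySQL source into HDFS (placeholder connection).
    sqoop import \
      --connect jdbc:mysql://dbhost:3306/appdb \
      --username etl_user -P \
      --table app_logs \
      --target-dir /data/raw/app_logs/2016-01-01 \
      --fields-terminated-by '\t' \
      -m 4

    # 2. Run a Python MapReduce step over the raw data via Hadoop Streaming
    #    (mapper.py and reducer.py are placeholder scripts).
    hadoop jar "$HADOOP_HOME"/share/hadoop/tools/lib/hadoop-streaming-*.jar \
      -input /data/raw/app_logs/2016-01-01 \
      -output /data/processed/app_logs/2016-01-01 \
      -mapper mapper.py \
      -reducer reducer.py \
      -file mapper.py -file reducer.py

    # 3. Load the processed files into a date-partitioned Hive table and query it.
    hive -e "
      CREATE TABLE IF NOT EXISTS app_logs (host STRING, level STRING, message STRING)
      PARTITIONED BY (log_date STRING)
      ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t';

      LOAD DATA INPATH '/data/processed/app_logs/2016-01-01'
      INTO TABLE app_logs PARTITION (log_date = '2016-01-01');

      SELECT level, COUNT(*) FROM app_logs
      WHERE log_date = '2016-01-01' GROUP BY level;"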

Project Description: The Platform Services Big Data team is responsible for analyzing big data requirements from various applications (clusternet/Marsnet/MET/Basel2/Basel3), processing all the data (logs/RDBMS tables) and drawing conclusions that help management fix long-term issues related to the system/environment. The team also creates Oozie workflows that chain several Hadoop components together to run as one automated process.
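The automation described here hinges on an Oozie workflow.xml that chains the individual actions; a hedged sketch of how such a workflow is typically submitted and checked from the shell (the Oozie URL, HDFS paths and properties are assumptions, not project values):

    #!/bin/bash
    # Hypothetical Oozie submission; workflow.xml (chaining Sqoop, Hive,
    # Pig and email actions) is assumed to be deployed at the HDFS path below.
    cat > job.properties <<'EOF'
    nameNode=hdfs://namenode:8020
    jobTracker=resourcemanager:8032
    oozie.wf.application.path=${nameNode}/apps/platform-bigdata
    EOF

    # Submit and start the workflow, then poll its status;
    # a failed action here is what triggers the error-handling path.
    JOB_ID=$(oozie job -oozie http://oozie-host:11000/oozie \
                       -config job.properties -run | cut -d' ' -f2)
    oozie job -oozie http://oozie-host:11000/oozie -info "$JOB_ID"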

Project #2
Project Name: Reg-IT Big Data
Client: Credit Suisse
Domain: Investment Banking
Duration: February 2015 – till present
Environment & Tools: Sqoop, Hive, Pig, Impala, HDFS, MapReduce, Oozie
Team Size: Three

Roles & Responsibilities:

- Ingesting data from an Oracle database into HDFS using Sqoop incremental import (see the sketch after this list).
- Creating temporary Hive tables to store data for pre-processing.
- Creating external Hive tables to store pre-processed data.
- Writing various Hive queries to get insights from the data.
- Writing various Impala queries to process and analyze the data.
- Exporting processed data into the Oracle database using Sqoop export.
- Automating the whole big data process using Oozie.
- Handling Oozie error mails and resolving issues related to the whole Hadoop process.
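A sketch of the incremental import and the export step, assuming illustrative Oracle connection details, table names and HDFS paths:

    #!/bin/bash
    # Hypothetical incremental import: only rows with TRADE_ID above the
    # last recorded value are pulled from Oracle into HDFS.
    sqoop import \
      --connect jdbc:oracle:thin:@//orahost:1521/REGDB \
      --username reg_user -P \
      --table TRADES \
      --target-dir /data/regit/trades \
      --incremental append \
      --check-column TRADE_ID \
      --last-value 1000000

    # After the Hive/Impala processing, push the results back into Oracle.
    sqoop export \
      --connect jdbc:oracle:thin:@//orahost:1521/REGDB \
      --username reg_user -P \
      --table TRADE_SUMMARY \
      --export-dir /data/regit/trade_summary \
      --input-fields-terminated-by '\t'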

Project Description: The client wanted to migrate their data from an Oracle database to Hadoop to get the most value from it. They wanted to analyze the data with the Hadoop ecosystem (Pig, Hive, Impala), take advantage of Hadoop features such as parallel processing and distributed storage (HDFS), and extract insights from their data using HQL or Pig Latin.

Project #3
Project Name: Social Media Dashboard
Client: Credit Suisse
Domain: Investment Banking
Duration: February 2015 – December 2015
Environment & Tools: Python, HQL, shell scripting, Hadoop framework, Flume, HBase
Team Size: Four

Roles & Responsibilities:

- Reading text files extracted from social media channels into CSV format using Python in the Hadoop environment.
- Writing Python code to create page-level and post-level Excel files for every channel.
- Developing a Python program to check for data inconsistencies in the text files.
- Creating separate Hive tables for page-level and post-level data for every channel.
- Writing Hive queries to append the Excel data to the respective channel's table.
- Creating a job to automate the Python program to run daily using a shell script (see the sketch after this list).
- Automating Hive queries to append the data on a daily basis.
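A sketch of the daily automation wrapper, assuming a hypothetical extraction script, channel table and paths (the real names are not given in the resume):

    #!/bin/bash
    # Hypothetical daily job: convert the day's social-media text dumps to CSV,
    # then append them into the channel's Hive table.
    set -e
    DAY=$(date +%Y-%m-%d)

    # extract_channel_data.py is a placeholder for the Python conversion step.
    python extract_channel_data.py --date "$DAY" --out "/data/social/facebook/$DAY"

    hive -e "LOAD DATA INPATH '/data/social/facebook/$DAY'
             INTO TABLE facebook_post_level;"

    # Scheduled once a day via cron, for example:
    # 0 2 * * * /opt/jobs/daily_social_load.sh >> /var/log/social_load.log 2>&1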

Project Description: The client wanted to gauge the performance of various campaigns on their social media pages across channels such as Facebook, LinkedIn, Twitter, YouTube and Google+. The Digital Marketing team wanted to build a QlikView dashboard to track impression, engagement and conversion level metrics for page-level as well as post-level activities across these channels.

Automation Engineer | Accenture | Feb '13 – Oct '14 (1 year and 8 months)

Project #4
Project Name: Automated Environment Management
Client: Credit Suisse
Domain: Investment Banking
Duration: February 2013 – till present
Environment & Tools: Unix/Linux, shell scripting, Python, MySQL, JIRA
Team Size: Six

Roles & Responsibilities:

- Writing automation scripts using shell script/Python/SQL (see the sketch after this list).
- Automating deployment and manual processes on Linux/Unix boxes.
- Testing scripts on sample data and then applying them to the production environment.
- Creating JIRA tickets to track the development progress of the automation.
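A hedged sketch of the test-then-apply pattern described above; the hosts, paths and release artifact are illustrative, not taken from the project:

    #!/bin/bash
    # Hypothetical deployment automation: exercised against sample data on a
    # test box first, then applied to the production hosts.
    set -euo pipefail
    MODE=${1:-test}   # "test" (default) or "prod"

    if [ "$MODE" = "test" ]; then
        HOSTS="testbox01"
        DATA_DIR=/data/sample
    else
        HOSTS="prodbox01 prodbox02"
        DATA_DIR=/data/prod
    fi

    for host in $HOSTS; do
        # Ship the release and restart the service on each box.
        scp release.tar.gz "$host:/opt/app/"
        ssh "$host" "cd /opt/app && tar xzf release.tar.gz && ./restart.sh $DATA_DIR"
    done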

Project Description: The Environment Management team was responsible for automating all the manual processes in production environment management. The team was also responsible for the development and deployment of an Environment Management dashboard to display dynamic information about the applications hosted in the environment.

Software Engineer | Irely Soft Services | Sept '11 – Dec '12 (1 year and 3 months)

Project #5
Project Name: Sales Order Management
Client: SDSFT (Securities Dealing Systems)
Duration: September 2011 – December 2012
Environment & Tools: Linux, Bash shell scripting, SQL, Python
Role: Software Developer
Team Size: Four

Roles & Responsibilities:

- Appending product data onto existing SQL tables (see the sketch after this list).
- Reading the text file of all products into Excel format using Python.
- Loading data into the application for different subscribers to view the latest data.
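A sketch of the append step, assuming a MySQL backend (as listed under Databases) with illustrative database, table and file names:

    #!/bin/bash
    # Hypothetical append: load the latest product file onto the existing table.
    mysql --local-infile=1 -u sales_user -p sales_db -e "
      LOAD DATA LOCAL INFILE 'products_latest.csv'
      INTO TABLE products
      FIELDS TERMINATED BY ','
      IGNORE 1 LINES;"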


Project Description: SDSFT is a market data system for all the US exchanges (CME, LME, NYSE). Its main functionality includes maintaining sales orders and providing filtered data to subscribers. The system enabled subscribers to view the latest information for all products from the different exchanges.

Education & Credentials:

B.Tech in Information Technology, Paavai Engineering College (Anna University), 8.35 CGPA

Senior Secondary, 12th (CBSE), 71.20%, DAV Public School, Patna, Bihar

Secondary, 10th (CBSE), 71.10%, DAV Public School, Muzaffarpur, Bihar

Key Accomplishments:

- Received the Champion award for the R&F IT Platform Service Big Data project in 2015.
- Received the ACE Award in 2014 for automation projects.
- CBSE under-18 winner in Table Tennis doubles in 2007.
- Winner of several inter-state tournaments in Table Tennis, singles as well as doubles.

Personal Summary:

Date of Birth: 27th December 1989
Permanent Address: Bhell Colony, Q. No. F-5/5, P.O. Khabra, Dist. Muzaffarpur, Bihar – 843146
Languages Known: English, Hindi, basics of Tamil

I hereby declare that all the information furnished above is true to the best of my knowledge and belief.

Shiv Shakti (Applicant)                (Date)