
Shiv Shakti
[email protected] | Developer at Accenture Pvt. Ltd.

Job Objective:

Seeking exposure to Hadoop ecosystem and automation projects where my skills and work experience can be utilized to the fullest. I want to work with committed and dedicated people and to grow in and with the organization.

Summary:

- 4 years and 4 months of overall experience in Hadoop development and automation projects
- 1 year and 4 months of relevant experience as a Big Data professional, with expertise in handling structured and unstructured data using HiveQL, Sqoop, Oozie, Pig Latin, HDFS and other Hadoop ecosystem tools
- Sound experience with HDFS, MapReduce and Hadoop ecosystem components such as Hive, Pig, Sqoop, Oozie, Flume and MySQL databases
- Knowledge of automation projects using shell script, Python, MySQL and Core Java
- Ability to play a key role in the team in technical implementation as well as cross-team communication

Technical Proficiency:

Framework : Hadoop, Hive, Pig, Oozie, Sqoop, Flume, HBase
Languages : Shell script, Python, SQL, HQL, Pig Latin, Core Java, basics of HTML
Databases : MySQL
Platforms : Unix/Linux, Windows

Professional Experience:

Hadoop Developer | Accenture | Oct ’14 – present (1 year and 4 months)

Project #1
Project Name : R&F IT Platform Service Big Data
Client : Credit Suisse
Domain : Investment Banking
Duration : Oct 2014 – till date
Environment & Tools : Hadoop, Hive, Pig, Oozie, Sqoop, Flume, HBase
Team Size : Four

Roles & Responsibilities:

- Gathering logs from several systems and storing them in HDFS
- Writing MapReduce jobs to derive different outcomes from the log data
- Creating partitioned Hive tables to store the processed results in tabular format
- Writing HQL queries to analyze and process data stored in HDFS
- Writing Apache Pig scripts to process HDFS data
- Developing Sqoop scripts and jobs to import/export data between Hadoop (HDFS) and the MySQL database
- Creating Oozie workflows to automate several Hadoop processes (Hive, Sqoop, MapReduce, sending emails)
- Using MapReduce (Python) to develop Hadoop applications and jobs
- Setting up pseudo-distributed and multi-node Hadoop clusters; installing and configuring Hadoop ecosystem components: HDFS, Pig, Hive, Sqoop and HBase

Bangalore | Cell: +91-9036284261

Project Description : The Platform Services Big Data team is responsible for analyzing big data requirements from various applications (Clusternet/Marsnet/MET/Basel2/Basel3), processing all the data (logs/RDBMS tables) and arriving at conclusions that help management fix long-term issues related to the system/environment. The team also creates Oozie workflows that chain several Hadoop components together to run as one automated process.
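The Python MapReduce work described above can be illustrated with a minimal sketch in the style of a Hadoop Streaming mapper/reducer (illustrative only, not project code; the log line format and field positions are assumptions):

```python
# Illustrative sketch: count log records per severity level,
# in the style of a Hadoop Streaming mapper and reducer.
# The log format "<date> <time> <LEVEL> <message...>" is an assumption.
from collections import defaultdict


def map_log_line(line):
    """Mapper step: emit (log_level, 1) pairs from one log line."""
    parts = line.split()
    if len(parts) >= 3:
        yield parts[2], 1


def reduce_counts(pairs):
    """Reducer step: sum the counts for each key."""
    totals = defaultdict(int)
    for key, value in pairs:
        totals[key] += value
    return dict(totals)


if __name__ == "__main__":
    logs = [
        "2015-01-02 10:00:01 ERROR connection refused",
        "2015-01-02 10:00:02 INFO job started",
        "2015-01-02 10:00:03 ERROR timeout",
    ]
    pairs = (pair for line in logs for pair in map_log_line(line))
    print(reduce_counts(pairs))  # {'ERROR': 2, 'INFO': 1}
```

On a real cluster the two functions would run as separate streaming mapper and reducer scripts reading stdin; here they are composed locally so the sketch is self-contained.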

Project #2
Project Name : Social Media Dashboard
Client : Credit Suisse
Domain : Investment Banking
Duration : Feb 2015 – till present
Environment & Tools : Python, HQL, shell scripting, Hadoop framework
Team Size : Four

Roles & Responsibilities:

- Reading text files extracted from social media channels into CSV format using Python in the Hadoop environment
- Writing Python code to create page-level and post-level Excel files for every channel
- Developing a Python program to check for data inconsistencies in the text files
- Creating separate Hive tables for page- and post-level data for every channel
- Writing Hive queries to append the Excel data to the respective channel's table
- Creating a job to automate the Python program to run daily
- Automating Hive queries to append the data on a daily basis

Project Description : The client wanted to gauge the performance of various campaigns on their social media pages across channels such as Facebook, LinkedIn, Twitter, YouTube and Google+. The Digital Marketing team wanted to build a QlikView dashboard to track impression, engagement and conversion metrics for page-level as well as post-level activities across these channels.
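The text-to-CSV conversion and inconsistency checks above can be sketched as follows (a hypothetical example: the pipe delimiter and the three-column layout of channel, post id and impressions are assumptions, not the project's actual extract format):

```python
# Illustrative sketch: convert a pipe-delimited social media extract into
# CSV rows and flag inconsistent records. Column layout is an assumption.
import csv
import io

EXPECTED_COLUMNS = 3  # assumed fields: channel, post_id, impressions


def to_csv(text_lines):
    """Return (csv_string, bad_lines) from raw pipe-delimited lines."""
    out = io.StringIO()
    writer = csv.writer(out)
    bad = []
    for line in text_lines:
        fields = line.rstrip("\n").split("|")
        if len(fields) != EXPECTED_COLUMNS:
            bad.append(line)  # data inconsistency: wrong field count
            continue
        writer.writerow(fields)
    return out.getvalue(), bad


if __name__ == "__main__":
    rows, bad = to_csv(["facebook|123|4500", "twitter|456"])
    print(rows)  # facebook,123,4500
    print(bad)   # ['twitter|456']
```

A daily job would feed each channel's extract through a routine like this before the cleaned CSV is appended to the channel's Hive table.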

Automation Engineer | Accenture | Feb ’13 – Oct ’14 (1 year and 8 months)

Project #3
Project Name : Automated Environment Management
Client : Credit Suisse
Domain : Investment Banking
Duration : Feb 2013 – Oct 2014
Environment & Tools : Unix/Linux, shell scripting, Python, MySQL, JIRA
Team Size : Six

Roles & Responsibilities:

- Writing automation scripts using shell script/Python/SQL
- Automating deployment and manual processes on Linux/Unix boxes
- Testing scripts on sample data and applying them in the production environment
- Creating JIRA tickets to track the development progress of automation work

Project Description : The Environment Management team was responsible for automating all manual processes in Production Environment Management. The team was also responsible for developing and deploying the Environment Management dashboard, which displays dynamic information about the applications hosted in the environment.

Software Engineer | Irely Soft Services | Sep ’11 – Dec ’12 (1 year and 3 months)

Project #4
Project Name : Sales Order Management
Client : SDSFT (Securities Dealing Systems)
Duration : Sep 2011 – Dec 2012
Environment & Tools : Linux, Bash shell scripting, SQL, Python
Role : Software Developer
Team Size : Four

Roles & Responsibilities:

- Appending product data to already existing SQL tables
- Reading text files of all the products into Excel format using Python
- Loading data into the application for different subscribers to view the latest data

Project Description : SDSFT is a market data system for all the US exchanges (CME, LME, NYSE). Its main functionality includes maintaining sales orders and providing filtered data to subscribers. The client enabled subscribers to view the latest information for all products from the different exchanges.

Education & Credentials:

B.Tech in Information Technology, Paavai Engineering College (Anna University), 8.35 CGPA

Senior Secondary, 12th (CBSE), 71.20 %, DAV Public School, Patna, Bihar

Secondary, 10th (CBSE), 71.10 %, DAV Public School, Muzaffarpur, Bihar

Key Accomplishments:

- Received the Champion award for the Big Data Social Media project in 2015
- Received the ACE Award in 2014 for automation projects
- CBSE under-18 Table Tennis doubles winner in 2007

Personal Summary:

Date of Birth : 27th December 1989
Permanent Address : Bhel Colony, Q. No. F-5/5, P.O. Khabra, Dist. Muzaffarpur, Bihar - 843146
Languages Known : English, Hindi, basics of Tamil

I hereby declare that all the information furnished above is true to the best of my knowledge and belief.

Shiv Shakti
(Applicant)          (Date)