DataStage Developer CV

Uploaded by bhaskar-reddy on 07-Jul-2016

SUMMARY:

- More than 8 years of total IT experience developing Business Intelligence solutions, including building data warehouses, data marts and ETL for clients in major industry sectors such as Telecom, Pharmacy, Finance and Insurance
- More than 6 years of ETL tool experience with IBM Information Server DataStage and QualityStage 8.x and Ascential DataStage 7.x/6.0, designing, developing, testing and maintaining jobs using Designer, Manager, Director, Administrator and Debugger
- Experienced in troubleshooting DataStage jobs and addressing production issues such as performance tuning and fixing data issues
- Excellent knowledge of studying data dependencies using DataStage metadata and preparing job sequences for existing jobs to facilitate scheduling of multiple jobs
- Strong understanding of data warehouse principles: fact tables, dimension tables, star schema modeling, and the Ralph Kimball and Bill Inmon approaches
- Experienced in writing system specifications, translating user requirements into technical specifications, ETL source-to-target mapping documents and testing documents
- Experience integrating various data sources with multiple relational database (RDBMS) systems: Oracle, Teradata, Sybase, SQL Server, MS Access, DB2
- Worked on integrating data from flat files, COBOL files and XML files
- Extensively worked on extracting data from SAP using the ABAP Extract stage
- Experience in writing, testing and implementing triggers, procedures and functions at the database level and form level using PL/SQL
- Sound knowledge of UNIX shell scripting
- Knowledge of full life cycle development for building a data warehouse
- Good working knowledge of client-server architecture
- Articulate, with excellent communication and interpersonal skills and the ability to work in a team as well as individually
- Certified in IBM WebSphere DataStage v8.5

TECHNICAL SKILLS:

Data Warehouse: IBM Information Server DataStage and QualityStage 8.1.1, Ascential DataStage EE 7.x/6.5 (Designer, Director, Manager, Administrator), Parallel Extender 7.5.1/6.0, MetaStage 6.0, Business Objects 6.x
Dimensional Modeling: Data Modeling, Star Schema Modeling, Snowflake Modeling, Facts and Dimensions, Physical and Logical Data Modeling, Erwin 3.5.2/3.x
Reporting Tools: OBIEE 10g, Crystal Reports 6.x/5.x, Cognos, Business Objects 6.5
UNIX Tools: C shell, Korn shell, Bourne shell, Perl, AWK, vi, sed
Databases: Oracle 10g/9i/8i, IBM UDB DB2 9.1/9.7, Teradata V2R6/V2R12/V2R13, Sybase SQL Server 11.0, MS SQL Server 6.5/7.0, MS Access 7.0/97/2000, Excel
Languages: PL/SQL, T-SQL, SQL*Plus, C, VB, JDBC, XML

Page 2: Data Stage Cv

Professional Experience

Boston College, Boston, MA    Feb 2012 - Present
Sr. DataStage Developer

Boston College (BC) is a private Jesuit research university located in the village of Chestnut Hill, Massachusetts, USA. The main campus is bisected by the border between the cities of Boston and Newton. It has 9,200 full-time undergraduates and 4,000 graduate students. It is a member of the 568 Group and the Association of Jesuit Colleges and Universities. Boston College offers bachelor's degrees, master's degrees, and doctoral degrees through its nine schools and colleges. Boston College is currently ranked 31 in the National Universities ranking by U.S. News & World Report.

Responsibilities:

- Involved in all phases of the SDLC
- Responsible for creating detailed designs and source-to-target mappings
- Responsible for communicating with business users and project management to gather business requirements and translate them into ETL specifications
- Used DataStage/QualityStage Designer to import/export jobs, table definitions, custom routines and custom transformations
- Created Extract, Transform and Load (ETL) interfaces and gateways for the backend database
- Designed mappings from sources to operational staging targets using a star schema; implemented logic for Slowly Changing Dimensions (SCD)
- Extensive hands-on experience designing and developing parallel and server jobs
- Extensively worked on building DataStage jobs using stages such as Oracle Connector, Funnel, Transformer, Sequential File, Lookup, Join and Peek
- Extensively used the Sort, Merge, Aggregator, Peek, Data Set and Remove Duplicates stages
- Involved in the migration of DataStage jobs from development to QA and then to the production environment
- Created shared containers for use in multiple jobs
- Hands-on experience upgrading DataStage from v7.5 to Information Server 8.1.1
- Imported and exported repositories across DataStage projects using DataStage Designer
- Extensively worked with the DataStage Job Sequencer to schedule jobs to run in sequence
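The Slowly Changing Dimension (SCD) logic mentioned above was implemented inside DataStage jobs; as a hedged illustration only, a minimal Type 2 pattern can be sketched in Python (the column names and table layout here are invented for the example, not taken from the project):

```python
from datetime import date

# Minimal Slowly Changing Dimension Type 2 sketch.
# Each dimension row keeps effective_from / effective_to and a current flag;
# a changed attribute expires the old row and inserts a new current version.

def apply_scd2(dimension, incoming, today=date(2012, 1, 1)):
    """dimension: list of dicts with keys key, name, effective_from,
    effective_to, current. incoming: dict mapping key -> latest name."""
    for key, new_name in incoming.items():
        current = [r for r in dimension if r["key"] == key and r["current"]]
        if current and current[0]["name"] == new_name:
            continue                      # no change: keep the current row
        if current:                       # changed: expire the old version
            current[0]["effective_to"] = today
            current[0]["current"] = False
        dimension.append({                # insert the new current version
            "key": key, "name": new_name,
            "effective_from": today, "effective_to": None, "current": True,
        })
    return dimension
```

An unchanged row is passed through untouched, a changed row gets a closed-out history record plus a fresh current record, and a brand-new key simply gets its first version.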

Environment: IBM Information Server 8.1.1 DataStage and QualityStage (Designer, Director and Administrator), Oracle 10g, Cognos v10, PL/SQL, HP-UX 11, Toad for Oracle.


Freddie Mac, McLean, VA    Jun 2011 - Jan 2012
Sr. DataStage Developer

The Federal Home Loan Mortgage Corporation (FHLMC), known as Freddie Mac, is a public Government Sponsored Enterprise (GSE). The FHLMC was created in 1970 to expand the secondary market for mortgages in the US. Along with other GSEs, Freddie Mac buys mortgages on the secondary market, pools them, and sells them as mortgage-backed securities to investors on the open market. This secondary mortgage market increases the supply of money available for mortgage lending and increases the money available for new home purchases.

Responsibilities:

- Designed and developed jobs for extraction of data from different data feeds into an IBM DB2 database
- Coded many shell scripts for efficient job scheduling
- Worked on preparing test cases, testing ETL jobs and validating data
- Developed parallel jobs using various development/debug and processing stages (Aggregator, Change Capture, Change Apply, SAP ABAP Extract, IDoc, BAPI, Filter, Sort, Merge, Funnel and Remove Duplicates)
- Worked with the change management system on code migrations from Dev to QA to Prod environments
- Performed debugging on these jobs using the Peek stage by outputting the data to the job log or a stage
- Extensively worked on building ETL interfaces to read and write data from the DB2 database using the DB2 Enterprise and DB2 API stages
- Involved in functional and technical meetings; responsible for creating ETL source-to-target maps
- Modified existing jobs and hashed files according to changing business rules
- Loaded the historical data into the warehouse
- Developed jobs for transforming the data using stages such as Join, Merge, Lookup, Funnel, Transformer, Pivot and Aggregator
- Experience with the Autosys scheduling tool for automating the ETL process
- Involved in developing a control module for the complete process using Perl and UNIX scripting
- Worked on documenting technical design documents and source-to-target (STT) documents
- Involved in unit testing, integration testing, UAT and performance testing
- Worked with Embarcadero tools to interact with DB2
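The control module and scheduling scripts described above ran jobs only after their upstream jobs had finished. The actual project used Perl and shell with Autosys; purely as an illustrative sketch (job names and the runner callback are hypothetical), the dependency-ordering idea looks like this in Python:

```python
# Dependency-ordered job sequencer sketch: run each job only after all of
# its upstream jobs have completed, via a depth-first topological sort.

def run_in_order(dependencies, run_job):
    """dependencies: dict mapping job -> list of jobs it waits on.
    run_job: callback invoked once per job. Returns the execution order."""
    order, done = [], set()

    def visit(job, stack=()):
        if job in done:
            return
        if job in stack:
            raise ValueError(f"dependency cycle at {job}")
        for upstream in dependencies.get(job, []):
            visit(upstream, stack + (job,))
        done.add(job)
        run_job(job)          # in a real scheduler this would launch the job
        order.append(job)

    for job in dependencies:
        visit(job)
    return order
```

A cycle in the dependency graph raises immediately instead of deadlocking, which mirrors what a scheduler's dependency validation would catch.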

Environment: IBM Information Server 8.5 (Designer, Director and Administrator), QualityStage, Test Director, ClearCase, AutoSys, K-shell scripts, SAP ECC R3, SAP BW, DS Extract PACK for SAP 5.1, IBM DB2 9.1, AIX 5.3, Embarcadero for DB2

First Tennessee Bank, Memphis, TN    Jan 2010 - May 2011
Sr. DataStage Developer
Project: TSYS to FDR conversion

First Tennessee Bank is now the largest Tennessee-based bank with any genuine influence beyond the borders of the state. Chartered in 1864 as First National Bank, First Horizon National Corporation (FHN) has grown to be one of the largest bank holding companies in the United States in terms of asset size. FHN's approximately 6,000 employees provide financial services through about 180 bank locations in and around Tennessee and 21 FTN Financial Group offices in the U.S. and abroad. AARP and Working Mother magazine have recognized FHN as one of the nation's best employers.

Responsibilities:

- Interacted with the end-user community (FDR) to understand business requirements and identify data sources
- Analyzed the existing informational sources and methods to identify problem areas and make recommendations for improvement; this required a detailed understanding of the data sources and researching possible solutions
- Used Classic Federation Server to move files between the mainframe and UNIX
- Used DataStage stages, namely the z/OS File stage with COBOL copybooks to extract data from the mainframe through Classic Federation Server, plus Column Export, Column Import, Sequential File, Transformer, Aggregator, Sort, Data Set, Join, Lookup, Change Capture, Funnel, Peek and Row Generator stages, in accomplishing the ETL coding
- Developed job sequencers with proper job dependencies, job control stages and triggers
- Used the Zeke job scheduler for automating the monthly regular run of the DW cycle in both production and UAT environments
- Reviewed reports in the mainframe TSO ISPF environment and allocated datasets on the mainframe using Classic Federation jobs
- Created shared containers to simplify job design
- Performed performance tuning of the jobs by interpreting the performance statistics of the jobs developed
- Documented ETL test plans, test cases, test scripts and validations based on design specifications for unit, system, functional and regression testing; prepared test data for testing, error handling and analysis
- Worked with the change management system on code migrations from Dev to QA to Prod environments
- Extensively worked on building ETL interfaces to read and write data from the DB2 database using the DB2 Enterprise and DB2 API stages
- Involved in functional and technical meetings; responsible for creating ETL source-to-target maps
- Modified existing jobs and hashed files according to changing business rules
- Loaded the historical data into the warehouse
- Developed jobs for transforming the data using stages such as Join, Merge, Lookup, Funnel, Transformer, Pivot and Aggregator
- Experience with the Zeke scheduling tool for automating the ETL process

Environment: IBM Information Server 8.5 (Designer, Director and Administrator), QualityStage, Test Director, ClearCase, Zeke, K-shell scripts, Mainframe TSO ISPF, IBM DB2 9.1, AIX 5.3, WinSQL for DB2, PeopleSoft 9.2.

Pfizer Pharmaceutical, Bridgewater, NJ    Apr 2009 - Dec 2010
DataStage Developer

Pfizer Pharmaceutical is dedicated to discovering and developing new, and better, ways to prevent and treat disease and improve health and well-being for people around the world. The data repository resides on various platforms and is sourced by various Pfizer legacy systems in order to provide patient-centric care. The main goal of Pfizer is to build an integrated enterprise data warehouse for all of their applications while reducing operational costs.

Responsibilities:

- Responsible for detailed design and development of the Pfizer data warehouse
- Used DataStage Manager to define table definitions, custom routines and custom transformations
- Communicated with business users and management to gather business requirements and translate them into ETL specifications
- Designed mappings from sources to operational staging targets using a star schema; implemented logic for Slowly Changing Dimensions (SCD)
- Experience with Parallel Extender stages such as Funnel and Remove Duplicates
- Used built-in, plug-in and custom stages for extraction, transformation and loading of the data; provided derivations over DataStage links
- Extensively wrote custom routines and transformations per the business requirements
- Developed various jobs using the CFF, ODBC, Lookup, Aggregator and Sequential File stages
- Extensively used the Sort, Merge, Aggregator, Peek, Data Set, DB2 and Remove Duplicates stages
- Involved in the migration of DataStage jobs from the development to the production environment
- Created shared containers for use in multiple jobs
- Imported and exported repositories across DataStage projects
- Extensively worked with the DataStage Job Sequencer to schedule jobs to run in sequence
- Used shared containers to reuse specific business logic across various jobs, eliminating 30% of redevelopment

Environment: Ascential DataStage 7.5 (Manager, Designer, Director), DB2, Oracle 8i, PL/SQL, AIX, Toad 7.

CITIZENS BANK, Providence, RI    Jan 2007 - Feb 2009
DataStage Developer

Responsibilities:

- Developed the source-to-target process and mapping documentation
- Designed and developed jobs for extraction of data from different data feeds into an IBM DB2 database

Page 6: Data Stage Cv

- Developed jobs for handling different data transformations per specified requirements using stages such as Join, Merge, Lookup, Transformer and Aggregator
- Used the Change Capture, Sort, Merge and Funnel stages for developing the delta process
- Designed and developed shared containers that can be reused by other parallel jobs
- Developed jobs for loading data into the DB2 target database using stages such as DB2 Bulk Loader and DB2 API
- Designed the unit testing and integration testing process and the necessary documentation
- Involved in performance tuning to reduce time consumption
- Used DataStage Manager for importing and exporting jobs into different projects
- Used UNIX and Perl scripts to execute jobs, and used DataStage Director for scheduling, executing and monitoring jobs
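The delta process mentioned above compares an "after" snapshot against a "before" snapshot on a key and classifies each row as an insert, update or delete, which is what DataStage's Change Capture stage does. As a hedged, minimal illustration (the key/value shapes are invented for the example):

```python
# Change-capture sketch: diff an "after" snapshot against a "before"
# snapshot keyed on a business key, classifying each keyed row.

def change_capture(before, after):
    """before/after: dict mapping key -> row payload (any comparable value).
    Returns (inserts, updates, deletes) as sorted lists of keys."""
    inserts = sorted(k for k in after if k not in before)
    deletes = sorted(k for k in before if k not in after)
    updates = sorted(k for k in after
                     if k in before and after[k] != before[k])
    return inserts, updates, deletes
```

Downstream, only the three small delta sets need to be applied to the target, rather than reloading the full snapshot.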

Environment: Ascential DataStage 7.1 Enterprise Edition (Designer, Manager, Director and Administrator), Teradata V2R6/V2R12, Oracle 9i, Aqua Data Studio, Toad, Shell Scripts, AIX 5.1.

==============================================================================

Professional Summary:

- Over 7 years of experience in data modeling, data warehouse design, development and testing using ETL and the data migration life cycle with IBM WebSphere DataStage 8.x/7.x
- Expertise in building Operational Data Stores (ODS), data marts and Decision Support Systems (DSS) using multidimensional models (Kimball and Inmon) and star and snowflake schema design
- Experience in analyzing the data generated by the business process, defining the granularity, mapping data elements from source to target, and creating indexes and aggregate tables for data warehouse design and development
- Data processing experience in designing and implementing data mart applications, mainly transformation processes, using the ETL tool DataStage (v8.0/7), designing and developing jobs with DataStage Designer, DataStage Manager, DataStage Director and DataStage Debugger
- Efficient in all phases of the development life cycle, coherent with data cleansing, data conversion, performance tuning and system testing
- Excellent at using the highly scalable parallel processing infrastructure of DataStage Parallel Extender
- Efficient in incorporating various data sources such as Oracle, MS SQL Server, DB2, Sybase, XML and flat files into the staging area
- Experience in mapping server/parallel jobs in DataStage to populate tables in data warehouses and data marts
- Proven track record in addressing production issues such as performance tuning and enhancement
- Excellent knowledge of creating and managing conceptual, logical and physical data models
- Experience in dimensional and relational database design
- Strong in data warehousing concepts and the dimensional star schema and snowflake schema methodologies
- Expert in unit testing, system integration testing, implementation, maintenance and performance tuning
- Experience with scheduling tools such as AutoSys for automating and scheduling job runs
- Excellent with PL/SQL, T-SQL, stored procedures, database triggers and SQL*Loader
- Experience in UNIX shell scripting
- Excellent knowledge of the Windows, UNIX and Macintosh operating systems, and of databases including Oracle, SQL Server and DB2
- Experience in implementing quality processes such as ISO 9001:2000 audits
- Detail oriented with good problem-solving, organizational and analysis skills; highly motivated and adaptive with the ability to grasp things quickly
- Ability to work effectively and efficiently in a team and individually, with excellent interpersonal, technical and communication skills

Education Qualifications: Master's in Electrical and Computer Engineering.

Skill Sets: IBM Information Server v8.1 (DataStage, QualityStage, Information Analyzer), Ascential DataStage v7.5 (Designer, Director, Manager, Parallel Extender), Oracle 8i/9i/10g, MS SQL Server 2005/2008, DB2 UDB, MS Access, Sybase, SQL, PL/SQL, SQL*Plus, flat files, sequential files, TOAD 9.6, Erwin, Microsoft Visio, Oracle Developer 2000, SQL*Loader, IBM Cognos 8.0, IBM AIX UNIX, Red Hat Enterprise Linux 4, UNIX shell scripting, Windows NT/XP, Macintosh, C, C++, VB scripting.

Project Summary:

Prudential Financial, Newark, NJ    1/2009 - Present
DataStage Developer

Prudential Financial, Inc., a Fortune Global 500 company, provides insurance, investment management, and other financial products and services to both retail and institutional customers throughout the United States and in over 30 other countries. The project objective was to collect, organize and store data from different operational data sources to provide a single source of integrated and historical data for reporting, analysis and decision support to improve client services.

Hardware/Software: IBM DataStage 8.0 (Designer, Director, Manager, Parallel Extender), Oracle 10g, SQL Server 2008, DB2 UDB, flat files, sequential files, Autosys, TOAD 9.6, SQL*Plus, AIX UNIX, IBM Cognos 8.0

Responsibilities:

- Interacted with the end-user community to understand business requirements and identify data sources
- Analyzed the existing informational sources and methods to identify problem areas and make recommendations for improvement; this required a detailed understanding of the data sources and researching possible solutions
- Implemented the dimensional model (logical and physical) in the existing architecture using Erwin
- Studied the PL/SQL code developed to relate the source and target mappings
- Helped prepare the source-to-target mapping document
- Worked with DataStage Manager for importing metadata from the repository, creating new job categories and creating new data elements
- Designed and developed ETL processes using DataStage Designer to load data from Oracle, MS SQL, flat files (fixed width) and XML files into the staging database, and from staging into the target data warehouse database
- Used DataStage stages, namely Hashed File, Sequential File, Transformer, Aggregator, Sort, Data Set, Join, Lookup, Change Capture, Funnel, Peek and Row Generator, in accomplishing the ETL coding
- Developed job sequencers with proper job dependencies, job control stages and triggers
- Used QualityStage to ensure consistency, removing data anomalies and spelling errors from the source information before it was delivered for further processing
- Extensively used DataStage Director for monitoring job logs to resolve issues
- Involved in performance tuning and optimization of DataStage mappings using features like pipeline and partition parallelism and data/index caching to manage very large volumes of data
- Documented ETL test plans, test cases, test scripts and validations based on design specifications for unit, system and functional testing; prepared test data for testing, error handling and analysis
- Used the Autosys job scheduler for automating the monthly regular run of the DW cycle in both production and UAT environments
- Verified the Cognos reports by extracting data from the staging database using PL/SQL queries
- Wrote configuration files for performance in the production environment
- Participated in weekly status meetings
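Partition parallelism, mentioned in the tuning bullet above, rests on the idea that rows are distributed across processing nodes by hashing a key, so all rows sharing a key land on the same node and can be aggregated locally. A hedged Python sketch of that distribution step only (column names and node count are invented; DataStage's own partitioners are far more sophisticated):

```python
# Hash-partitioning sketch: assign each row to a partition by hashing its
# key column, so rows with the same key are processed on the same node.

def hash_partition(rows, key, nodes):
    """rows: iterable of dicts; key: column name; nodes: partition count."""
    partitions = [[] for _ in range(nodes)]
    for row in rows:
        # Stable arithmetic hash: Python's built-in hash() of strings is
        # randomized per process, so avoid it for a reproducible sketch.
        h = sum(ord(c) for c in str(row[key]))
        partitions[h % nodes].append(row)
    return partitions
```

Because partitioning is deterministic on the key, a per-key aggregation run independently on each partition produces the same totals as a single sequential pass.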

Kaiser Permanente, Pleasanton, CA    05/2007 - 12/2008
ETL Designer / DataStage Developer

Kaiser Permanente, an integrated managed care organization, is the largest health care organization in the United States. The Health Plan and Hospitals operate under state and federal non-profit tax status, while the Medical Groups operate as for-profit partnerships or professional corporations in their respective regions.

The project was to design, develop and maintain a data warehouse for their vendor's data, internal reference data and work with their DBA to ensure that the physical build adheres to the model blueprint.

Hardware/Software: DataStage 7.5.1 Enterprise Edition, QualityStage, flat files, Oracle 10g, SQL Server 2005/2008, Erwin 4.2, PL/SQL, UNIX, Windows NT/XP

Responsibilities:

- Involved in understanding business processes and coordinated with business analysts to gather specific user requirements
- Studied the existing data sources to determine whether they support the required reporting, and generated change data capture requests
- Used QualityStage to check the data quality of the source system prior to the ETL process
- Worked closely with DBAs to develop the dimensional model using Erwin and created the physical model using forward engineering
- Worked with DataStage Administrator for creating projects and defining the hierarchy of users and their access
- Defined the granularity, aggregation and partitioning required at the target database
- Involved in creating specifications for ETL processes, finalized requirements and prepared the specification document
- Used DataStage as an ETL tool to extract data from source systems and loaded the data into the SQL Server database
- Imported table/file definitions into the DataStage repository
- Performed ETL coding using the Hashed File, Sequential File, Transformer, Sort, Merge and Aggregator stages; compiled, debugged and tested, and extensively used the available stages to redesign DataStage jobs for the required integration
- Extensively used DataStage tools such as InfoSphere DataStage Designer and InfoSphere DataStage Director for developing jobs and viewing log files for execution errors
- Controlled job execution using sequencers and used the notification activity to send email alerts
- Ensured that the data integration design aligned with the established information standards
- Used Aggregator stages to sum the key performance indicators used in decision support systems
- Scheduled job runs using DataStage Director, and used DataStage Director for debugging and testing
- Created shared containers to simplify job design
- Performed performance tuning of the jobs by interpreting the performance statistics of the jobs developed
- Documented ETL test plans, test cases, test scripts and validations based on design specifications for unit, system, functional and regression testing; prepared test data for testing, error handling and analysis

Macy's, Atlanta, GA    12/2005 - 04/2007
ETL Developer

Macy's (NYSE: M) is a chain of mid-to-high range American department stores delivering fashion and affordable luxury to customers coast to coast. Online shopping is offered through macys.com. Its selection of merchandise can vary significantly from location to location, resulting in the exclusive availability of certain brands in only higher-end stores.

The aim of the Project was to build a data warehouse, which would keep historical data according to a designed strategy. Flat files, Oracle tables were part of the source data, which came in on a daily, weekly, monthly basis.

Hardware/Software: IBM Information Server DataStage 7.5, Oracle 10g, SQL, PL/SQL, UNIX, SQL*Loader, Autosys, Business Objects 6.1, Windows 2003, IBM AIX 5.2/5.1, HP Mercury Quality Center 9.0

Responsibilities:

- Involved in understanding business processes and coordinated with business analysts to gather specific user requirements
- Used Information Analyzer for column analysis, primary key analysis and foreign key analysis
- Extensively worked on DataStage jobs to split bulk data into subsets and dynamically distribute it across all available processors to achieve the best job performance
- Developed ETL jobs per business rules using the ETL design document
- Converted complex job designs into separate job segments and executed them through a job sequencer for better performance and easier maintenance
- Used DataStage maps to load data from source to target
- Enhanced the reusability of the jobs by creating and deploying shared containers and multiple instances of the jobs
- Imported the data residing in the host systems into the data mart developed in Oracle 10g
- Extensively used Autosys to automate the scheduling of jobs on a daily, bi-weekly, weekly and monthly basis with proper dependencies
- Wrote complex SQL queries using joins, subqueries and correlated subqueries
- Performed unit testing and system integration testing by developing and documenting test cases in Quality Center
- Validated the reports generated with Business Objects using PL/SQL queries
- Worked on troubleshooting, performance tuning and performance monitoring for the enhancement of DataStage jobs and builds across the Development, QA and PROD environments
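One of the bullets above mentions correlated subqueries, where the inner query references a column of the outer row. A small runnable illustration via Python's built-in sqlite3 module (the table and data are invented, not from the project; the actual work was against Oracle):

```python
import sqlite3

# Correlated-subquery sketch: for each store, select the sale rows whose
# amount exceeds that store's own average. The inner SELECT references the
# outer alias s.store, which is what makes the subquery correlated.

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE sales (store TEXT, amount REAL);
    INSERT INTO sales VALUES
        ('north', 100), ('north', 300), ('south', 50), ('south', 70);
""")
rows = conn.execute("""
    SELECT store, amount FROM sales AS s
    WHERE amount > (SELECT AVG(amount) FROM sales WHERE store = s.store)
    ORDER BY store
""").fetchall()
# north's average is 200 and south's is 60, so only the above-average
# row from each store survives the filter.
```

The same shape of query works in Oracle; only the connection plumbing differs.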

Citibank Inc., New York, NY    12/2004 - 11/2005
DataStage Developer

Citibank, the leading global banking company, has some 200 million customer accounts and does business in more than 100 countries, providing services to consumers, corporations, governments and institutions.

The project was to transform the data coming from various sources through multiple stages before being loaded into the data warehouse and maintenance.

Hardware/Software: Ascential DataStage 7.0 (Designer, Manager, Director), Oracle 9i, MS Access, SQL Server 2000/2005, SQL, PL/SQL, Toad, UNIX

Responsibilities:

- Involved in understanding business processes to learn business requirements
- Extracted data from different systems into the source area; mainly involved in ETL development
- Defined and implemented approaches to load and extract data from the database using DataStage
- Worked closely with the data warehouse architect and business intelligence analyst in developing solutions
- Used Erwin for data modeling (i.e., modifying the staging and SQL scripts in the Oracle and MS Access environments)
- Involved in design and source-to-target mappings from sources to operational staging targets using DataStage Designer
- Performed ETL coding using the Hashed File, Sequential File, Transformer, Sort, Merge and Aggregator stages; compiled, debugged and tested, and extensively used the available stages to redesign DataStage jobs for the required integration
- Executed jobs through a sequencer for better performance and easier maintenance
- Involved in unit, performance and integration testing of DataStage jobs
- Used DataStage Director to run and monitor the jobs for performance statistics
- Involved in performance tuning of the jobs
- Used T-SQL to validate the data generated at the OLAP server

Wipro Technologies, Bangalore    08/2003 - 11/2004
Quality Assurance Engineer

Wipro Technologies is an Indian multinational and a leading provider of integrated business, technology and process solutions on a global delivery platform.

Worked as a QA tester on a web-based e-billing application. The application has two modules, accounts payable (AP) and accounts receivable (AR), to keep track of transactions.

Hardware/Software: Windows XP, ASP.NET, Test Director 7.2, WinRunner 7.0, SQL Server 2000

Responsibilities:

- Actively participated in decision making and QA meetings, and regularly interacted with the business analysts and development team to gain a better understanding of the business process, requirements and design
- Analyzed business requirements from a black-box testing perspective, along with the system architecture/design, and converted them into functional requirements/test cases
- Used Test Director to document the requirements and created traceability matrices for the requirements

Page 11: Data Stage Cv

- Developed a test plan that included the scope of the release, entrance and exit criteria, and the overall test strategy; created detailed test cases and test sets and executed them manually
- Performed cross-browser testing to verify that the application provides accurate information in different browsers (IE, Netscape, Firefox, Safari)
- Extensively used output and checkpoint verification for the UI properties and values using VB scripting
- Performed back-end database verification manually, and used WinRunner to automatically verify the database against the values entered during automated testing
- Performed back-end integration testing to ensure data consistency on the front end by writing and executing SQL queries; provided management with metrics, reports and schedules, and was responsible for entering and tracking bugs
- Ensured that each defect was always written with a great level of detail

Certifications: See above
==========================================================================================

- Over 6 years of dynamic career reflecting pioneering experience and high performance in system analysis, design, development and implementation of relational database and data warehousing systems using IBM DataStage 8.0.1/7.x/6.x/5.x (InfoSphere Information Server, WebSphere, Ascential DataStage).
- Excellent experience in designing, developing, documenting and testing ETL jobs and mappings in server and parallel jobs using DataStage to populate tables in data warehouses and data marts.
- Proficient in developing strategies for Extraction, Transformation and Loading (ETL).
- Expert in designing parallel jobs using various stages like Join, Merge, Lookup, Remove Duplicates, Filter, Dataset, Lookup File Set, Complex Flat File, Modify, Aggregator and XML.
- Expert in designing server jobs using various stages like Sequential File, ODBC, Hashed File, Aggregator, Transformer, Sort, Link Partitioner and Link Collector.
- Experienced in integration of various data sources (DB2-UDB, SQL Server, PL/SQL, Oracle, Teradata, XML and MS Access) into the data staging area.
- Expert in working with DataStage Manager, Designer, Administrator and Director.
- Experience in analyzing the data generated by the business process, defining the granularity, source-to-target mapping of the data elements, and creating indexes and aggregate tables for data warehouse design and development.
- Excellent knowledge of studying data dependencies using metadata stored in the repository and preparing batches for existing sessions to facilitate scheduling of multiple sessions.
- Proven track record in troubleshooting DataStage jobs and addressing production issues like performance tuning and enhancement.
- Expert in working on various operating systems like UNIX AIX 5.2/5.1, Sun Solaris V8.0 and Windows 2000/NT.
- Proficient in writing, implementing and testing triggers, procedures and functions in PL/SQL and Oracle.
- Experienced in database programming for data warehouses (schemas); proficient in dimensional modeling (star schema modeling and snowflake modeling).
- Expertise in UNIX shell scripts using K-shell for the automation of processes and scheduling DataStage jobs using wrappers.
- Experience in using software configuration management tools like Rational ClearCase/ClearQuest for version control.
- Experienced in data modeling as well as reverse engineering using Erwin, Oracle Designer, MS Visio, SQL Server Management Studio, SSIS, SSRS and stored procedures.
- Expert in unit testing, system integration testing, implementation and maintenance of database jobs.
- Effective in cross-functional and global environments, managing multiple tasks and assignments concurrently, with effective communication skills.

EDUCATIONAL QUALIFICATION: Bachelors in Electronics and Communication

TECHNICAL SKILLS:

ETL Tools

DataStage: IBM WebSphere DataStage and QualityStage 8.0, Ascential DataStage 7.5.2/6.0/5.1, ProfileStage 7.0, SSIS (SQL Server 2005), Data Integrator.

Business Intelligence tools

Business Objects, Brio, SSRS (SQL Server 2005), IBM Cognos 8 BI

Development Tools and Languages

SQL, C, C++, UNIX Shell Scripting, Perl, PL/SQL, Oracle

Testing Tools

Auto Tester, Test Director, Lotus Notes

Data Modeling Tools

Erwin 4.0, Sybase Power Developer, SSIS, SSRS

Operating Systems

HP-UX, IBM-AIX 5.3, Windows 95/98/2000/NT, Sun Solaris, Red-Hat Linux, MS SQL Server 2000/2005/2008, MS Access

WORK EXPERIENCE:

Confidential, CA Nov 2010 - Present ETL Developer

NetApp Inc. is a leading network appliance manufacturer and data storage company, providing network appliances such as hard disks and disk shelves for small and large business owners, along with efficient data storage facilities. The main aim is to provide a variety of services such as data storage, data analysis, data warehouses and data marts, adopting consistent, tailored processes in order to fulfill its promise of commitment and reliability to customers.

- Involved as primary on-site ETL Developer during the analysis, planning, design, development and implementation stages of projects using IBM WebSphere software (QualityStage v8.1, Web Service, Information Analyzer, ProfileStage, WISD of IIS 8.0.1).
- Prepared Data Mapping Documents and designed the ETL jobs based on the DMD with the required tables in the Dev environment.
- Active participation in decision making and QA meetings; regularly interacted with the Business Analysts and development team to gain a better understanding of the business process, requirements and design.
- Used DataStage as an ETL tool to extract data from source systems and loaded the data into the Oracle database.
- Designed and developed DataStage jobs to extract data from heterogeneous sources, applied transformation logic to the extracted data and loaded it into data warehouse databases.
- Created DataStage jobs using different stages like Transformer, Aggregator, Sort, Join, Merge, Lookup, Data Set, Funnel, Remove Duplicates, Copy, Modify, Filter, Change Data Capture, Change Apply, Sample, Surrogate Key, Column Generator, Row Generator, etc.
- Extensively worked with Join, Lookup (normal and sparse) and Merge stages.
- Extensively worked with Sequential File, Dataset, File Set and Lookup File Set stages.
- Extensively used parallel stages like Row Generator, Column Generator, Head and Peek for development and debugging purposes.
- Used the DataStage Director and its run-time engine to schedule running the solution, testing and debugging its components, and monitoring the resulting executable versions on an ad hoc or scheduled basis.
- Developed complex stored procedures using input/output parameters, cursors, views and triggers, and complex queries using temp tables and joins.
- Converted complex job designs into different job segments and executed them through job sequencers for better performance and easy maintenance.
- Created job sequences.
- Maintained the data warehouse by loading dimensions and facts as part of the project; also worked on different enhancements to the fact tables.
- Created a shell script to run DataStage jobs from UNIX and then scheduled this script through a scheduling tool.
- Coordinated with team members and administered all onsite and offshore work packages.
- Analyzed performance and monitored work with capacity planning.
- Performed performance tuning of the jobs by interpreting performance statistics of the jobs developed.
- Documented ETL test plans, test cases, test scripts and validations based on design specifications for unit testing, system testing and functional testing; prepared test data for testing, error handling and analysis.
- Participated in weekly status meetings.
- Developed a Test Plan that included the scope of the release, entrance and exit criteria and the overall test strategy. Created detailed Test Cases and Test Sets and executed them manually.
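The UNIX-to-scheduler hand-off described above usually centers on a small wrapper around IBM's `dsjob` command-line interface. The sketch below is illustrative, not the script from this project: the function and job names are invented, and it assumes `dsjob -run -jobstatus`, which waits for the job and reports its finishing status through the exit code (1 = finished OK, 2 = finished with warnings).

```shell
# Minimal sketch of a UNIX wrapper for launching a DataStage job (names illustrative).
# Assumes IBM's dsjob CLI: with -run -jobstatus the command blocks until the job
# completes and exits with the job's finishing status.
run_ds_job() {
  project=$1
  job=$2
  dsjob -run -jobstatus "$project" "$job"
  rc=$?
  case $rc in
    1) echo "$job: finished OK" ;;
    2) echo "$job: finished with warnings" ;;
    *) echo "$job: FAILED (dsjob exit code $rc)" >&2
       return 1 ;;
  esac
}
```

A scheduling tool (cron, Autosys, etc.) can then invoke the wrapper and branch on its exit status instead of parsing DataStage logs.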

Environment: IBM WebSphere DataStage 8.1 Parallel Extender, Web Services, QualityStage 8.1 (Designer, Director, Manager), Microsoft Visio, IBM AIX 4.2/4.1, IBM DB2, SQL Server, Teradata, Oracle 11g, Queryman, UNIX, Windows.

Confidential, NJ Jan 2010 - Oct 2010 Lead Sr. DataStage Developer

Project was to design and develop an enterprise data warehouse: extract data from heterogeneous source systems, transform it using business logic and load it into the data warehouse.

- Used the DataStage Designer to develop processes for extracting, cleansing, transforming, integrating and loading data into staging tables.
- Extensively used ETL to load data from an IBM DB2 database, XML and flat file sources to an Informix database server.
- Involved in the analysis, planning, design, development and implementation phases of projects using IBM WebSphere software (QualityStage v8.0.1, Web Service, Information Analyzer, ProfileStage, WISD of IIS 8.0.1).
- Developed complex jobs using various stages like Lookup, Join, Transformer, Dataset, Row Generator, Column Generator, Sequential File, Aggregator and Modify stages.
- Created queries using joins and CASE statements to validate data in different databases.
- Created queries to compare data between two databases to make sure the data matched.
- Used the DataStage Director and its run-time engine to schedule running the solution, testing and debugging its components, and monitoring the resulting executable versions on an ad hoc or scheduled basis.


- Created shared containers to incorporate complex business logic into jobs.
- Monitored the DataStage jobs on a daily basis by running the UNIX shell script, and force-started a job whenever it failed.
- Created and modified batch scripts to FTP files from different servers to the DataStage server.
- Extensively used the Slowly Changing Dimension Type 2 approach to maintain history in the database.
- Created job sequencers to automate the jobs.
- Modified UNIX shell scripts to run job sequencers from the mainframe job.
- Created parameter sets to assign values to jobs at run time.
- Standardized the nomenclature used to define the same data by users from different business units.
- Created multiple-layer reports providing comprehensive, detailed reporting with drill-through facility.
- Used Parallel Extender for parallel processing to improve performance when extracting data from the sources.
- Worked with metadata definitions and import/export of DataStage jobs using DataStage Manager.
- Provided the logical data model design, generated the database, resolved technical issues and loaded data into multiple instances.
- Implemented PL/SQL scripts in accordance with the necessary business rules and procedures.
- Developed PL/SQL procedures and functions to support the reports by retrieving data from the data warehousing application.
- Used PL/SQL programming to develop stored procedures/functions and database triggers.

Environment: IBM WebSphere DataStage 8.0.1 Parallel Extender, Web Services, QualityStage 8.0 (Designer, Director, Manager), Microsoft Visio, IBM AIX 4.2/4.1, IBM DB2, SQL Server 2000, Teradata, Oracle 11g, Queryman, BMQ, UNIX, Windows.

Confidential, VA Oct 2008 - Dec 2009 Lead DataStage ETL Developer

Project involved the design and development of a group insurance system, which processes claims for group insurance. It covers benefits with subsystems covering Term Life Insurance, Medical Indemnity and Managed Health Care.

Data Modeling:

- Gathered and analyzed the requirements of the in-house business users for data warehousing from JAD sessions.
- Collected information about different entities and attributes by studying the existing ODS and reverse engineering it into Erwin.
- Defined the primary keys and foreign keys for the entities; defined the query views, index options and relationships.
- Created the logical schema using Erwin 4.0 and created the dimensional model for building the cubes.
- Designed staging and error-handling tables keeping in view the overall ETL strategy.
- Assisted in creating the physical database by forward engineering.

ETL Process:

- Extracted data from source systems, transformed it and loaded it into the Oracle database according to the required provisions.
- Primary on-site technical lead during the analysis, planning, design, development and implementation stages of data quality projects using Integrity (now known as QualityStage).
- Involved in system analysis, design, development, support and documentation.
- Created objects like tables, views, materialized views, procedures and packages using Oracle tools like PL/SQL, SQL*Plus and SQL*Loader, and handled exceptions.


- Involved in database development by creating Oracle PL/SQL functions, procedures, triggers, packages, records and collections.
- Created views to hide the actual tables and to eliminate the complexity of large queries.
- Created various indexes on tables to improve performance by eliminating full table scans.
- Used the DataStage Designer to develop processes for extracting, cleansing, transforming, integrating and loading data into data marts.
- Created source table definitions in the DataStage Repository.
- Identified source systems, their connectivity, related tables and fields, and ensured data suitability for mapping.
- Generated surrogate IDs for the dimensions in the fact table for indexed and faster access to data.
- Created hash tables with referential integrity for faster table lookup and for transforming the data representing valid information.
- Used built-in as well as complex transformations.
- Used DataStage Manager to manage the metadata repository and for import/export of jobs.
- Implemented parallel extender jobs for better performance using stages like Join, Merge, Sort, Lookup and Transformer with different source files, complex flat files and XML files.
- Optimized job performance by carrying out performance tuning.
- Created stored procedures to conform to the business rules.
- Used Aggregator stages to sum the key performance indicators used in decision support systems and for the granularity required in the DW.
- Tuned DataStage transformations and jobs to enhance their performance.
- Used the DataStage Director and its run-time engine to schedule running the solution, testing and debugging its components, and monitoring the resulting executable versions on an ad hoc or scheduled basis.
- Scheduled DataStage jobs using the Autosys scheduling tool.
- Prepared the documentation of the Data Acquisition and Interface System Design.
- Assigned tasks and provided technical support to the development team.
- Monitored the development activities of the team and updated management.
- Created complex reports using the Cognos reporting tool.
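Surrogate-ID generation of the kind mentioned above can be illustrated outside DataStage in a few lines of awk. This is a sketch only: the file name and pipe-delimited layout (natural key in the first field) are assumptions, standing in for what a Surrogate Key stage would do inside a job.

```shell
# Illustrative sketch: assign a dense surrogate ID to each distinct natural key.
# Sample input created inline so the snippet is self-contained.
cat > dim_input.txt <<'EOF'
CUST01|Alice
CUST02|Bob
CUST01|Alice
EOF
# First time a key is seen, allocate the next ID; every row is emitted with its ID.
awk -F'|' '!($1 in sk) {sk[$1] = ++n} {print sk[$1] "|" $0}' dim_input.txt
```

Repeated natural keys receive the same surrogate value, which is the property the dimension load depends on.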

Environment: IBM / Ascential DataStage E.E./7.5 (Manager, Designer, Director, Parallel Extender), QualityStage 7.5, DataStage BASIC language expressions, Autosys, Erwin 4.0, Windows NT, UNIX, Oracle 9i, SQL Server, Cognos, sequential files, .csv files.

Confidential, PA Jan 2007 - Sep 2008 Sr. DataStage Developer

As a DW developer, designed, developed and deployed DataStage jobs and associated functionality. The warehouse employed highly complex data transformations, including Slowly Changing Dimensions and a series of stored procedures, which made performance tuning and efficient mapping highly critical. Along with designing jobs from scratch, rewrote existing code to enhance performance and troubleshoot errors in both DataStage and Oracle 10g.

Responsibilities:

- Used IBM DataStage Designer to develop jobs for extracting, cleaning, transforming and loading data into data marts/data warehouse.
- Developed several jobs to improve performance by reducing runtime using different partitioning techniques.
- Used different stages of DataStage Designer like Lookup, Join, Merge, Funnel, Filter, Copy, Aggregator, Sort, etc.
- Read complex flat files from the mainframe by using the Complex Flat File stage.
- Sequential File, Aggregator, ODBC, Transformer, Hashed File, Oracle OCI, XML, Folder and FTP plug-in stages were extensively used to develop the server jobs.
- Used the EXPLAIN PLAN statement to determine execution plans in the Oracle database.
- Worked on complex data coming from mainframes (EBCDIC files), with knowledge of Job Control Language (JCL).
- Used COBOL copybooks to import the metadata information from mainframes.
- Designed DataStage jobs using QualityStage stages in 7.5 for data cleansing and data standardization; implemented the Survive and Match stages for data patterns and data definitions.
- Staged the data coming from various environments in the staging area before loading into data marts.
- Involved in writing test plans, test scenarios, test cases and test scripts, and performed unit, integration, system and user acceptance testing.
- Used stage variables for source validations and to capture rejects, and used job parameters for automation of jobs.
- Strong knowledge in creating procedures, functions, sequences and triggers; expertise in PL/SQL and SQL.
- Performed debugging, unit testing and system integration testing of the jobs.
- Wrote UNIX shell scripts according to the business requirements.
- Wrote customized server/parallel routines according to the complexity of the business requirements.
- Designed strategies for archiving legacy data.
- Created shell scripts to perform validations and run jobs on different instances (DEV, TEST and PROD).
- Created and deployed SSIS (SQL Server Integration Services) projects and schemas, and configured the Report Server to generate reports through SSRS (SQL Server 2005).
- Created ad-hoc reports with MS SQL Server Reporting Services for the business users.
- Used SQL Profiler to monitor server performance and to debug T-SQL and slow-running queries.
- Expertise in developing and debugging indexes, stored procedures, functions, triggers and cursors using T-SQL.
- Wrote mapping documents for all the ETL jobs (interfaces, data warehouse and data conversion activities).

Environment: IBM WebSphere DataStage and QualityStage 7.5, Ascential DataStage 7.5/EE (Parallel Extender), SQL Server 2005/2008, Linux, Teradata 12, Oracle 10g, Sybase, PL/SQL, TOAD, UNIX (HP-UX), Cognos 8 BI

Confidential, NJ Jan 2006 - Dec 2006 Jr. DataStage Developer

Merrill Lynch was a global financial services firm providing capital markets services, investment banking and advisory services, wealth management, asset management, insurance, banking and related financial services worldwide.

Responsibilities:

- Worked on the logical and physical design of the data warehouse.
- Identified sources/targets and analyzed source data for dimensional modeling.
- Good knowledge of voluntary insurance plans enabling employers to offer total insurance packages; worked on the design of the Voluntary Disability, Voluntary Dental and Voluntary Life data marts.
- Good knowledge of policy and claims processing; worked on integration of the Health Claims ODS from legacy systems.
- Designed and developed jobs for extracting, transforming, integrating and loading data into the data mart using DataStage Designer; used DataStage Manager for importing metadata from the repository, creating new job categories and creating new data elements.
- Worked with EBCDIC files to extract data in the required format.
- DataStage jobs were scheduled and monitored, the performance of individual stages was analyzed, and multiple instances of a job were run using DataStage Director.
- Used Parallel Extender for splitting the data into subsets, and utilized Lookup, Sort, Merge and other stages to achieve job performance.
- Used the DS Erwin MetaBroker to import Erwin 4.x metadata into the DataStage Repository.
- Developed user-defined routines and transformations to implement complex business logic.
- Extensively used shared containers and job sequencers to make complex jobs simple and to run the jobs in sequence.
- Involved in the preparation of ETL documentation following the business rules, procedures and naming conventions.
- Created reports for various portfolios using the universes as the main data providers.
- Created reports using Business Objects functionalities like queries, slice and dice, drill down, cross tab, master detail, etc.
- As part of report development, created reports using universes as the main data provider and using powerful Business Objects functionalities and formulae.
- Involved in troubleshooting various reporting errors.
- Created Business Objects reports and queries with constant interaction with the end users; trained end users in understanding the reports.
- Functionalities such as slice and dice, drill mode and ranking were used for multidimensional formatting.
- Web Intelligence was used to generate reports on the internet/intranet.
- Exported the reports to the Broadcast Agent and used the Broadcast Agent to schedule, monitor and refresh the reports.
- Developed test plans, test scenarios and test cases for code testing.
- Trained team members; provided 24/7 production support.

Environment: IBM Web Sphere DataStage 7.5, Metastage 7.0, Business Objects 6.5, Oracle 9i, PL/SQL, SQL * Plus, UNIX Shell Scripts, Windows 2000/NT 4.0, ERWIN 4.1.

Confidential June 2004 - Dec 2005 Jr. DataStage Developer

Description: ICICI Prudential Insurance provides a wide range of insurance policies such as life insurance, health insurance, motor vehicle insurance and general insurance. This project was developed to automate insurance policy management using a centralized data warehouse and data mart. The application takes in various related information for each region and generates premiums and the desired data in the form of reports.

Responsibilities:

- Designed and developed mappings between sources and operational staging targets, using star and snowflake schemas.
- Provided data models and data maps (extract, transform and load analysis) of the data marts for systems in the aggregation effort.
- Involved in extracting, cleansing, transforming, integrating and loading data into the data warehouse using DataStage Designer.
- Developed various transformations based on customer last name and zip code for internal business analytical purposes; loaded the warehouse based on customer credit card number with dynamic data re-partitioning.
- Developed user-defined routines and transformations using Universe BASIC.
- Used DataStage Manager for importing metadata from the repository, creating new job categories and creating new data elements.
- Used the DataStage Director and the runtime engine to schedule running the solution, testing and debugging its components, and monitoring the resulting executable versions (on an ad hoc or scheduled basis).
- Developed and maintained programs for scheduling data loading and transformations using DataStage and Oracle 8i.
- Developed shell scripts to automate file manipulation and data loading procedures.
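A file-manipulation script of the kind described might look like the following minimal sketch. The directory names, file extension and date-stamping convention are all assumptions for illustration, not details from this project.

```shell
# Illustrative sketch: after a successful load, move extract files out of the
# landing area into a date-stamped archive so reruns start from a clean slate.
landing=./landing
archive=./archive/$(date +%Y%m%d)
mkdir -p "$landing" "$archive"
touch "$landing/customers.dat" "$landing/orders.dat"   # stand-in sample extracts
for f in "$landing"/*.dat; do
  mv "$f" "$archive/"
done
ls "$archive"
```

In practice the move would be gated on the load job's exit status so failed files stay in the landing area for reprocessing.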

Environment: DataStage 5.2/6.0, Oracle 8i, SQL, TOAD, UNIX, Windows NT 4.0.

======================================================================================

Currently I am an associate with Accenture, with 3.3 years of work experience in data analysis, design, development and implementation of data warehousing applications using IBM DataStage (ETL) in DW/BI technologies. My main area of experience has been project delivery of various sizes. I have worked primarily in the Banking and Manufacturing domains.

Having 3.3 years of technical experience in the design and development of ETL jobs and processes to support the business requirements for an enterprise data warehouse.

Good experience with DataStage Parallel Extender (EE), using various stages within the Designer component.

Exposure to various data warehouse domains, including Banking and Manufacturing.

Good knowledge of data warehousing concepts.

Ability to develop and maintain good, long-lasting client relationships.

Enjoy challenging and thought-provoking work; have a strong desire to learn and progress (motivated enough to self-learn) and the ability to pick up new technology independently.

Technology

Application/Functional areas/Packages worked on: Banking, Manufacturing

Types of projects worked on: Development, Enhancement

Environments worked: Data Warehousing (IBM DataStage), Oracle 9i

Languages/Tools worked on: DataStage Designer (Parallel jobs), SQL

Operating System: Windows 2003 Server/XP/UNIX

RDBMS: Oracle 9i

Career Profile

Dates | Organization | Role

Jun 2010 to date | Accenture, Bangalore | ETL Developer

Jan 2009 to May 2010 | Patni Computers, Bangalore | ETL Developer

Qualifications

Degree and Dates

M.C.A. from St. Martin's Engineering College, Hyderabad, affiliated to JNTU (2009)

Work Experience


Project #2

Client : MASHREQ BANK

Domain : BANKING

ROLE : ETL DEVELOPER

LOCATION : BANGALORE

Jun 2010 - Till Date

Description:

Mashreq Bank is the leading private bank in the United Arab Emirates (UAE) with a growing retail presence in the region including Egypt, Qatar and Bahrain. It has provided banking and financial services to millions of customers and businesses since 1967.

As per current practice, each line of business manages data and risk measurements itself; as a result, it is difficult to standardize the methodology across the different lines of business. Business unit risk managers currently present analysis on an as-requested basis, so it is difficult to standardize and audit the methodology used, and impossible to do on-demand analysis.

Responsibilities:

- Understanding the business functionality and analysis of business requirements.
- Designed and developed ETL jobs using various active and passive stages in parallel jobs using Designer.
- Extracted data from sources like Oracle and flat files, transformed it using business logic and loaded the data into the business tables.
- Extensively used processing stages like Sort, Aggregator and Transformer in developing jobs.
- Performed unit testing on the jobs developed.
- Prepared the Test Case Document and captured the test results.
- Monitored the DataStage jobs using crontab.
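Crontab-based monitoring as mentioned above boils down to a scheduled entry like the one sketched below. The script path, log location and 15-minute schedule are assumptions for illustration only.

```shell
# Illustrative crontab entry for periodic DataStage job monitoring; the file
# is written out so it can be installed with `crontab ds_monitor.cron`.
cat > ds_monitor.cron <<'EOF'
*/15 * * * * /home/dsadm/scripts/check_ds_jobs.sh >> /home/dsadm/logs/ds_monitor.log 2>&1
EOF
```

The referenced check script would typically query `dsjob` for job status and alert on failures; appending both stdout and stderr to one log keeps cron's output traceable.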

Environment: IBM-DataStage-8.1(Parallel Jobs), Oracle 9i, Windows XP/UNIX.

Project #1

Client : FORD MOTORS

Domain : MANUFACTURING

ROLE : ETL DEVELOPER

LOCATION : BANGALORE

Jan 2009 - May 2010

Description:

Page 20: Data Stage Cv

QIS2 is a global quality application that is used to perform analysis of diagnostic data from Powertrain and Vehicle Control Modules to reduce the detection-to-correction time for Ford Motors. FM would like to further enhance QIS2 to report on propulsion battery measurement data.

The purpose of this project is to create additional reports to analyze the battery measurement data captured in the propulsion battery assembly plants. This will allow FM Battery and Volt Engineers to identify issues using measurement data including volt and temperature readings.

Responsibilities:

- Understanding the business functionality and analysis of business requirements.
- Designed and developed ETL jobs using various active and passive stages in parallel jobs using Designer.
- Extracted data from sources like Oracle and flat files, transformed it using business logic and loaded the data into the data warehouse.
- Extensively used processing stages like Sort, Aggregator, Transformer and Join in developing jobs.
- Performed unit testing on the jobs developed.
- Prepared the Test Case Document and captured the test results.
- Monitored the DataStage jobs using the Autosys scheduler.

Environment: IBM-DataStage-7.5.x2 (Parallel Jobs), Oracle 9i, Windows XP/2003.

======================================================================================

Datastage Consultant

EXPERIENCE SUMMARY:
- 9+ years of overall IT experience in data warehousing application design, development, testing and project management, with 7+ years of experience in ETL development and design using IBM WebSphere DataStage 8.1 Enterprise and earlier Ascential DataStage versions, and 2+ years in Oracle ETL development.
- Expertise in Ascential DataStage 7.5.2 Server Edition as well.
- Expertise in DataStage issue resolution, debugging and application performance tuning.
- Expertise in performance tuning of DataStage jobs.
- Expertise in extracting data from various versions of Oracle and Teradata databases using DataStage.
- Expertise in Oracle SQL and PL/SQL coding.
- Proficient in UNIX Korn shell scripting.
- Familiar with Perl scripts.
- Familiar with BASIC batch job development in DataStage.
- Familiar with the Pentaho ETL tool (haven't implemented any projects).
- Familiar with BTEQ, FastLoad and FastExport concepts in Teradata.
- Expertise in performance tuning by using EXPLAIN PLAN, creating appropriate indexes, optimizing queries, and utilizing tablespaces and partitioning schemes in Oracle.
- Experience in creating project design documents: High Level Design Document, Software Requirement Specification, Mid Level Platform Application Design Document (MLPD), Detailed Platform Application Design Document (DPAD).
- Expertise in developing and maintaining the overall test methodology and strategy, documenting test plans and test cases, and editing and executing test cases and test scripts.
- Resolved data issues, completed unit testing and completed system documentation for ETL processes.
- Involved in development of test cases and executing the unit/UAT test cases.
- Involved extensively in unit testing, system testing and regression testing.
- Experience with defect tracking tools like HP Quality Center.
- Involved in daily interactions with the business to understand their requirements, acting in a business liaison role.
- Set up defect calls for tracking and closing the defects.
- Experience in producing project estimations, timeline scheduling, resource forecasting, communicating key milestones to IT management and clients, task scheduling and status tracking.
- Excellent communication skills, problem-solving skills, leadership qualities and an attitude to learn new cutting-edge technologies.
- Experience in leading and mentoring team members on both functional and technical aspects.
- Flexible, enthusiastic and project-oriented team player with solid communication and leadership skills to develop creative solutions for challenging client needs.
- Able to work independently and collaborate proactively and cross-functionally within a team.

Technical Skills:

Operating System: Windows (all versions), IBM AIX UNIX
Programming Languages: SQL, PL/SQL, UNIX Shell Scripting
Databases: Oracle 9i/10g/11g, Teradata 13.10
ETL Tools: DataStage 8.1/7.5.2, Pentaho 4.4.0
Tools: PL/SQL Developer, Toad
Testing Tools: HP Quality Centre 11.0

Education:
Degree: Bachelor of Electronics and Communication Engineering, 2004
Period: 2000 - 2004
University: Bharathidasan University, India

PROFESSIONAL EXPERIENCE:
Client: FedEx Office, Dallas, Texas July '12 to Present
Role: Datastage Consultant.
Technology & Tools Used: IBM WebSphere DataStage 8.1 Designer, Director, Administrator, Ascential DataStage 7.5.2 Server, Oracle 9i/10g/11g, Teradata 13.10, Solaris UNIX, TOAD, PL/SQL Developer

Responsibilities:
- Design and development of ETL jobs using DataStage 8.1 Enterprise Edition as well as Server Edition.
- Prepared Software Requirement Specification (SRS), code review, Test Case Specification and deployment plan documents.
- Completed the full project design for the FedEx Tax Services system, which included preparing the Software Requirement Specification document, interacting with source teams and preparing mapping documents, DataStage coding, unit testing, support during system testing, preparing the deployment plan, supporting the Business Objects reporting team during their testing, and supporting the business during their UAT.
- Completed development enhancements for work requests received for the FedEx ECOM, Mobile Print, Online Signs and Graphics, and Packaging and Shipping projects, which involved extracting data using DataStage from the Oracle database and the Teradata enterprise data warehouse, transforming it, and loading it into an Oracle database used by the Business Objects team for their reporting. The work on these projects included preparing the SRS, identifying source systems and preparing mapping documents, documenting and presenting reject-handling techniques to business, and preparing the code review document and deployment plan.
- Created a reject-handling shared container for the FedEx ECOM project, which was used for reporting issues in various source system files received from the upstream systems.
- Created common jobs in DataStage Server Edition using CRC for Change Data Capture, producing insert, update (load-ready) and delete files for loading into the target tables.
- Created SFTP UNIX script DataStage jobs to support the HR source system, which was upgraded to PeopleSoft 9.1.
- Produced impact analysis documents, followed by SRS, code review, DataStage coding changes and other project documents, to support decommissioning the Central Repository (CR) DB and replacing it with the Enterprise Database (EDB).
- Supported business enhancements for Business Objects requirements which required code changes in DataStage, for work requests like Coupon Code Applied, Customer Base Foundation (CBF), etc.
- Completed BASIC code development and job changes to implement batch jobs for the Order Analysis project and an ETL Modernization decommissioning work request.
- Created documentation for a large ETL application which previously had no documentation at all, faced various issues in production and kept receiving change requests; the documentation has been widely appreciated for its usefulness, and the precise information helped both the business and other downstream systems.
- Other responsibilities include fixing production defects and doing system testing and system integration testing for DataStage coding done by other developers.
- Provided various inputs for creating the data warehouse standards document.
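The CRC-style change-data-capture jobs described above can be approximated in plain shell for illustration. This is a sketch under stated assumptions: the file names and pipe-delimited "key|attributes" layout are invented, and a whole-record comparison stands in for the per-row CRC check the real jobs would use.

```shell
# Illustrative change-capture sketch: split a current extract into insert,
# update and delete sets against the previous snapshot.
cat > prev.txt <<'EOF'
1|alice
2|bob
3|carol
EOF
cat > curr.txt <<'EOF'
1|alice
2|bobby
4|dave
EOF
sort prev.txt > prev.s
sort curr.txt > curr.s
awk -F'|' '{print $1}' prev.s | sort -u > prev.keys
awk -F'|' '{print $1}' curr.s | sort -u > curr.keys
# Rows appearing only in the current extract: brand-new keys or changed records.
comm -13 prev.s curr.s > new_or_changed.txt
# Key seen before -> update file; never seen -> insert file.
awk -F'|' 'NR==FNR {seen[$1]; next}
           ($1 in seen) {print > "update.txt"; next}
           {print > "insert.txt"}' prev.keys new_or_changed.txt
# Keys that disappeared entirely -> delete set.
comm -23 prev.keys curr.keys > delete.keys
```

In the actual jobs a CRC32 of the non-key columns would replace the whole-record comparison, so only a checksum needs to be kept per key between runs.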

Client: Lloyds Banking Group (LBG), Manchester, United Kingdom, April '07 to May '12
Role: DataStage Designer / Lead Developer

Technology & Tools Used: IBM WebSphere DataStage 8.1 (Designer, Director, Administrator), Oracle 10g, Teradata 13.10, IBM AIX UNIX 5.1, TOAD, PL/SQL Developer, Connect:Direct file transfer tool, IBM Mainframe

Responsibilities:
- As DataStage Designer and Lead Developer at Lloyds, worked predominantly on the DataStage modules of projects for Financial Data Processing, Business Performance Management, Sales and Marketing Analysis and Reporting, and e-Statements, all making extensive use of DataStage 8.1.
- Participated in all phases of the project cycle, including requirement analysis, client interaction, design, coding, testing, production support and documentation.
- Gathered business requirements and provided development and testing estimates, timelines and resource forecasts.
- Prepared Mid Level Platform Application Design (MLPD) and Detailed Platform Application Design (DPAD) documents from the Business Requirement Specification, walked the document review panel and clients through them, and obtained sign-off from the CIO and other subject matter experts in the client organization before proceeding with the build.
- Helped prepare the coding standards documentation for DataStage development.
- Designed complex parallel DataStage jobs and sequencers.
- Worked with SAP Ledger downstream systems, sending SAP reference files and loading processed data into SAP tables.
- Extensively used job stages such as Dataset, Sequential File, Lookup, Aggregator, Join, Transformer, Sort, Funnel, Remove Duplicates, Merge, Filter, SCD, Change Capture, Copy, External Filter, External Source, Pivot and Complex Flat File.
- Used the DataStage XML Input stage to process the bank's hierarchical files.
- Coordinated with the UNIX admin and service delivery teams to procure the UNIX environment and create the DataStage UNIX file systems and directories.
- Extracted data from sources including flat files and Oracle and Teradata databases.
- Familiar with BTEQ, FastLoad and FastExport in Teradata.
- Developed key data validation and splitting modules in DataStage for the Financial Data Processing project, covering source files from various source systems.
- Used multiple-instance job invocation to run jobs in parallel, reducing run time and expediting the processes.
- Used DataStage Designer to develop jobs for extracting, cleansing, transforming, integrating and loading data into the data warehouse database.
- Used slowly changing dimensions to track changes to dimensions over time.
- Used DataStage Administrator to define environment variables and project-level settings.
- Used job sequence stages such as User Variable Activity, Notification Activity, Routine Activity and Terminator Activity.
- Developed error logging shared containers, used extensively across DataStage jobs to gather error logs in a business-defined format, which helped the business analyze erroneous files and fix them quickly.
- Validated DataStage jobs and sequencers against the pre-defined ETL design standards.
- Developed Connect:Direct jobs for transferring files to downstream systems and pulling files from source systems.
- Fixed DataStage job bugs and supported the testing team during the various stages of testing.
- Tuned source-extract SQL scripts used in ETL jobs to meet the business SLAs.
- Analyzed requirements to identify the tables to be populated in the staging database.
- Prepared DDLs for the staging/work tables and coordinated with the DBA to create the development environment.
- Developed ksh shell scripts for automating DataStage jobs and housekeeping activities.
- Created shell scripts to automatically notify business exceptions and rejects during load and file processing.
- Created UNIX shell scripts to read parameter files and pass the values to jobs at runtime.
- Prepared and reviewed functional test plans and master test plans for the various testing stages.
- Prepared test scenarios and validated data against the business rules.
- Prepared and executed system testing and system integration test cases.
- Supported User Acceptance Testing by running jobs and providing fixes.
- Coordinated offshore development, testing and implementation using implementation plans.
- Performed peer reviews, planned and estimated project requirements, and reported status to business managers.
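The parameter-file wrapper scripts described above can be sketched as follows. This is a hedged illustration: the NAME=VALUE parameter-file layout and the script names are assumptions. `dsjob -run -param name=value project job` is the standard DataStage CLI form, but the command is only echoed here so the sketch runs without a DataStage engine.

```shell
#!/bin/sh
# run_job.sh -- sketch of a wrapper that turns a parameter file into dsjob arguments
# (hypothetical layout: one NAME=VALUE pair per line, '#' starts a comment line).

build_params() {                  # build_params PARAM_FILE -> "-param N=V -param N=V ..."
  args=""
  while IFS= read -r line; do
    case $line in
      '#'*|'') continue ;;        # skip comments and blank lines
    esac
    args="$args -param $line"
  done < "$1"
  printf '%s' "${args# }"
}

run_job() {                       # run_job PROJECT JOB PARAM_FILE
  # echoed rather than executed in this sketch; a real wrapper would run dsjob
  # and check its exit status before continuing the batch
  echo "dsjob -run $(build_params "$3") -jobstatus $1 $2"
}
```

Keeping runtime values in a parameter file lets the same compiled job be promoted across environments with only the file changing.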

Client: Halifax Bank of Scotland (HBoS), India, August '06 to February '07
Role: DataStage Developer/Designer
Technology & Tools Used: IBM WebSphere DataStage 8.1 (Designer, Director, Administrator), DataStage 7.5.2, Oracle 10g, Teradata V2R6, IBM AIX UNIX 5.1, TOAD, PL/SQL Developer

Responsibilities:
- Gathered business requirements and provided input to the MLPD and DPAD documents.
- Involved in all phases of the project cycle, including requirement analysis, client interaction, design, coding, testing, support and documentation.
- Prepared ETL documentation for the developed processes.
- Investigated possible terrorist financing and money laundering transactions in the system for tickets raised by the business.
- Handled production support tickets.
- Automated SQL queries using UNIX scripts for loading large volumes of data during data migration activities in the AML project.
- Tuned Oracle scripts and other ETL processes used in the project.
- Performed peer reviews, planned and estimated project requirements, and reported status to business managers.
- Created documentation for developed jobs where suitable documents were not available.
- Provided production support and bug fixing during the various stages of testing.
- Designed and developed DataStage parallel jobs, sequencers and shared containers.
- Performance-tuned DataStage jobs and Oracle SQL queries.
- Extensively used job stages such as Dataset, Sequential File, Lookup, Aggregator, Join, Transformer, Sort, Funnel, Remove Duplicates, Merge, Filter, SCD, Change Capture, Copy, External Filter, External Source and Pivot.
- Used multiple-instance job invocation to run jobs in parallel and speed up processing.
- Wrote Oracle PL/SQL stored procedures and SQL scripts.
- Developed ksh shell scripts for automating DataStage jobs and housekeeping activities.
- Created shell scripts to automatically notify business exceptions and rejects during load and file processing.
- Created UNIX shell scripts to read parameter files and pass the values to jobs at runtime.
- Prepared unit test cases and validated data against the business rules.
- Supported the testing team during system testing, system integration testing and User Acceptance Testing.
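The exception and reject notification scripts mentioned above can be sketched like this. The `*.rej` file layout, the directory and the mail address are hypothetical, and the mail step is shown only as a comment so the sketch is runnable anywhere.

```shell
#!/bin/sh
# notify_rejects.sh -- sketch of a reject-notification helper (hypothetical layout:
# one rejected record per line in *.rej files under a load directory).

reject_summary() {                # reject_summary DIR -> "file:count" per non-empty reject file
  for f in "$1"/*.rej; do
    [ -f "$f" ] || continue       # no reject files at all: emit nothing
    n=$(wc -l < "$f" | tr -d ' ')
    [ "$n" -gt 0 ] && printf '%s:%s\n' "$(basename "$f")" "$n"
  done
  return 0
}

# In production the summary would be mailed to the support group, e.g.:
#   reject_summary /data/loads/rejects | mailx -s "Load rejects" etl-support@example.com
```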

Client: AstraZeneca, India, March '04 to August '06
Role: ETL Oracle Developer
Technology & Tools Used: DataStage 7.5.2, Oracle 10g, Teradata, IBM AIX UNIX 5.1, TOAD, PL/SQL Developer

Responsibilities:
- Understood and analyzed business requirements.
- Developed DataStage jobs from the MLPD, DPAD and Business Requirement Specification documents.
- Extensively used job stages such as Dataset, Sequential File, Lookup, Aggregator, Join, Transformer, Sort, Funnel, Remove Duplicates, Merge, Filter, Change Capture, Copy, External Filter, External Source and Pivot.
- Developed Oracle SQL and PL/SQL queries and DDLs for database creation.
- Performance-tuned SQL queries.
- Developed UNIX scripts for automating DataStage jobs.
- Extensively used DataStage Designer and Director.
- Wrote and executed unit test cases.
- Supported the QA team during system testing and system integration testing.
- Coordinated with UNIX admins and DBAs for DataStage UNIX environment and database creation.
- Engaged in production support tasks.
- Documented changes for future reference.
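A unit-test comparison helper of the kind used for the test cases above might look like the following sketch. The file names and the sorted-compare convention (row order is not significant in the extracts) are assumptions.

```shell
#!/bin/sh
# compare_output.sh -- sketch of a unit-test helper: compare a job's output file
# against an expected baseline, ignoring row order by sorting both sides first.

assert_matches() {                # assert_matches ACTUAL EXPECTED -> PASS/FAIL line
  a=$(sort "$1")
  b=$(sort "$2")
  if [ "$a" = "$b" ]; then
    echo "PASS $(basename "$1")"
  else
    echo "FAIL $(basename "$1")"
  fi
}
```

A wrapper would run this per output file and count FAIL lines to give the unit test an overall verdict.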