ETL
Siva

Professional Summary

  • Experienced IT professional with around 10 years of experience, including 7 years of extensive ETL experience using Informatica PowerCenter 8.x/7.x/6.x/5.x and ETL testing. Strong working experience as an Oracle PL/SQL developer.
  • Expertise in the Analysis, Design and Development of Software Applications and providing Data Integration/Warehousing solutions.
  • Six (6) years of Data warehousing experience using Informatica PowerCenter 9.1/8.5/8.1/7.1/6.2.2
  • Strong data analysis and data profiling background using MS Access and Hyperion Brio; completed Informatica professional training on IDE and IDQ.
  • Extensive knowledge in architecture design of Extract, Transform, Load environment using Informatica Power Center.
  • Developed Slowly Changing Dimension Mappings of type I, III and type II (Flag, Version Number, and Effective Date Range).
  • Strong understanding of the principles of DW using Fact Tables, Dimension Tables and Star Schema modeling with sound knowledge of database architecture for OLTP and OLAP applications.
  • Knowledge of late-arriving facts/dimensions and hierarchies.
  • Worked on creating surrogate keys in SCD mappings.
  • Used the Informatica PowerExchange Navigator (PowerConnect for Mainframe) to move mainframe files from the source to the staging area.
  • Hands-on experience with scheduling tools like Control-M and crontab.
  • Technical expertise in the Informatica ETL/data integration suite (Repository Manager, Designer, Workflow Manager, Workflow Monitor, and Server Manager).
  • Configured and implemented data integration standards and procedures in Informatica.
  • Proficient in the Integration of various data sources with multiple relational databases like Oracle, SQL Server, DB2, XML and Flat Files into the staging area, ODS, Data Warehouse and Data Mart.
  • Strong knowledge of OBIEE as a Business Intelligence tool.
  • Created/designed reusable templates for deliverables in low-level design phases.
  • Understanding of the Vertica architecture (column orientation, compression and encoding, clustering, and continuous load).
  • Knowledge on writing queries on HP Vertica Database.
  • Expertise in developing Mappings and Mapplets, Sessions, Workflows, Worklets and Tasks using Informatica Designer, Workflow Manager and Batch processes using PMCMD command.
  • Experience in designing/developing complex mappings using transformations like Connected/ Unconnected Lookups, Router, Filter, Expression, Aggregator, Normalizer, Joiner and Update Strategy.
  • Running Sessions and workflows on Grid, Load Balancing, Dynamic Partition with Enterprise Grid Option.
  • Worked on very large databases such as Teradata and Netezza, using loaders and utilities like TPT.
  • Experience in Performance tuning and Optimization of Cache with Informatica and OBIEE.
  • Worked with different data sources like Flat Files, Databases.
  • Knowledge of accessing big data sources and targets.
  • Knowledge of pushdown processing to Hadoop using Informatica Developer.
  • Knowledge of Hive, the Hive metastore (local and remote), and HDFS.
  • Expertise in unit testing of Informatica mappings.
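The SCD Type II pattern summarized above (surrogate keys with flag, version number, or effective date range) can be sketched roughly as follows. This is a minimal illustration only: the table, column, and sequence names are hypothetical, and the SQL is written to a file rather than run against a database.

```shell
#!/bin/sh
# Hypothetical sketch of an SCD Type 2 (effective-date-range) pattern;
# dim_customer, its columns, and dim_customer_seq are illustrative names.
SQL_FILE=scd2_customer.sql

cat > "$SQL_FILE" <<'EOF'
-- Expire the current row when a tracked attribute changes
UPDATE dim_customer d
   SET d.eff_end_date = SYSDATE, d.current_flag = 'N'
 WHERE d.customer_nk = :customer_nk
   AND d.current_flag = 'Y'
   AND d.customer_name <> :customer_name;

-- Insert the new version with a fresh surrogate key
INSERT INTO dim_customer
       (customer_sk, customer_nk, customer_name,
        eff_start_date, eff_end_date, current_flag)
VALUES (dim_customer_seq.NEXTVAL, :customer_nk, :customer_name,
        SYSDATE, NULL, 'Y');
EOF

echo "Generated $SQL_FILE"
```

In PowerCenter the same logic is typically expressed with a Lookup on the dimension, an Expression comparing tracked columns, and an Update Strategy routing expired and new rows.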

Certification Summary

  • ISTQB Certified Professional in Testing.
  • IBM Certified Professional, Cognos Report Developer.

Technical Skills

Operating Systems
Windows 98/2k/2003server/XP/NT4.0, UNIX.
ETL Tools
Informatica PowerCenter 9.x/8.x/7.x/6.x (professional online training in Informatica Data Explorer IDE 8.6.0 and Data Quality IDQ 8.6), Informatica Cloud
BI Tools
Cognos 8.4/8.3, ReportNet, Impromptu, PowerPlay Transformer, MicroStrategy 8.x/9.x, OBIEE 11.7
Data Modeling Tool
Erwin 3.5.2/4.0/4.2.
Databases
Oracle 7.3/8i/9i/10g, DB2, SQL Server 2000/2005/2008/2012, Teradata V13.
Languages
C, SQL and PL/SQL, Shell Script.
Tools
TOAD 7.6/8.0, SQL*Plus, PuTTY.
Testing
Manual Testing, Automated Testing, Mercury QC

Education

  • Bachelor's in Electronics
  • Master of Computer Applications (MCA)

Professional Experience

Providence.Org. Portland/Beaverton, OR
Duration
May-15 to Present
Role
ETL/BI Consultant
Responsibilities
A not-for-profit organization that includes hospitals, health plans, clinics, physicians, long-term care facilities, low-income housing, assisted living, home care, etc.

The main objective of this project is to send membership eligibility data, claims (Medical & Rx) data, and fixed cost (premium & capitation) data from the Facets ODS system to the vendor for reporting purposes.

  • Designed and implemented appropriate ETL mappings to extract and transform data from various sources to meet requirements.
  • Created and executed workflows using Workflow Manager.
  • Understanding existing business model and customer requirements on Claims & Membership data.
  • Involved in the development and troubleshooting of Informatica mappings.
  • Understood the claims process and Facets.
  • Involved in Design & creating mappings for Incremental Loading process.
  • Error Handling in Informatica mappings.
  • Analyzed and reconciled claims data numbers with finance.
  • Created and monitored the sessions and workflows.
  • Worked with Informatica Debugger to debug the mappings in Designer.
  • Developed Informatica mappings and tuned them for better performance.
  • Delivered data enhancements successfully to the QA environment.
  • Worked on Informatica tools like Source Analyzer, Warehouse Designer, Mapping Designer, Transformations, Repository Manager and Workflow Manager.
  • Extensively used the transformations like Expression, Aggregator, Router, Sequence generator, Lookup transformations, Update strategy and stored procedure transformation to create mappings.
Environment
Informatica Power Center 9.1.0, SQL Developer 2008, PL/SQL, SQL
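The incremental-loading process mentioned in the responsibilities above typically drives the mapping's source filter from a persisted watermark. A minimal sketch, assuming a PowerCenter parameter file and a hypothetical workflow name (the folder, workflow, and parameter names are illustrative only):

```shell
#!/bin/sh
# Illustrative incremental-load control: persist a last-extract timestamp
# and write it into a PowerCenter parameter file. All names are assumed.
PARAM_FILE=wf_claims_incr.par
LAST_RUN_FILE=last_run.txt

# Seed the watermark on the very first run
[ -f "$LAST_RUN_FILE" ] || echo "1900-01-01 00:00:00" > "$LAST_RUN_FILE"
LAST_RUN=$(cat "$LAST_RUN_FILE")

# The mapping's Source Qualifier would filter on UPDATE_TS > $$LAST_EXTRACT_TS
cat > "$PARAM_FILE" <<EOF
[FolderName.WF:wf_claims_incr]
\$\$LAST_EXTRACT_TS=$LAST_RUN
EOF

# After a successful run, advance the watermark to the session start time
date '+%Y-%m-%d %H:%M:%S' > "$LAST_RUN_FILE"
echo "Wrote $PARAM_FILE"
```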
Nike Inc., Hillsboro, OR
Duration
May-14 to Apr-15
Role
Sr. ETL Lead Developer
Responsibilities
Nike, Inc. is an American multinational corporation engaged in the design, development, manufacturing, and worldwide marketing and selling of footwear, apparel, equipment, accessories, and services. Nike recently acquired Converse. The main objective of this project is to load footwear and apparel data from the FlexPLM system to Converse SAP, which is identical to Nike's SAP system.

  • Used Informatica PowerCenter Designer to extract and transform data from various source systems, incorporating business rules for different applications with the objects and functions the tool supports.
  • Worked on Informatica tools like Source Analyzer, Warehouse Designer, Mapping Designer, Transformations, Repository Manager and Workflow Manager.
  • Extensively used transformations like Expression, Aggregator, Router, Sequence Generator, Lookup, and SAP/ALE IDoc Prepare to create mappings and load data directly into SAP IDocs.
  • Worked on importing SAP IDoc definitions into Informatica as targets.
  • Worked with SAP IDocs, using SAP Logon to verify data in the IDocs.
  • Connected to SAP NetWeaver using PowerExchange for SAP NetWeaver and imported SAP BW table definitions into Informatica.
  • Experienced in working with Big Data and the Hadoop Distributed File System (HDFS).
  • Strong knowledge of Hadoop, Hive, and Hive's analytical functions.
  • Captured data from existing databases that provide SQL interfaces using Sqoop.
  • Efficient in building Hive and MapReduce scripts.
  • Implemented Proof of concepts on Hadoop stack and different big data analytic tools, migration from different databases (i.e. Teradata, Oracle, MySQL) to Hadoop.
  • Loaded the dataset into Hive for ETL (Extract, Transform and Load) operations.
  • Pushed down processing to Hive/Hadoop using Informatica Developer.
  • Excellent problem-solving, analytical, communication, and interpersonal skills.
  • Implemented error handling in Informatica mappings.
  • Worked on Teradata BTEQ & Merge scripts.
  • Worked with Informatica Debugger to debug the mappings in Designer.
  • Developed Informatica mappings and tuned them for better performance.
  • Delivered data enhancements successfully to the QA environment.
  • Co-ordinate with the QA team in various testing phases by resolving the defects and ensuring smooth execution of the test plans.
  • Creating the deployment documents and migrating the code to the production environment.
  • Investigating and fixing bugs that occurred in the production environment and providing on-call support.
Environment
Informatica Power Center 9.6.0, Informatica Power Center Big Data Edition, PL/SQL, SQL, Oracle 11g, Teradata 13, Linux Shell Scripting, Microsoft Sql Server, SAP Logon, Hive 0.13, Apache Hadoop 2.5.2, Sqoop 1.99.4. SAP Netweaver
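The Sqoop-based capture described above (pulling from SQL sources into HDFS) usually amounts to a single import command per table. A hedged sketch; the JDBC connection string, credentials path, table, and target directory below are placeholders, and the command is printed rather than executed:

```shell
#!/bin/sh
# Illustrative Sqoop import from an Oracle source into HDFS.
# Host, service name, user, table, and paths are all assumed values.
SQOOP_CMD="sqoop import \
  --connect jdbc:oracle:thin:@//dbhost:1521/ORCL \
  --username etl_user --password-file /user/etl/.pw \
  --table STYLE_COLOR \
  --target-dir /data/raw/style_color \
  --num-mappers 4"

# Print the command for review instead of running it
echo "$SQOOP_CMD"
```

The imported files can then be exposed to Hive via an external table over the target directory.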
Stanford University, Palo Alto, CA
Duration
Oct-13 to May-14
Role
Sr. ETL Developer
Responsibilities
Stanford University is one of the world's leading teaching and research universities. The main goal of the Spend Analytics project is to report the yearly expense amounts spent by the University across different department categories (PO, PCARD/TCARD, IOU/IEXPENSE).

  • Used Informatica PowerCenter Designer to extract and transform data from various source systems, incorporating business rules for different applications with the objects and functions the tool supports.
  • Worked on Informatica tools like Source Analyzer, Warehouse Designer, Mapping Designer, Transformations, Repository Manager and Workflow Manager.
  • Extensively used transformations like Expression, Aggregator, Router, Sequence Generator, Lookup, Update Strategy, and Stored Procedure to create mappings.
  • Worked closely with business users to understand the requirements and translate them into technical capabilities.
  • Worked with business analysts to identify the appropriate data elements for required capabilities.
  • Coordinating with offshore team and providing the inputs.
  • Designed and implemented appropriate ETL mappings to extract and transform data from various sources to meet requirements.
  • Participate in Design Reviews of Data model and Informatica mapping design
  • Extracted data from Oracle Financials and loaded it into the ODS using Oracle packages.
  • Created and executed workflows using Workflow Manager.
  • Understanding existing business model and customer requirements
  • Extensively worked on SCD Type I and SCD Type II Mappings.
  • Involved in the development and troubleshooting of Informatica mappings.
  • Error Handling in Informatica mappings.
  • Experience in basic Report buildings in OBIEE.
  • Experience in ETL data loading to OLAP (OBIEE) Database.
  • Participated in testing Data matching between OF and OBIEE Databases and Dashboards.
  • Unit tested the reports, debugging and fixing errors to ensure correct reports.
  • Providing support and fixing issues related to Data on Reports, Dashboards etc.
  • Worked with Informatica Debugger to debug the mappings in Designer.
  • Developed Informatica mappings and tuned them for better performance.
  • Delivered data enhancements successfully to the QA environment.
Environment
Informatica Power Center 9.0.1, SQL Developer, PL/SQL, SQL, Oracle 10g, Linux Shell Scripting, OBIEE 11.7.
Responsys, San Bruno CA
Duration
June-13 to Oct-13
Role
Sr. ETL Engineer
Responsibilities
Responsys helps the best brands in the world effectively execute marketing campaigns across all key digital channels like email, mobile, social, display and web. The Main objective of this project is to load mobile data into staging, Reporting and Data Warehouse Tables for analysis and reporting purpose.

  • Used Informatica PowerCenter Designer to extract and transform data from various source systems, incorporating business rules for different applications with the objects and functions the tool supports.
  • Worked on Informatica tools like Source Analyzer, Warehouse Designer, Mapping Designer, Transformations, Repository Manager and Workflow Manager.
  • Extensively used transformations like Expression, Aggregator, Router, Sequence Generator, Lookup, Update Strategy, and Stored Procedure to create mappings.
  • Designed and implemented appropriate ETL mappings to extract and transform data from various sources to meet requirements.
  • FTPed source files from a third-party vendor and loaded data into Oracle tables through Informatica.
  • Created and executed workflows using Workflow Manager.
  • Understanding existing business model and customer requirements.
  • Worked on UNIX shell scripts to FTP source files from the third-party vendor.
  • Encrypted and decrypted files with GPG keys using UNIX shell scripts.
  • Created dynamic parameter mappings.
  • Involved in the development and troubleshooting of Informatica mappings.
  • Error Handling in Informatica mappings.
  • Created and monitored the sessions and workflows.
  • Worked with Informatica Debugger to debug the mappings in Designer.
  • Developed Informatica mappings and tuned them for better performance.
  • Delivered data enhancements successfully to the QA environment.
Environment
Informatica Power Center 9.1.2, SQL Developer, Flat files, PL/SQL, SQL, Oracle 10g, Linux Shell Scripting.
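The GPG decryption step described above can be sketched as a small shell loop over incoming vendor files. The file names and inbox path are illustrative, and the `gpg` commands are printed rather than executed, since real keys and files are assumed to exist only in the target environment:

```shell
#!/bin/sh
# Illustrative decryption pass over vendor feed files; names are assumed.
INBOX=/data/inbox

for f in vendor_feed_20130701.csv.gpg vendor_feed_20130702.csv.gpg; do
  # Strip the .gpg suffix to get the plaintext output name
  out="${f%.gpg}"
  # Print the command a real wrapper would run
  echo "gpg --batch --yes --output $INBOX/$out --decrypt $INBOX/$f"
done
```

The decrypted files would then be picked up by the Informatica sessions as flat-file sources.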
Kohl’s Inc., Milpitas CA
Duration
Oct-12 to June-13
Role
Sr. ETL Lead Designer/Developer
Responsibilities
The main objective of this project is to migrate data from the Blue Martini (legacy) database to the ATG database using Informatica, including orders data, profile data, sales pricing, promotions, and credit card information.

  • Worked closely with business users to understand the requirements and convert them into project-level technical capabilities.
  • Worked on Informatica tools like Source Analyzer, Warehouse Designer, Mapping Designer, Transformations, Repository Manager and Workflow Manager.
  • Designed ETL jobs using Informatica.
  • Designed and developed complex mappings for varied transformation logic using Expression, Filter, Aggregator, Router, Joiner, Update Strategy, and Unconnected and Connected Lookups.
  • Used Informatica Debugger to troubleshoot logical errors and runtime errors.
  • Designed and implemented appropriate ETL mappings to extract and transform data from various sources to meet requirements.
  • Created and executed workflows using Workflow Manager.
  • Understanding existing business model and customer requirements.
  • Involved in the development and troubleshooting of Informatica mappings.
  • Involved in Design & creating mappings for Incremental Loading process.
  • Error Handling in Informatica mappings.
  • Involved in writing batch scripts to run jobs using PMCMD.
  • Created and monitored the sessions and workflows.
  • Worked with Informatica Debugger to debug the mappings in Designer.
  • Developed Informatica mappings and tuned them for better performance.
  • Delivered data enhancements successfully to the QA environment.
Environment
Informatica Power Center 9.1.2, DB2, SQL Developer, Flat files, PL/SQL, SQL, Oracle 10g, Erwin 4.2, Microsoft Visio, ESP Scheduler.
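The batch scripts mentioned above that run jobs via PMCMD generally follow one pattern: start a workflow and wait for completion. A minimal sketch; the domain, integration service, folder, and workflow names are placeholders, and the command is printed rather than executed:

```shell
#!/bin/sh
# Hypothetical pmcmd wrapper; all Informatica object names are assumed.
DOMAIN=Domain_Dev
SERVICE=IS_Dev
FOLDER=MIGRATION
WORKFLOW=wf_orders_migration

# With -wait, pmcmd startworkflow blocks until the workflow finishes
# and returns a nonzero exit code on failure.
CMD="pmcmd startworkflow -sv $SERVICE -d $DOMAIN -f $FOLDER -wait $WORKFLOW"
echo "$CMD"
```

A real wrapper would also pass credentials (e.g. via environment variables) and check the exit code to drive scheduler alerts.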
COLUMBIA SPORTSWEAR, PORTLAND OR
Duration
Sep-11 to Sep-12
Role
Sr. ETL Designer/Developer
Responsibilities
The main objective of this project is to build the technology foundation to support Columbia's information needs: seed the new EDW with detail-level core corporate data, as prioritized, to support future business analytics, and build a separate data store for each region.

  • Used Informatica PowerCenter Designer to extract and transform data from various source systems, incorporating business rules for different applications with the objects and functions the tool supports.
  • Worked on Informatica tools like Source Analyzer, Warehouse Designer, Mapping Designer, Transformations, Repository Manager and Workflow Manager.
  • Extensively used transformations like Expression, Aggregator, Router, Sequence Generator, Lookup, Update Strategy, and Stored Procedure to create mappings.
  • Designed and implemented appropriate ETL mappings to extract and transform data from various sources to meet requirements.
  • FTPed a source file from a third-party vendor and loaded data into SAP BW through Informatica using a BAPI function.
  • Created and executed workflows using Workflow Manager.
  • Understanding existing business model and customer requirements.
  • Involved in the development and troubleshooting of Informatica mappings.
  • Involved in creating mappings for CDC (Change Data Capture) data extraction from AS/400 using Informatica PowerExchange.
  • Error Handling in Informatica mappings.
  • Involved in writing batch scripts to run jobs using PMCMD.
  • Created and monitored the sessions and workflows.
  • Worked with Informatica Debugger to debug the mappings in Designer.
  • Developed Informatica mappings and tuned them for better performance.
  • Delivered data enhancements successfully to the QA environment.
  • Involved in loading data from source to target using different loaders like TPUMP, FASTLOAD, MULTILOAD, BTEQ and Teradata Parallel Transporter (TPT) (LOAD, STREAM, UPSERT, and EXPORT.)
Environment
Informatica Power Center 9.1.2, Informatica Power Exchange 8.6, SAP BW, DB2, SQL Server 2005, Flat files, PL/SQL, SQL Assistant, iSeries(AS400), Teradata V13, XML, Erwin 4.2, Microsoft Visio, Tidal.
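The Teradata BTEQ loading mentioned above is usually driven by generated script files. A hedged sketch of that generation step; the logon string, schemas, and table names are illustrative only, and the script is written to disk rather than submitted to a Teradata system:

```shell
#!/bin/sh
# Illustrative generation of a BTEQ merge script; names are assumed.
BTEQ_FILE=merge_sales.bteq

cat > "$BTEQ_FILE" <<'EOF'
.LOGON tdhost/etl_user,password;

MERGE INTO edw.sales_fact AS tgt
USING stg.sales_stage AS src
   ON tgt.sale_id = src.sale_id
WHEN MATCHED THEN UPDATE
     SET sale_amt = src.sale_amt
WHEN NOT MATCHED THEN INSERT
     (sale_id, sale_amt) VALUES (src.sale_id, src.sale_amt);

.IF ERRORCODE <> 0 THEN .QUIT 8;
.LOGOFF;
.QUIT 0;
EOF

echo "Generated $BTEQ_FILE"
```

In practice the script would be submitted with `bteq < merge_sales.bteq`, with the exit code checked by the scheduler.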
Abbott Laboratories, Waukegan IL
Duration
Apr-11 to Aug-11
Role
Sr. ETL Developer
Responsibilities
Abbott is one of the world's top pharmaceutical companies by sales. The objective of this project is to calculate apportioning using physician-prescribed information and sales-force sales. The project also focuses on the alignment process of loading zip-to-territory information. The Sales Crediting process extracts transactions that map physician addresses to territories. Data is extracted from Oracle as part of various inbound processes and stored in the EDW.

  • Extracted data from multiple sources, which included relational sources, Flat-Files and VSAM Datasets.
  • Used various Informatica Transformations like Expressions, Filters, Joiners, Aggregators, Routers and Lookups to load better and consistent data.
  • Built new Informatica mappings/mapplets that extract data from different database systems.
  • Built Informatica mappings to load system flat files into the Enterprise Data Warehouse.
  • Developed mappings that include Slowly Changing Dimensions Type III.
  • Developed an Informatica process exclusively for backfill purposes.
  • Designed reusable mapplets to handle data profiling at the staging level.
  • Created reusable/non-reusable sessions and used them by changing session attributes.
  • Extensively used Email task for success, failure or abort job notification to client.
  • Experience in loading data into different database systems using relational connections and external loaders.
  • Created relational connections to read source data from different environment than the test server and alter Session attributes to set up the process that moves data for testing.
  • Identified bottlenecks using various methods and resolved the performance issues of the Informatica Mappings and Sessions
Environment
Informatica Power Center 6.2, Power Exchange5.1, Oracle 10g, HP AIX, PL/SQL, Windows XP OS/ Linux, Workbench 7.0.
COLUMBIA SPORTSWEAR, PORTLAND OR
Duration
May-10 to Apr-11
Role
Sr. ETL Designer/Developer
Responsibilities
The main objective of this project is to build the technology foundation to support Columbia's information needs: seed the new EDW with detail-level core corporate data, as prioritized, to support future business analytics, and build a separate data store for each region.

  • Used Informatica PowerCenter Designer to extract and transform data from various source systems, incorporating business rules for different applications with the objects and functions the tool supports.
  • Worked on Informatica tools like Source Analyzer, Warehouse Designer, Mapping Designer, Transformations, Repository Manager and Workflow Manager.
  • Developed Slowly Changing Dimension Mappings of type I, III and type II - Flag, Version Number, and Effective Date Range.
  • Extensively used transformations like Expression, Aggregator, Router, Sequence Generator, Lookup, Update Strategy, and Stored Procedure to create mappings.
  • Experience with high volume datasets from various sources like Text Files, XML, and Relational Tables.
  • Expertise in configuration, performance tuning & integration of various data sources like Oracle, MS SQL Server, XML, Flat files.
  • Designed and implemented appropriate ETL mappings to extract and transform data from various sources to meet requirements.
  • Created and executed workflows using Workflow Manager.
  • Understanding existing business model and customer requirements.
  • Involved in the development and troubleshooting of Informatica mappings.
  • Involved in creating mappings for CDC (Change Data Capture) data extraction from AS/400 using Informatica PowerExchange.
  • Error Handling in Informatica mappings.
  • Created and monitored the sessions and workflows.
  • Running Workflows and sessions using Enterprise Grid Option.
  • Daily ETL Load Process, BW Load Process, Master Data Matching, Tuning and Cleansing.
  • Worked with Informatica Debugger to debug the mappings in Designer.
  • Developed Informatica mappings and tuned them for better performance.
  • Delivered data enhancements successfully to the QA environment.
  • Involved in loading data from source to target using different loaders like TPUMP, FASTLOAD, MULTILOAD, BTEQ and Teradata Parallel Transporter (TPT) (LOAD, STREAM, UPSERT, and EXPORT.)
Environment
Informatica Power Center 8.6.1, Informatica Power Exchange 8.6, DB2, SQL Server 2005, Flat files, PL/SQL, SQL Assistant, iSeries(AS400), Teradata V13, XML, Erwin 4.2, Microsoft Visio.
Pfizer, Groton, CT
Duration
Mar-09 to Apr-10
Role
Informatica Developer
Tools
Informatica Power center 8.1/7.1, Oracle9i, SQL Server 2005, Informatica Power Exchange 8.0, DTS, T SQL, TOAD, Erwin 4.2 and UNIX Shell Script.
Responsibilities
Pfizer, Inc. is an American multinational pharmaceutical corporation and one of the world's largest pharmaceutical companies. Pfizer develops and produces medicines and vaccines for a wide range of medical disciplines. The objective of the project is to develop an Enterprise Data Warehouse system, establishing a common data warehouse technical infrastructure with a common data population architecture and supporting robust, end-user reporting capabilities for all corporate needs, as well as extracting data from flat files and loading it into the EDW and other application databases like Oracle using Informatica PowerCenter 8.1/7.1 tools.

  • Analyzed source data and gathered requirements from the business users.
  • Coordinated with source system owners; handled day-to-day ETL progress monitoring and data warehouse target schema (star schema) design and maintenance.
  • Worked on delimited flat file sources.
  • Created reusable transformations and Mapplets and used them in complex mappings.
  • Created different transformations for loading data into the target database, such as Source Qualifier, Joiner, Update Strategy, Lookup, Rank, Expression, Aggregator, and Sequence Generator.
  • Used mapping Parameters and Variables to pass the values between sessions.
  • Developed mappings with transformations and mapplets conforming to the business rules.
  • Created the (ER) Entity Relationship diagrams & maintained corresponding documentation for corporate data dictionary with all attributes, table names and constraints.
  • Database design and development, SQL stored procedures, Jobs and DTS packages in MS SQL server 2005.
  • Testing and Data Validation – each extract process will be validated, and data elements within each file will be reviewed to ensure data quality prior to passing the file to Risk Management.
  • Created tasks like Timer Event Raise, Event Wait, Decisions, and Email.
  • Documented all the DTS packages implemented to move data from source to target tables.
  • Delivered data enhancements successfully to the QA environment.
Sql Star, Hyderabad, India.
Duration
Dec-06 to Jan-09
Role
ETL & Report Developer
Responsibilities
Worked on the project Debit Campaigns, the goal of this project is to enrich the RPS DW with new customer relationship and payments data. Accomplishing this will reduce the burden of ad hoc reporting and data requests on MI&R (Marketing Intelligence & Research). Maintaining this data in the RPS DW will also facilitate and expedite the process of in-house analytics. RPS business lines will be provided self-serve access to the data and can use a variety of enterprise toolsets to analyze the data.

Debit Campaign Interchange Reporting:

The goal of this project is to provide campaign reporting for Debit campaigns in an automated fashion on a go forward basis. Additionally, we should have the ability to bring in data on previously mailed campaigns that will need reporting for the RPS DW team.
  • Responsible for gathering and refining business requirements from the Business users.
  • Prepared the logical and physical Data Models.
  • Extensively worked on creating Metrics, Derived Metrics and Filters for Complex Reports.
  • Created reports using Advanced Metric Objects, Filters, and Prompts and thus provided users with the facility to create customized reports by creating their own filters and templates.
  • Generated advanced reports in grid mode and graph mode using MicroStrategy Desktop.
  • Created advanced grid reports, consisting of data analysis by combining a template with filters. The end-users were able to generate ad-hoc report by drilling up, down, within and across dimension or anywhere.
  • Involved in creating users in MicroStrategy Desktop for testing the reports on the Web.
  • Interacted with business users and support personnel to make sure the reports were picking up the correct data.
  • Providing Scorecard reporting and ad-hoc reporting for decision support.
  • Created reusable Transformation objects that enabled time-series analysis for comparative analysis.
  • Reduced report run times by taking advantage of VLDB properties like star join, query and sub-query optimizations for Oracle 9i, and SQL hints.
Environment
Informatica 7.1, Microstrategy 8, DB2, VI Editor and UNIX