Denodo Developer
Suri

Professional Summary

  • Seven years of IT experience in Data Warehousing and Data Migration/Data Conversion (ETL), involved in the Analysis, Design, and Development of various business applications on different platforms using Informatica 9.1/8.6, Oracle 12c/11g/10g/9i, Pentaho PDI, and Denodo 5.5/6.0.
  • Experience in gathering Business Requirements, Data Analysis, Data Modeling (both Logical and Physical), and Design of Data Warehouses using both Star Schema and Snowflake Schema.
  • Experience in documenting and designing the Data Warehouse, ETL data mapping, and Reporting specifications.
  • Strong design and development experience on the Denodo Data Virtualization platform.
  • Strong technical expertise in the Scheduler and VDP components of Denodo.
  • Experience with Denodo caching approaches across different databases.
  • Strong technical expertise in the Informatica ETL tool, PL/SQL, and T-SQL.
  • Extensively used Transformations like Source Qualifier, Filter, Joiner, Rank, Sequence generator, Aggregator, Expression, Lookup, Router, Stored Procedure, Update strategy, Sorter, Normalizer, Union Transformations while developing Informatica mappings.
  • Experience with the Informatica IDQ tool for providing web services.
  • Experience with Informatica Cloud.
  • Good knowledge of the SFDC (Salesforce.com) application.
  • Strong knowledge of Source Data Analysis, Data Profiling, and Data Cleansing.
  • Highly skilled in writing PL/SQL scripts containing Functions, Stored Procedures, Triggers, Packages, Views, etc., and in creating Materialized Views for Aggregated/Summarized data.
  • Expertise in UNIX Shell Scripting.
  • Expertise in all phases of testing (Unit, System, Integration, and User Acceptance) and in creating Test Plans and Test Scripts.
  • Possess very good interpersonal and communication skills; have individually managed various clients; a strong individual contributor and team player.

Technical Skills

Operating Systems
Windows XP/2000/7, Unix
Languages
UNIX Shell Scripting, PL/SQL
Databases
Redshift, Oracle 12c/11g/10g/9i, Greenplum, PostgreSQL, SQL Server 2000/2005, MySQL, MS Access, Amazon Aurora (MySQL-compatible engine for Amazon RDS), Derby
ETL Tools
PL/SQL, Informatica 9.1/8.6, Denodo Platform 5.5/6.0 and Pentaho PDI 5.4/6.0
Version Control Tools
SVN, GIT
Reporting Tools
Business Objects, OBIEE and Tableau
GUI Tools
SQL*Plus, SQL*Loader, TOAD, PL/SQL Developer and Shell Scripts
Life Cycle Expertise
Experience in all phases of SDLC and Data Warehousing life cycle methodologies; experienced in both Development and Support projects

Education

  • BS in Engineering

Professional Experience

Logitech, Newark, CA
Duration
July 2014 – Present
Role
Senior ETL Developer / Denodo Developer
Responsibilities
Logitech is a global provider of personal computer and tablet accessories. I am working with the BI EDW team on multiple projects (Cloudy, SFDC (Customercare), and Scrapy).

  • Working as a Senior ETL and Denodo developer
  • Responsible for the design and development of ETL processes
  • Extensively working on gathering Business Requirements, Data Profiling, Data Cleansing, Migrations and Reporting
  • Working on the Denodo data virtualization tool to read data from SFDC and other sources and provide reports to users.
  • Responsible for Denodo VDP and Scheduler tools.
  • Designed and developed high-quality integration solutions consistent with the given requirements using Denodo
  • Collaborating with other application development teams to design, develop, and deploy the best solutions to ensure a high level of customer service
  • Creating and monitoring Denodo Scheduler jobs in the Denodo Administration tool.
  • Read tables from Amazon Redshift into Denodo, implemented business logic in Denodo, and exposed the final business views in Tableau.
  • Created custom views and final business views in Denodo VDP.
  • Created caching jobs on different databases such as MySQL, Amazon Aurora (MySQL-compatible engine for Amazon RDS), and Derby.
  • Created Informatica Mappings to load data from sources to Staging, from Staging to the Target database, and from the Target database to the Data Warehouse.
  • Created audit tables comparing the EDW database to the Redshift database using the Denodo tool.
  • Worked with SFDC/Legacy teams to gather requirements and load legacy data into SFDC.
  • Worked on the Informatica IDQ tool to provide web services to users.
  • Created Salesforce-to-EDW mappings using the Informatica Cloud platform.
  • Involved in the Oracle R12 migration.
  • Created various Sources, Targets, Mappings, Workflows using Informatica Power Center
  • Implemented performance tuning logic on targets, sources, mappings, and sessions to provide maximum efficiency and performance
  • Extensively used Informatica Client tools: Designer, Workflow Manager and Workflow Monitor.
  • Parameterized the mappings and increased the re-usability
  • Created the Test Plans and Test Cases for ETL Process.
  • Supported the Client in User Acceptance Testing
  • Involved in different POC projects covering the Pentaho Data Integration tool, the Redshift database, and the Denodo data virtualization tool.
Environment
Oracle 11g, SQL Server 2005, MySQL, Informatica 9.1/8.6, IDQ, PL/SQL, Denodo Platform 5.5/6.0, OBIEE, Salesforce.com, UNIX shell scripting, Pentaho PDI 5.4, Redshift (AWS), Tableau
VMware, Palo Alto, CA
Duration
Feb 2012 – June 2014
Role
Data Warehouse developer
Responsibilities
VMware provides cloud and virtualization software and services. I worked with the DSS (Decision Support Systems) team on multiple projects (Model N BI Reporting, Web Flash, and DISTI API).

  • Worked as a Data Warehouse developer, responsible for the design and development of the Data Warehouse and ETL processes
  • Gathered the Business Requirements for Data Profiling, Cleansing, Migration and Reporting from Data Warehouse
  • Created Detail Design Specification for Data Warehouse, ETL (including Data Profiling and Cleansing) and Reporting
  • Created the Logical and Physical Data Model for the Data Warehouse.
  • Created Informatica Mappings to load data from sources to Staging, from Staging to the Target database, and from the Target database to the Data Warehouse.
  • Implemented performance tuning logic on targets, sources, mappings, sessions to provide maximum efficiency and performance
  • Worked extensively on Informatica transformations like Source Qualifier, Expression, Filter, Router, Aggregator, Lookup, Update Strategy, Stored Procedure, Sequence generator, Joiner and Normalizer.
  • Parameterized the mappings and increased the re-usability
  • Used PL/SQL procedures in Informatica mappings to truncate data in targets at run time.
  • Created and modified PL/SQL (Functions, Procedures, Packages, and Triggers) and Shell Scripts to improve the performance of the system
  • Performance techniques used included analyzing Explain Plans, TKPROF, DBMS_SCHEDULER, Indexes, Partitioning, and Materialized Views
  • Developed PL/SQL stored procedures and user-defined functions for complex calculations and bundled them into stored packages that could be invoked from Forms triggers
  • Created Custom Triggers, Stored Procedures, Packages and SQL Scripts
  • Carried out Data Profiling and Source Data Analysis and made recommendations for errors/defects in the data, Data Cleansing, and modifications to the ETL logic.
  • Created the ETL exception reports and validation reports after the data was loaded into the warehouse database
  • Drove the Performance Tuning efforts for the complex transformation logic and queries used in both ETL and Reporting by creating Indexes such as Bitmap, B-tree, and function-based indexes.
  • Created the Test Plans and Test Cases for the Data Warehouse and ETL Process.
  • Supported the Client in User Acceptance Testing
Environment
Oracle 11g, Greenplum, Informatica 9.1/8.6, PL/SQL and UNIX shell scripting
State of Colorado Department of Revenue, Denver, CO
Duration
Apr 2010 – Jan 2012
Role
ETL developer
Responsibilities
The CITA Project involved implementing an enterprise-class tax system called GenTax that is used by users geographically dispersed across Colorado. GenTax uses a multi-tier distributed architecture that can be divided into three layers: Interface (Taxpayer or DOR user), Business, and Data. The project (Phases II, III, and IV) involved migrating Tax Return data (Individual and Business, Motor Fuel, and IFTA) from Legacy Systems to GenTax.

  • Gathered the Business Requirements and responsible for Data Migration and Data Cleansing.
  • Created the Detail Design Specification and Technical Specifications for ETL Process including Data Cleansing rules.
  • Designed and Developed the Staging Area for historical data migration.
  • Created the logical flow of Data Conversion, including source data analysis, data profiling, and data cleansing for the historical data; created Test Plans and Test Scripts and performed testing of the data conversion process during all phases of testing, from Unit, System, and Integration through UAT
  • Analyzed the Source Data and made recommendations/modifications to the ETL logic
  • Developed complex mappings in Informatica to load the data from various sources into GenTax
  • Parameterized the mappings and increased the re-usability.
  • Used Informatica Power Center Workflow manager to create sessions, workflows and batches to run with the logic embedded in the mappings.
  • Applied Slowly Changing Dimensions (Type 1 and Type 2) effectively to handle the delta loads.
  • Created Materialized Views on Summary Tables for Data Summarization for Reporting.
  • Performed Analysis, Performance Tuning of the Informatica Mappings and Queries for Reporting by creating Indexes and Materialized Views.
  • Extensively worked on Oracle Packages, Triggers, procedures, Functions, Database links, Synonyms, Indexes, Sequences, Views, Materialized views and Cursors.
  • Drove the Performance Tuning efforts of the complex transformation logic and queries
  • Assisted in Integration and User Acceptance Testing.
Environment
Oracle 10g, Informatica 8.6, PL/SQL, ETL, SQL*Loader, TOAD, MS Office, FCR tool and FTP
Fiserv, Denver, CO
Duration
June 2008 – Mar 2010
Role
PL/SQL and ETL developer
Responsibilities
Fiserv offers both integrated browser-based J2EE solutions and IBM iSeries solutions that enable users to track all information associated with Workers' Compensation policies and claims in a seamless environment. PowerSuite is a fully integrated yet modular Policy and Claims software suite that provides end-to-end core processing for Workers' Compensation. It is a comprehensive all-in-one suite that covers policy administration/underwriting, claims administration, managed care, database and remarks tracking, and imaging needs.

  • Migrated/converted three clients' legacy data into the PowerSuite Workers' Compensation product
  • Designed the ETL process using PL/SQL and Informatica
  • Designed and created 100+ mappings to move data from source to PowerSuite
  • Involved in gathering Requirements, Systems Analysis, preparing Functional Specifications, Design Reviews, Plan Reviews, Implementation and Post Implementation
  • Participated in project planning sessions and all technical client meetings.
  • Created Informatica Mappings to load data from Source to Staging and from the Staging area to Target by applying business rules.
  • Created various Sources, Targets, Mappings, Workflows using Informatica Power Center
  • Developed T-SQL, PL/SQL (Functions, Procedures, Packages, Views and Triggers).
  • Carried out Data Profiling and Source Data Analysis and made recommendations for errors/defects in the data, Data Cleansing, and modifications to the conversion logic wherever necessary.
  • Provided 24/7 application support for the post-production process.
Environment
Oracle 9i/10g, SQL Server 2000, Informatica 8.6, UNIX shell scripting, PL/SQL, T-SQL, TOAD, PL/SQL Developer and MS Office