Denodo Developer
Satya

Professional Summary

  • Over 9 years of IT experience in Data Warehousing, Data Analysis, Reporting, ETL, Data Modeling, Development, Maintenance, Testing, and Documentation.
  • Hands-on experience in all stages of the Software Development Life Cycle (SDLC), including business requirement analysis, data modeling, build, unit testing, systems testing, and user acceptance testing for Data Warehouse builds.
  • Extensive Data Warehouse, Data Migration and Data Integration experience using Informatica Power Center as ETL tool on Sybase, Oracle, SQL Server and DB2 Databases.
  • Experience in Informatica administration: setting up new users, connections, and folders, creating deployment groups to move code from one repository to another, and maintaining the Informatica services.
  • Strong skills in Data Analysis, Data Requirement Analysis, and Data Mapping for ETL processes.
  • Extensive experience with Sybase, SQL Server, Oracle 11g/10g/9i/8.x/7.x.
  • Expertise in Dimensional data modeling, Star schema modeling, Snowflake modeling, Normalization.
  • Experience in performance tuning of Informatica Mappings, Sessions and SQL queries.
  • Proficient in shell scripting and Perl scripting.
  • Strong experience in coding using SQL, PL/SQL.
  • Extensively used SQL, PL/SQL and T-SQL to write Stored Procedures, Functions, Packages and Triggers.
  • Extensive experience in the Facets 4.71 Data Model and with the tables and data in the Facets product.
  • Experience in working with agile scrum/Kanban and waterfall methodologies.
  • Working knowledge of Informatica CDC (Change Data Capture).
  • Experience in using reporting tools such as COGNOS, Spotfire.
  • Experience in using Denodo as a data virtualization tool.
  • Experience in using Informatica tools such as IDQ, MDM and DVO.
  • Self-motivated, able to set effective priorities to achieve immediate and long-term goals and meet operational deadlines.

Technical Skills

Languages
SQL, PL/SQL, T-SQL, Perl, HTML, C, C++, C#, JavaScript, Linux/Unix Shell Script (sh, ksh, bash), VQL, Python.
Applications
Informatica PowerCenter 9.x/8.x/7.x, Informatica DVO, IDQ, MDM, Power Exchange, Denodo 5.5, ICS, Cognos, Spotfire, Facets, MQ Series, JMS
Data Modelling
Dimensional Data Modeling, Star Join Schema Modeling, Snow Flake Modeling, FACT and Dimension Tables, Physical and Logical Data Modeling.
Databases
Oracle, MS SQL Server, MS Access, IBM DB2, Sybase.
Environments
UNIX, Windows NT, Windows server, HP-Unix, Linux

Education

  • Master’s in Engineering

Professional Experience

Oppenheimerfunds Inc, Denver, CO
Duration
September 2011 - Current
Role
Enterprise Data warehouse Analyst/ Sr. Data Engineer
Responsibilities
Oppenheimerfunds Inc. (OFI) is one of the largest investment advisers to mutual funds sold through independent broker-dealers and financial institutions in the US. As an Enterprise Data Warehouse Analyst / Data Engineer, I worked on multiple data-related projects (ASPECT, Salesforce, SDW, MSTAR, IDW, MDM, FDM, Abandoned Property) in which new data is integrated, transformed, and loaded into existing enterprise data warehouses and data marts.

  • Involved in Business Requirement Analysis and prepared Functional specification Documents.
  • Actively involved in Dimensional Modeling (Star Schema) of the Shareholder Data Warehouse and used modeling tools (Visio) to design the business process, granularity, dimensions, and facts.
  • Actively Involved in the ETL Technical Design Discussions and prepared high-level technical design documents.
  • Formed a Data Group with other team members to come up with best strategies/practices/policies for the ETL code development.
  • Developed complex Informatica mappings to populate Type 1 and Type 2 dimension tables and fact tables using different kinds of Informatica transformations.
  • Developed PL/SQL procedures and packages to implement complex ETL logic, maintain partitions, and prevent table updates from competing processes; these are reused in many other data projects across the company.
  • Developed dynamic parameter file creation for some of the ETL code for easy maintainability.
  • Identified slow-running ETL jobs using Informatica repository queries and reduced their run time by applying various performance-tuning techniques.
  • Mentored junior developers to improve their knowledge of Informatica and Oracle SQL and PL/SQL.
  • Used Denodo on one project in which the data was staged using Denodo virtualization.
  • Extensively worked on Informatica/SQL performance tuning to find potential bottlenecks in the source, target systems, mappings and SQL queries.
  • Developed ETL code in Informatica Cloud Services (ICS) to integrate the Dealer/Advisor data into the Salesforce cloud.
  • Used Informatica DVO for performing Automation testing in different projects.
  • Worked with a team member to create mappings in IDQ for address standardization across the enterprise using IDQ Address Doctor.
  • Worked with the Informatica MDM team to integrate new dealer data from Salesforce, developed ETL code for contribution and consumption, and gained a strong working knowledge of Informatica MDM on this project.
  • Performed some Informatica Administration tasks to create relational connections, deployment of code.
  • Worked with COGNOS team to build frameworks for couple of projects.
  • Implemented real-time Change Data Capture (CDC) using Informatica Power Exchange for JMS queues.
  • Worked in Agile Scrum/ Kanban Methodology.
  • Used Autosys for scheduling the ETL Jobs.
  • Created Unix Shell scripts for performing some data scrubbing/ cleansing for the incoming data.
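The dynamic parameter file creation mentioned above can be sketched as follows. This is an illustrative Python sketch only; the folder, workflow, and parameter names are hypothetical placeholders, not the project's actual values.

```python
from datetime import date

def build_param_file(folder, workflow, params):
    """Render a PowerCenter-style parameter file section as a string,
    so runtime values (load date, source directory) need not be hard-coded."""
    lines = [f"[{folder}.WF:{workflow}]"]  # section header: folder + workflow
    lines += [f"{name}={value}" for name, value in params.items()]
    return "\n".join(lines) + "\n"

# Hypothetical folder/workflow/parameter names for illustration
content = build_param_file(
    "SDW_FOLDER", "wf_daily_load",
    {"$$LOAD_DATE": date(2024, 1, 15).isoformat(), "$$SRC_DIR": "/data/inbound"},
)
```

A scheduler (e.g., Autosys, as used above) would write this string to the parameter file path the workflow reads before each run.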
Environment
Informatica Power Center 9.1.1/8.6.1, Oracle 11g/10g/9i, Oracle Exadata, IDQ, Informatica DVO, Informatica MDM, SQL Server 2008/2010, Sybase, Toad, Tortoise SVN, JIRA, COGNOS, SPOTFIRE, Denodo 5.5
The TriZetto Group, Inc., Denver, CO
Duration
April 2010 - September 2011
Role
ETL Developer/Project Lead
Responsibilities
TriZetto is a leader in integrated healthcare management; its technology solutions touch nearly half of the U.S. insured population, and Facets is TriZetto's enterprise-wide software solution for health plan administration. As an ETL consultant, I built the ETL code for one of their clients who had recently purchased the Facets product. As an ETL Developer, my main responsibilities were to build, maintain, and support the code; as a Project Lead, I coordinated the entire team. My main responsibilities in this project were:

  • Involved in requirements gathering meetings on how the interfaces need to be developed.
  • Actively involved in the Development/ Design phases of the project.
  • Designed the technical specification documents and Functional design documents as per the business requirements.
  • Worked with team developers in implementing the best strategies/practices for code development.
  • Built the ETL Informatica code for each interface for the vendors with their respective system to process their business transactions with respect to the Health care services.
  • Worked as Project Lead, coordinating between TriZetto and LHP.
  • Reviewed the build/code and design documents.
  • Performed the Unit testing/Performance testing.
  • Provided 24x7 production support for the Production Batch jobs and assisted in resolving the failures immediately to avoid any impact on the business.
  • Extensively involved in tuning of the Informatica mappings and SQL queries.
  • Involved in writing Perl Scripts for Data Scrubbing and Modification for the inbound files.
  • Helped the testing team in writing the Test Cases which covers all the different Business scenarios that can happen in Production.
  • Worked with the Production Support team on scheduling the batch jobs and improved the performance of batch runs by providing substantial input.
  • Worked with PVCS tool to maintain the code using Version Control.
  • Lead/Coordinated a team of 10+ members.
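The data-scrubbing work described above was done in Perl; the sketch below shows the same kind of inbound-file cleansing (trimming fields, normalizing dates, dropping malformed rows) in Python for illustration. The three-column layout is hypothetical, not the actual interface format.

```python
import csv
import io
import re

DATE_RE = re.compile(r"^(\d{2})/(\d{2})/(\d{4})$")  # MM/DD/YYYY

def scrub(raw_text):
    """Drop malformed rows, trim whitespace, upper-case names,
    and convert MM/DD/YYYY dates to ISO YYYY-MM-DD."""
    clean_rows = []
    for row in csv.reader(io.StringIO(raw_text)):
        if len(row) != 3:                       # drop malformed rows
            continue
        member_id, name, dob = (field.strip() for field in row)
        m = DATE_RE.match(dob)
        if not m:                               # drop rows with bad dates
            continue
        dob = f"{m.group(3)}-{m.group(1)}-{m.group(2)}"
        clean_rows.append([member_id, name.upper(), dob])
    return clean_rows

rows = scrub("1001, smith ,01/05/1980\nbadrow\n1002,jones,03/20/1999\n")
```

In the actual batch, a script like this would run against each inbound vendor file before the Informatica session picks it up.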
Environment
Informatica Power Center 8.6.1, Sybase, SQL, T-SQL, RapidSQL, UNIX Shell Scripting, UNIX, Facets 4.71, PVCS
WellPoint, Inc. , Denver, CO
Duration
Jan 2009 – Mar 2010
Role
ETL Developer
Responsibilities
WellPoint is one of the largest healthcare providers in the USA, spanning several states, and has a huge enterprise data warehouse. As an Informatica ETL admin, developer, and analyst, my main responsibilities were to build and maintain part of the data warehouse. I worked on a couple of projects (WCC & Facets) that build the part of the data warehouse used for web reporting purposes. As an Informatica admin, I worked on the upgrade of Informatica from 8.1.1 to 8.6.1. I worked for the most part as an admin/developer on these projects, and my main responsibilities were:

  • Participated in the requirements gathering meetings with End users, business analysts.
  • Helped in designing the working structure for Informatica with different folders to be created at different tiers.
  • Actively involved in the Development/ Design phases of the project.
  • Worked with team developers in implementing the best strategies/practices for code development.
  • Developed the Informatica mappings that pull data from the Facets Sybase servers to the Informatica servers, from which it is loaded into Teradata tables using Informatica.
  • Assisted the Development team in developing some of the critical code in Informatica by using all the Informatica tools.
  • Debugged and resolved some of the job failures in Informatica by making changes to the code and helped finding the issues in the code which cause the workflow failures.
  • Extensively involved in tuning of the Informatica mappings and SQL queries.
  • As an Informatica admin, worked on the upgrade of Informatica from 8.1.1 to 8.6.1 and helped the system admin with the testing procedures for the new version.
  • Created users/groups/folders and associated UNIX directories for the smooth development of the code.
  • Created Informatica connections pointing to all types of databases (Teradata, Oracle, Sybase) as well as external loader connections, and made the corresponding changes to the ODBC file and tnsnames.ora.
  • As an admin worked on the failures of the Informatica services and production job failures.
  • Worked on migration of the Informatica code from Dev to UAT to Prod repositories by using different migration procedures.
  • Worked on the migration of UNIX components across Dev/UAT/Prod servers.
  • Documented all the Functional and technical specifications.
  • Provided 24x7 production support for business users and documented problems and solutions for running the workflows.
  • Developed mappings to pull repository OPB table information into local database tables, which are further used for reporting purposes.
  • Wrote queries, run every month, to gather statistics on each repository, such as the number of jobs run, repository size, etc.
  • Worked as a part of Informatica admin team in maintaining all the repositories across all the data warehouses.
  • Used SQL tools like Embarcadero RapidSQL to run SQL queries and validate the data.
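The monthly repository-statistics queries described above are the kind of aggregation sketched below. This is a hedged illustration: the real queries ran against Informatica's repository (OPB) tables, whereas here a hypothetical run-log table in an in-memory SQLite database stands in so the example is self-contained.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# Hypothetical stand-in for an Informatica repository run-log table
conn.execute("""CREATE TABLE wf_run_log (
    repo_name TEXT, workflow TEXT, start_time TEXT)""")
conn.executemany(
    "INSERT INTO wf_run_log VALUES (?, ?, ?)",
    [("DEV_REPO", "wf_a", "2024-01-03"),
     ("DEV_REPO", "wf_b", "2024-01-10"),
     ("PRD_REPO", "wf_a", "2024-01-05")],
)
# Jobs run per repository for the month, the core of the monthly stats report
stats = conn.execute(
    """SELECT repo_name, COUNT(*) AS jobs_run
       FROM wf_run_log
       WHERE start_time BETWEEN '2024-01-01' AND '2024-01-31'
       GROUP BY repo_name
       ORDER BY repo_name"""
).fetchall()
```

The real version would add size metrics and run against each Dev/UAT/Prod repository in turn.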
Environment
Informatica Power Center 8.1.1/8.6.1, Sybase, Oracle 9i/10g, Teradata, SQL, PL/SQL, TOAD, UNIX Shell Scripting, UNIX, SQL Server 2005, WLM, Facets 4.71
Oppenheimerfunds Inc, Denver, CO
Duration
July 2007- Dec 2008
Role
ETL Developer and Data warehouse Analyst
Responsibilities
Oppenheimerfunds Inc. (OFI) is one of the largest investment advisers to mutual funds sold through independent broker-dealers and financial institutions in the US. As an ETL Developer, I worked on a couple of projects building the enterprise data warehouse. In the first project, as an ETL Developer on the Data Warehouse team, my main responsibility was to build and maintain one of the data marts based on the data in the source system (ORIN), which is further used for reporting to end users for decision-making purposes. In the second project, as an ETL Developer, my main responsibility was to integrate and transform the data from different source systems (Power Agent, DST, SAMI) into a normalized ODS (Operational Data Store), which is further used to support the DSTVision application. As an ETL Developer and Data Warehouse Analyst, my main responsibilities in both projects were:

  • Participated in the requirements gathering meetings with End Users, Business Analysts and Project Manager.
  • Helped in designing the logical data model for the Data mart which is a star schema design and for ODS which is a normalized data model with referential integrities.
  • Developed complex Informatica mappings to populate Type 1 and Type 2 Slowly Changing Dimension tables and fact tables using different kinds of transformations, such as Source Qualifier, Aggregator, Sorter, Expression, Joiner, connected and unconnected Lookups, Filter, Stored Procedure, Router, and Update Strategy.
  • Created reusable mapplets and worklets for reusability and easy readability of the code.
  • Extensively dealt with different types of sources such as flat files, COBOL copybooks, and relational tables in Sybase and SQL Server.
  • Partially involved in writing the UNIX shell scripts for job scheduling.
  • Extensively tuned the SQL queries which are being used as a part of different transformations such as look ups and source qualifiers by using the SQL tool SQL Developer and Toad.
  • Extensively tuned all the Informatica mappings by finding the bottlenecks in different areas.
  • Worked with DBAs to make the daily loads run much faster by tuning different aspects of the load jobs.
  • Involved in suggesting the Materialized view creations for faster reporting purposes according to the business requirements.
  • Documented all the Functional and technical specifications.
  • Worked closely with the testing team helping them writing the test cases and fixing the bugs.
  • Involved in performing DQA on the source systems’ data and, based on that, performed data cleansing on the source systems’ data using Informatica and First Logic software.
  • Created a two-partition strategy on the Oracle tables to support high availability of data for the real-time DST Vision application.
  • Designed and Implemented the Error table strategy which catches the data issues in the daily load and reports them to the end users by sending an Email after the load is finished.
  • Supported the daily load batch process and gave post production support by fixing some of the rare post production issues.
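The error-table strategy described above can be sketched as follows, as a hedged illustration: rows that fail validation during the daily load are captured for an error table, and a summary e-mail is composed for the end users once the load finishes. The validation rules, column names, and addresses are hypothetical, not the project's own.

```python
from email.message import EmailMessage

def load_with_error_capture(rows):
    """Split incoming rows into loadable rows and error-table rows,
    tagging each rejected row with a reason."""
    loaded, errors = [], []
    for row in rows:
        if row.get("account_id") and row.get("amount", 0) >= 0:
            loaded.append(row)
        else:
            errors.append({**row, "error_reason": "missing id or negative amount"})
    return loaded, errors

def build_summary_email(errors):
    """Compose the post-load notification sent to end users."""
    msg = EmailMessage()
    msg["Subject"] = f"Daily load finished: {len(errors)} row(s) in error table"
    msg["To"] = "dw-users@example.com"  # placeholder address
    msg.set_content("\n".join(e["error_reason"] for e in errors) or "Clean load")
    return msg

loaded, errors = load_with_error_capture(
    [{"account_id": "A1", "amount": 100},
     {"account_id": "", "amount": 50},
     {"account_id": "A2", "amount": -5}]
)
msg = build_summary_email(errors)
```

In production this ran inside the batch, with the error rows inserted into an Oracle error table and the e-mail sent via the enterprise mail relay.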
Environment
Informatica Power Center 8.1.1/7.1.4, Oracle 11g/10g, SQL, PL/SQL, SQL Developer, TOAD, UNIX Shell Scripting, Mercury Quality Center, Windows XP, UNIX, Sybase, SQL Server 2005, First Logic, Autosys, ULP.
Arrow Electronics Inc, Denver, CO
Duration
Oct 2006- Jun 2007
Role
ETL Developer
Responsibilities
Arrow ECS is the global business group of Arrow Electronics, Inc. that provides enterprise and midrange computing products, services and solutions to value-added resellers, system integrators, and independent software vendors (ISVs). The project involves building and maintaining data warehouse for Arrow ECS and meeting the organizational reporting needs. Support for Arrow ECS under this project involves Development, Production support and maintenance of Arrow ECS Data warehouse in the following activities:

  • Requirements gathering and analysis.
  • Participated in the data model design according to the business requirements.
  • Created design specification documents, including source-to-target mappings, for Mainframe/operational source/financial systems to Stage, Stage to ODS, and ODS to data warehouse tables.
  • Translated Business Requirements into Informatica mappings to build Data Warehouse by using Informatica Designer, which populated the data into the target Star Schema on Oracle 9i Instance.
  • Extensively used Router, Joiner, Lookup, Aggregator, Expression and Filter transformations in mappings.
  • Extensively worked with performance tuning of the mappings by implementing the Hash Key algorithms for the flat files.
  • Debugged and resolved load failures by verifying the log files. Supported QA Testing in fixing the bugs and also helped to resolve various data issues.
  • Maintained a very good interaction with analysts, Project Managers, architects and testers to have efficient and better results.
  • Used SQL tools like TOAD to run SQL queries and validate the data.
  • Partially involved in writing the UNIX Shell Scripts, which triggers the workflows to run in a particular order as a part of the daily loading into the Warehouse.
  • Involved in implementing error trapping in all the mappings, where all errors are trapped and loaded into an ERROR table for reporting to the business users.
  • Provided 24x7 production support for business users and documented problems and solutions for running the workflows.
Environment
Informatica Power Center 7.1.4/8.1.1, Oracle 9i, SQL, PL/SQL, TOAD, UNIX Shell Scripting, Test Director, Windows XP, UNIX.
Cleveland State University
Duration
Aug 2005 – Sep 2006
Role
ETL Developer (Intern position)
Responsibilities
The aim of CDS (Consolidated Data Source) was to build a Data Warehouse to capture and store data from OLTP (Online Transaction Processing System) for business analysis. CDS was built to continually upload databases coming from different source systems nationwide. Informatica was used to load source data into the databases. This Data Warehouse is also used for decision making and business analysis.

  • Involved in gathering business requirements in liaison with business users and technical teams. Created requirement specification documents and identified data sources and targets.
  • Worked with the Informatica Power Center 7.1 tools - Source Analyzer, Warehouse Designer, Mapping Designer, Workflow Manager, Mapplet Designer, and Transformation Developer.
  • Translated Business processes into Informatica mappings to build Data marts by using Informatica Designer, which populated the data into the target Star Schema on Oracle 9i Instance.
  • Identified conformed dimensions, degenerated dimensions and aggregates, assigned primary keys, surrogate keys and indexes across fact and dimensions tables.
  • Extensively used Router, Joiner, Lookup, Aggregator, Expression and Update Strategy transformations in mappings.
  • Extensively used ETL to load data from wide range of sources such as flat files (CSV, fixed length), Oracle to Oracle.
  • Extensively worked with performance tuning of the mappings, sessions and Stored Procedures.
  • Developed re-usable transformations, mappings and mapplets conforming to the business rules.
  • Created reusable Worklets involving many tasks to use in workflows.
  • Involved in the design, development and testing of the PL/SQL stored procedures and packages for the ETL processes.
  • Used SQL tools like TOAD to run SQL queries and validate the data.
  • Performed logical and physical Data Modeling/ER-Modeling using Erwin.
  • Involved in writing UNIX shell scripts for scheduling to automate the load process.
Environment
Informatica Power Center 7.1.3, Teradata, Oracle 9i, SQL, DB2, PL/SQL, Erwin 4.1.4, TOAD, UNIX Shell Scripting, Test Director, Windows XP, UNIX.
Aurobindo Pharma Ltd., India
Duration
Aug 2002 – Aug 2003
Role
Database Developer
Responsibilities
The primary objective of AcuTrack 2.1 was to accurately optimize tracking and analysis of various employee and customer transactions, which involved analysis, design, development, and testing of HR and customer data marts. Data was extracted from multiple data sources to load into the warehouse.

  • Designed and developed database in Oracle.
  • Wrote functional specifications to determine the application performance, the application user interface and to help manage the user expectations.
  • Improved the existing logical design by eliminating certain entities.
  • Responsible for creating ER diagrams, tables, views, table-level constraints, and triggers to enforce referential integrity and apply business rules for the above-referenced information system models, and normalized the data up to second normal form.
  • Created tables and inserted the required data.
  • Extensively coded PL/SQL writing Stored Procedures and packages including functions.
  • Responsible for backup and recovery, tuning and security.
Environment
Oracle 7.3, SQL*Plus 3.3, Windows 98.