Employment Type: Freelance Contract
Role: Data Engineer
Project: Energy Data Lake
Project Technology Stack
Cloud Platform: Microsoft Azure
Source System: Azure Blobs, REST API, MS SQL Server, Snowflake, CSV, Excel, XML, etc.
Target System: Microsoft Azure SQL DB, MS SQL Server, Snowflake, CSV
ETL Tool/Programming Language: Talend Data Integration, Azure Data Factory V2, Python
Other programming languages: Python, T-SQL, SnowSQL
Scheduling Tool: Azure Data Factory Triggers, Talend Management Console
Other Azure tools: Azure Data Explorer, Azure Data Studio
Project Details:
Project 1:
Health, Safety, Security and the Environment (HSSE) Reporting Migration from SQL Server to Snowflake Data Warehouse
This project involves extracting incident and case data from Incident Management Systems via REST APIs. These incidents and cases are recorded at various energy assets such as power plants, units, and weirs. The extracted data is enriched with central asset master data and then loaded into report tables for generating HSSE reports in Tableau.
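The enrichment step described above can be sketched in plain Python. This is a minimal illustration, not the project's actual code; all field names (case_id, asset_id, severity, etc.) are illustrative assumptions, and the REST extraction is replaced by an in-memory list.

```python
# Sketch: enrich incident records (as extracted from an incident-management
# REST API) with central asset master data before loading report tables.
# All field names here are illustrative assumptions.

def enrich_incidents(incidents, asset_master):
    """Join each incident with its asset's master-data attributes."""
    enriched = []
    for inc in incidents:
        asset = asset_master.get(inc["asset_id"], {})
        enriched.append({
            **inc,
            "asset_name": asset.get("name", "UNKNOWN"),
            "asset_type": asset.get("type", "UNKNOWN"),
        })
    return enriched

incidents = [{"case_id": 1, "asset_id": "PP-01", "severity": "high"}]
asset_master = {"PP-01": {"name": "Main Power Plant", "type": "powerplant"}}
report_rows = enrich_incidents(incidents, asset_master)
```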
Project 2:
Asset Management Reporting
This project involves extracting data from Azure Blob containers into the Snowflake Data Warehouse for Asset Management Reporting. SAP Plant Maintenance data containing notifications, orders, and master data is loaded into CSV files in Azure Blob containers. These CSVs are then read by Python scripts and loaded into the import layer; calculations and versioning are performed in the raw layer, and the results are loaded into the reporting layer for use by Tableau reports.
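The three-layer flow described above can be sketched as follows. This is a simplified, self-contained illustration: the Blob download is replaced by an in-memory CSV, and the column names and versioning rule are assumptions, not the project's actual schema.

```python
import csv
import io

# Sketch of the layered load: CSV from Blob storage -> import layer ->
# raw layer (calculations/versioning) -> reporting layer.
# Column names and the versioning rule are illustrative.

blob_csv = "notification_id,cost\nN1,100\nN1,120\nN2,50\n"

# Import layer: parse the CSV rows as-is.
import_layer = list(csv.DictReader(io.StringIO(blob_csv)))

# Raw layer: assign a version number per notification (latest = highest).
raw_layer, counters = [], {}
for row in import_layer:
    key = row["notification_id"]
    counters[key] = counters.get(key, 0) + 1
    raw_layer.append({**row, "version": counters[key]})

# Reporting layer: keep only the latest version of each notification.
latest = {r["notification_id"]: r for r in raw_layer}  # later rows overwrite
reporting_layer = sorted(latest.values(), key=lambda r: r["notification_id"])
```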
Project 3: Fuel & Energy Consumption Reporting at Plants
Senior ETL Developer
Credit Suisse, Zürich through Atyeti Inc
Employment Type: Contract
Role: ETL Developer
Project: Trade & Transaction Regulatory Reporting (TCIS/TAPI)
MIFIR/EMIR Transaction Regulatory Reporting to various LCAs.
Project Technology Stack
Source System: XML Files, Flat Files, Oracle
Target System: Oracle 19c, XML, CSV
ETL Tool: Informatica PowerCenter 10.2
Other programming languages: Oracle SQL & PLSQL, Unix Shell Scripting
Scheduling Tool: Control-M
Project Details:
TAPI (Trading & Product Information), also known as TCIS (Trading & Sales Controlling & Information Systems), is a central end-of-day (EOD) repository holding Credit Suisse's transactions, positions, cash, stock, accruals, and valuation data. The data is fed from all front-office and back-office systems holding Swiss trades or positions. Reference data is taken from other strategic applications such as GRD Product, GRD Customer, MDS, etc. The repository supplies consistent and reconciled data for legal, regulatory, and management reporting.
Responsibilities
- Design, Develop and Maintain Informatica ETL/Data pipelines.
- Analysis of JIRA tickets to identify issues and provide ETL/SQL solutions.
- Development and Support of Enterprise Data Warehouse in loading history data using SCD Type 1 and Type 2 mappings.
- Development of database objects like procedures, functions, views, etc. for data loading support.
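The SCD Type 2 history-loading pattern mentioned above can be sketched in a few lines of Python: when a tracked attribute changes, the current dimension row is closed and a new open-ended row is inserted. The column names (key, attr, valid_from, valid_to) are illustrative assumptions, not the warehouse's actual schema.

```python
from datetime import date

# Sketch of an SCD Type 2 merge: on an attribute change, close the current
# dimension row (set valid_to) and insert a new open row (valid_to=None).

def scd2_merge(dim_rows, incoming, today):
    """dim_rows: dicts with key, attr, valid_from, valid_to (None = open)."""
    current = {r["key"]: r for r in dim_rows if r["valid_to"] is None}
    for rec in incoming:
        cur = current.get(rec["key"])
        if cur is None:
            # Brand-new key: insert first version.
            dim_rows.append({**rec, "valid_from": today, "valid_to": None})
        elif cur["attr"] != rec["attr"]:
            cur["valid_to"] = today  # close the old version
            dim_rows.append({**rec, "valid_from": today, "valid_to": None})
    return dim_rows

dim = [{"key": "A", "attr": "x", "valid_from": date(2020, 1, 1), "valid_to": None}]
dim = scd2_merge(dim, [{"key": "A", "attr": "y"}], date(2021, 6, 1))
```

An SCD Type 1 load is the degenerate case: overwrite `attr` in place instead of closing and inserting.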
ETL Developer
Lipsia Digital GmbH
Contract Type: Freelance
Role: ETL Developer
Project: Procurify Integration with DATEV
Read all bill details, including purchase orders, approvals, and attachments, from Procurify, a cloud-based procurement management system, and send them to Flowwer2, the target system for the Procurify DATEV Connector. Flowwer2 is a DATEV-approved tool that can connect to a specific DATEV client and send structured data as well as attachments to DATEV.
Flowwer2 is used to receive and send invoice data and related attachments (invoice.pdf, po.pdf, shipping slip.pdf, and approval log.pdf) to DATEV.
Project Technology Stack
Source System: REST API
Target System: PostgreSQL 10.7, REST API
ETL Tool: Talend Open Studio 7.2
Other programming languages: SQL, Unix Shell Scripting
Scheduling Tool: cron (crontab)
Other tools: GitHub, JIRA, Confluence, Postman, PuTTY, WinSCP, etc.
Responsibilities
- Architect, Design, Develop and Maintain Talend jobs.
- Bug Fixing, Deployment, Production Support, Data Analysis
- Read data from REST APIs via components like tRESTClient, tXMLMap, tMap, tPostgresqlInput, etc., and load it into the staging layer in a PostgreSQL database using components like tFilterRow, tSortRow, and tPostgresqlOutput.
- Perform various cleansing and data completeness checks.
- Analysis of JIRA tickets to identify issues and provide ETL/SQL solutions.
- Used various components like tAggregateRow, tFileFetch, tFileInputDelimited, tFileOutputDelimited, tFileInputRaw, tJava, tJavaRow, tS3Connection, tS3Put, tFlowToIterate, etc.
- Used Talend ESB components like tRESTClient to interact with RESTful web service providers by sending HTTP & HTTPS requests and processing responses received in JSON or XML.
- Created PDF files from the delivery area using the tFileOutputPDF2 component from Talend Exchange.
- Created incremental loading components using tSetGlobalVar. Used the tContextLoad component to load context variables for generic configuration.
- Used tFileFetch component to download PDF documents from URLs.
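The incremental-loading pattern above (a stored watermark, as implemented with tSetGlobalVar/tContextLoad in Talend) can be sketched in plain Python. The use of a numeric `id` as the watermark column is an illustrative assumption; a timestamp works the same way.

```python
# Sketch of watermark-based incremental loading: extract only rows newer
# than the last stored watermark, then advance the watermark.

def incremental_extract(rows, watermark):
    """Return rows with id greater than the watermark, plus the new watermark."""
    new_rows = [r for r in rows if r["id"] > watermark]
    new_watermark = max((r["id"] for r in new_rows), default=watermark)
    return new_rows, new_watermark

source = [{"id": 1}, {"id": 2}, {"id": 3}]
batch1, wm = incremental_extract(source, 0)   # first run: loads ids 1..3
source.append({"id": 4})
batch2, wm = incremental_extract(source, wm)  # next run: loads only id 4
```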
Deutsche Boerse, Frankfurt am Main through Marlin Green Ltd
Contract Type: Freelance
Role: ETL Developer
Project: Regulatory Reporting Hub (RRH)
MIFIR/EMIR Transaction Regulatory Reporting to NCAs e.g. BaFin, AMF, etc.
Project Technology Stack
Source System: XML Files, Flat Files, Oracle
Target System: Oracle, XML, CSV
ETL Tool: Informatica Powercenter 10.2
Other programming languages: Oracle SQL & PLSQL, Unix Shell Scripting
Scheduling Tool: Control-M
Responsibilities
- Design, Develop and Maintain Informatica ETL/Data pipelines
- Performance tuning of ETL pipelines for faster loading in various environments
- Bug Fixing, Deployment, Production Support, Data Analysis
- Read data from XML & Flat files to load into staging, core layer and further to Delivery Area in Oracle database.
- Perform various cleansing and data completeness checks
- Enrich data from various reference/lookup tables and load into core layer
- Used various transformations like XML Source Qualifier, XML Parser, XML Generator, Transaction Control, Normalizer, Lookup, Update Strategy, etc.
- Performance optimization of Informatica mappings and sessions for faster loads.
- Developed SCD Type 1 and Type 2 mappings to load history data into the data mart.
Commerzbank, Frankfurt am Main through JOB AG Source One GmbH
Contract Type: Freelance
Role: ETL Developer
Project: Compliance (CMC & CAF) - AML Reporting - Frankfurt & Singapore
This was a data integration project that involved providing data from various banking applications, such as Murex Cash, Murex Equity, and Murex Currency, for compliance reporting.
Project Technology Stack
Source System: Flat Files, MS SQL Server
Target System: Oracle, Flat Files, Hadoop HDFS
ETL Tool: Informatica Powercenter 10.1, Informatica BDM
Other programming languages: Oracle SQL & PLSQL, Unix Shell Scripting, UC4 Scripting
Scheduling Tool: Automic UC4
Responsibilities:
- Design ETL Pipelines and ETL Architecture.
- Design Informatica ETL jobs as per the quality and software development standards.
- Source to target data mapping analysis and design.
- Analyze, design, develop, test and document Informatica ETL programs from detailed and high-level specifications, and assist in troubleshooting.
- Creation of project-related documents like HLD, LLD, etc.
- Created reusable transformations and mapplets
- Developed data ETL pipelines for Change Data Capture (CDC)
- Creation of data pipelines to load into Hadoop HDFS.
- Complex Informatica Powercenter ETL development and Quality Assurance.
- Design and develop various slowly changing dimension loads, e.g. Type 1, Type 2, and Type 3.
- Responsible for finding bottlenecks and performance tuning at the mapping, session, and database levels.
- Extensive use of various active and passive transformations like Filter, Router, Expression, Source Qualifier, Joiner, Lookup, Update Strategy, Sequence Generator, Rank, and Aggregator.
- Debugging and troubleshooting sessions using the Informatica Debugger and Workflow Monitor.
- Implement various loads like daily, weekly, and quarterly loads.
- Conduct unit tests, integration tests, performance tests, etc.
- Contact point for problems in the production environment and defect tracking with the business (3rd-level support).
- Supported the deployment team in deployments to various environments.
- Developed database objects including tables, indexes, views, sequences, packages, triggers, and procedures, and troubleshot database problems.
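One common way to realize the Change Data Capture (CDC) pipelines listed above is snapshot comparison: classify incoming rows as inserts or updates against the current target state. This is a generic sketch, not the project's Informatica implementation, and the `id`/`v` column names are illustrative.

```python
# Sketch of CDC by snapshot comparison: compare source rows against the
# target and classify them as inserts (new key) or updates (changed row).

def capture_changes(source_rows, target_rows, key="id"):
    """Return (inserts, updates) of source_rows relative to target_rows."""
    target = {r[key]: r for r in target_rows}
    inserts, updates = [], []
    for row in source_rows:
        existing = target.get(row[key])
        if existing is None:
            inserts.append(row)          # key not yet in target
        elif existing != row:
            updates.append(row)          # key exists but content changed
    return inserts, updates

src = [{"id": 1, "v": "a"}, {"id": 2, "v": "b2"}, {"id": 3, "v": "c"}]
tgt = [{"id": 1, "v": "a"}, {"id": 2, "v": "b"}]
inserts, updates = capture_changes(src, tgt)
```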
Aldi Sued, Mülheim an der Ruhr through Templeton & Partners Ltd
Contract Type: Freelance
Role: ETL Tech Lead
Project: Retail Enterprise Data Warehouse
Project Technology Stack
Source System: MS SQL Server, Flat Files, Oracle
Target System: Oracle Exadata
ETL Tool: Informatica Powercenter 10.1
Other programming languages: Oracle SQL & PLSQL, Unix Shell Scripting
Scheduling Tool: Informatica Scheduler
Project Methodology: Scrum/Agile
Responsibilities
- Participate in scoping, data quality analysis, source system data analysis, target system requirements, volume analysis and migration window determination.
- Implement various loads like Daily Loads, Weekly Loads, and Quarterly Loads.
- Perform data cleansing tasks.
- Perform tests using sample test data in accordance with the client's data migration/integration needs.
- Contact point for problems in the Production environment and Defects Tracking with business. (3rd-Level-Support)
- Helped Business Analyst in refining mapping specification documents.
- Developed Informatica Powercenter mappings to move data from stage to target tables
- Developed PL/SQL Packages, Procedures and Functions in accordance with Business Requirements.
- Documented various input databases and data sources.
- Debugging and troubleshooting Sessions using the Informatica Debugger and Workflow Monitor.
- Complex ETL development and Quality Assurance.
- Responsible for finding various bottlenecks and performance tuning at various levels like database, ETL, etc.
- Created Materialized Views and partitioning tables for performance reasons.
- Worked on various back end Procedures and Functions using PL/SQL.
- Developed UNIX shell scripts to implement various user requirements.
- Designed Tables, Constraints, Views, Indexes, etc.
- Developed database objects including tables, indexes, views, sequences, packages, triggers, and procedures, and troubleshot database problems.
- Tuned complex Stored Procedures for faster execution
- Responsible for analyzing and implementing Change Requests.
- Handled changes to jobs and scripts in line with database changes.
HRS, Köln through Informationsfabrik GmbH
Job Type: Freelancer
Role: Senior ETL Consultant
Project: Hotel Enterprise Data Warehouse
Project Technology Stack
Source System: MS SQL Server, Flat Files, Oracle, XML
Target System: Sybase IQ
ETL Tool: Informatica Powercenter 9.5
Other programming languages: Oracle SQL & PLSQL, Unix Shell Scripting
Scheduling Tool: Control-M
Project Methodology: Waterfall
Data Modeling: Data Vault
Karstadt, Essen through IBM Deutschland GmbH through Questax Heidelberg GmbH
Job Type: Freelancer
Role: ETL Tech Lead
Project:
Karstadt information systems for measures and analytics (KARISMA)
The goal of this project was to create a centralized analytical and reporting system for Karstadt Warenhaus GmbH. The major part of the project was to replace the existing SAP BW reporting system and build a new enterprise data warehouse, with Informatica PowerCenter 9.5.1 for ETL and Cognos 10 for reporting. Informatica PowerExchange 9.5.1 with BCI (Business Content Integration) and data integration via ABAP methods were used to connect to the Karstadt SAP Retail system and read data from SAP standard and customized DataSources. IBM Netezza 7 was used as the target system, with Informatica PowerExchange for Netezza.
Project Technology Stack
Source System: SAP, IDOC, Flat Files, XML
Target System: IBM Netezza
ETL Tool: Informatica Powercenter 9.5, Informatica Powerexchange 9.5
Other programming languages: Oracle SQL & PLSQL, Unix Shell Scripting
Scheduling Tool: Informatica Scheduler
Project Methodology: Waterfall
Deutsche Boerse, Frankfurt through Javaji Softech GmbH & Co. KG
Job Type: Freelancer
Role: Senior ETL Consultant
Project: Data Integration
Deutsche Bank, Frankfurt through Datamatics Global Solutions GmbH/DXC GmbH (Formerly CSC Deutschland) and Hays AG
Job Type: Employee & Freelancer
Role: Senior ETL Consultant
Informatica Powercenter ETL Tool Development & Support
for Data Migration and Data Integration Projects.
Projects:
#1 Retail Banking - Postbank Savings Deposit Accounts Migration
#2 Retail Banking - Postbank Savings Deposit Accounts Integration
#3 Retail Banking - Auto Deployment
#4 Retail Banking - LDAP Integration
American Home Mortgage Servicing Inc, Texas through Hitachi Consulting Pvt Ltd, Pune
Job Type: Employee
Role: Senior ETL Consultant
Project: Enterprise Data Warehouse
Sigma Systems, Pune
Job Type: Employee
Role: Software Engineer
Oracle, Unix, Java Development & Support
Bachelor of Engineering in Computer Science
Pune University, Pune, India
CERTIFICATIONS
March 2014
Informatica PowerCenter 9.x Certified Professional
Certificate No. 004-000384
Percentile: 71%
May 2015
TOGAF 9.1 (Enterprise Architecture Framework) Certified Professional from The Open Group
Certification ID. 96457
Percentile: 75%
July 2017
International Knowledge Measurement
Certificate: Informatica PowerCenter
Percentile: 93%
ETL Tools
Informatica PowerCenter, Informatica Big Data Management,
Informatica Power Exchange, Talend
Databases
Oracle 12c, Oracle Exadata 12c, Microsoft SQL Server 2016, Hadoop HDFS, XML
Big Data Technologies/Ecosystem
Cloudera, HDFS, YARN, MapReduce, Hive, Pig, HBase, Oozie, Flume and Sqoop
Modeling
Star & Snowflake Schema, 3-NF, Data Modeling, Dimensional Modeling, Data Vault
Modeling Tools
PowerDesigner, Informatica Mapping Architect for Visio
Software Development Methods
Agile, SCRUM, Waterfall
Programming Languages
Core Java, SQL, T-SQL, PL/SQL, UNIX/Bash shell scripting
Scheduler
BMC Control-M, Automic UC4, Informatica Scheduler
Version Control
Informatica Version Control, Subversion, Tortoise SVN, GitHub
Other Tools - Atlassian Jira, Atlassian Confluence, GitHub, Hue, Eclipse, Toad, PL/SQL Developer, FTP, SFTP, WinSCP, FileZilla, PuTTY, HP Quality Center, Aginity Workbench for IBM Netezza
UNIX (Sun Solaris, Linux), Windows 7/Vista/XP
ETL Design, ETL Development using Informatica PowerCenter
Banking & Finance, Telecom