- Building a corporate data model and centralized analytics platform within a hybrid mesh architecture to ensure scalable and efficient data integration.
- Implementing requirements using the Scrum methodology to ensure iterative and collaborative development.
- Developing transformation pipelines, a comprehensive assertion concept, and an incremental load process for the business layer to streamline data processing, strengthen data validation and integrity, and shorten refresh times (a sketch follows this list).
- Driving automation initiatives using GitHub Actions to streamline deployment processes and ensure continuous integration and delivery.
- Providing support on architectural topics and training new consultants to ensure high-quality technical solutions.
- Engaging in data modeling activities to create a robust and flexible data structure.
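
A minimal sketch of how such an incremental load with a basic assertion can be structured, shown here with sqlite3 as a stand-in engine; the tables (staging_orders, business_orders), the columns, and the duplicate-key check are illustrative assumptions rather than the project's actual implementation:

    # Incremental load into the business layer plus a simple assertion,
    # using sqlite3 as a stand-in engine. All object names are hypothetical.
    import sqlite3

    def incremental_load(conn: sqlite3.Connection) -> int:
        cur = conn.cursor()
        # High-water mark: the latest change already present in the business layer.
        cur.execute("SELECT COALESCE(MAX(updated_at), '1900-01-01') FROM business_orders")
        watermark = cur.fetchone()[0]
        # Load only records that changed since the last run.
        cur.execute(
            """
            INSERT INTO business_orders (order_id, amount, updated_at)
            SELECT order_id, amount, updated_at
            FROM staging_orders
            WHERE updated_at > ?
            """,
            (watermark,),
        )
        loaded = cur.rowcount
        # Assertion: the business layer must not contain duplicate order keys.
        cur.execute("SELECT order_id FROM business_orders GROUP BY order_id HAVING COUNT(*) > 1")
        duplicates = cur.fetchall()
        if duplicates:
            conn.rollback()
            raise AssertionError(f"Duplicate order_id values: {duplicates[:5]}")
        conn.commit()
        return loaded

    if __name__ == "__main__":
        conn = sqlite3.connect(":memory:")
        conn.executescript("""
            CREATE TABLE staging_orders  (order_id INTEGER, amount REAL, updated_at TEXT);
            CREATE TABLE business_orders (order_id INTEGER, amount REAL, updated_at TEXT);
            INSERT INTO staging_orders VALUES (1, 99.5, '2024-01-02'), (2, 10.0, '2024-01-03');
        """)
        print("rows loaded:", incremental_load(conn))

Restricting each run to records newer than the high-water mark is what keeps refresh times short; the assertion rolls back the load if the business layer would end up with duplicate keys.
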
- Conducting requirement analysis and organizing workshops to define the high-level concept for the data warehouse to support contribution margin and sales reporting.
- Identifying the required source tables and ensuring accurate data extraction for the reporting processes.
- Integrating source data from SAP using Theobald Xtract Universal, as well as from Excel and SQL Server databases using PDI (Pentaho Data Integration).
- Setting up the Snowflake infrastructure, including roles, warehouses, and resource monitors, to ensure a scalable and efficient data environment (a sketch follows this list).
- Implementing transformation processes within Snowflake to enable seamless data flows and support complex reporting requirements.
- Continuously enhancing and evolving the DWH and providing customer training to ensure the system meets the customer's ongoing business needs and is used effectively.
- Creating a comprehensive operations manual to guide ongoing management and maintenance.
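
A sketch of what scripting this Snowflake setup can look like with the snowflake-connector-python package; the role, warehouse, and monitor names, the credit quota, and the use of ACCOUNTADMIN are illustrative assumptions, and credentials are expected in environment variables:

    # Illustrative Snowflake environment bootstrap: role, warehouse, resource monitor.
    import os
    import snowflake.connector

    DDL = [
        "CREATE ROLE IF NOT EXISTS DWH_TRANSFORM_ROLE",
        """CREATE WAREHOUSE IF NOT EXISTS DWH_ETL_WH
               WAREHOUSE_SIZE = 'XSMALL'
               AUTO_SUSPEND = 60
               AUTO_RESUME = TRUE""",
        """CREATE RESOURCE MONITOR IF NOT EXISTS DWH_ETL_MONITOR
               WITH CREDIT_QUOTA = 100
               TRIGGERS ON 90 PERCENT DO NOTIFY
                        ON 100 PERCENT DO SUSPEND""",
        "ALTER WAREHOUSE DWH_ETL_WH SET RESOURCE_MONITOR = DWH_ETL_MONITOR",
        "GRANT USAGE ON WAREHOUSE DWH_ETL_WH TO ROLE DWH_TRANSFORM_ROLE",
    ]

    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        role="ACCOUNTADMIN",  # account-level role needed for resource monitors
    )
    try:
        cur = conn.cursor()
        for statement in DDL:
            cur.execute(statement)
    finally:
        conn.close()

Pairing a short AUTO_SUSPEND with a resource monitor keeps the warehouse scalable while capping credit consumption.
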
- Developing an enterprise-wide data warehouse to enable cross-functional reporting across Sales, Finance, and Logistics.
- Enhancing existing modules and implementing the Logistics module to extend analytical capabilities.
- Integrating SAP source data using Theobald Xtract Universal and PDI to ensure seamless data extraction and processing.
- Transforming and modeling data from the staging layer to the data mart using dbt to support efficient and reliable reporting structures (a sketch follows this list).
- Expanding existing Power BI datasets to provide broader insights and meet evolving business requirements.
- Delivering knowledge transfer and training sessions for internal consultants to ensure effective project continuity.
- Providing ongoing support and ensuring the stability and usability of the implemented solution.
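
A small sketch of how the staging-to-mart dbt run and its tests can be orchestrated from Python; the marts selector and the project directory are hypothetical, and only the standard dbt CLI commands (dbt run, dbt test) are assumed:

    # Illustrative wrapper around the dbt CLI: build the mart models, then test them.
    import subprocess
    import sys

    DBT_PROJECT_DIR = "./dwh_dbt_project"  # assumed project location

    def run(cmd: list[str]) -> None:
        print("+", " ".join(cmd))
        result = subprocess.run(cmd, cwd=DBT_PROJECT_DIR)
        if result.returncode != 0:
            sys.exit(result.returncode)

    run(["dbt", "run", "--select", "+marts"])   # mart models plus their upstream staging models
    run(["dbt", "test", "--select", "marts"])   # schema and data tests on the marts
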