Snowflake, AWS, energy trading (ETRM)
Updated on 16.12.2025
Profile
Freelancer / self-employed
Remote work
Available from: 16.12.2025
Availability: 100%
of which on-site: 100%
Snowflake
AWS
ETRM

Work locations

Germany, Switzerland, Austria (possible)

Projects

2021 - ongoing: Automate weekly single-view report on the status of all NBI operations


Role: AWS Data Architect

Customer: National Broadband Ireland, Dublin, Ireland (100% remote)


Tasks:

Use AWS Athena to report metrics from all systems supporting the design, build, and operation of the NBI network, in addition to ERP data such as finance and HR, and Salesforce data for marketing. This involved...

  • Reverse-engineered the logic in the snapshots copied daily to the data lake for the design, build, and operation of the network, using Athena.
  • Built semi-automated interfaces to ERP and marketing systems.
  • Reconciled and explained the numbers weekly with all stakeholders.
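A weekly reconciliation of the kind described above can be sketched as a simple metric-by-metric comparison between two source systems. This is a minimal illustration only; the metric names, values, and tolerance below are hypothetical, not NBI's actual figures.

```python
# Hypothetical sketch of a weekly metric reconciliation between two source
# systems; metric names, values, and the tolerance are illustrative only.

def reconcile(source_a: dict, source_b: dict, tolerance: float = 0.01) -> dict:
    """Return metrics whose relative difference exceeds the tolerance."""
    discrepancies = {}
    for metric in source_a.keys() & source_b.keys():
        a, b = source_a[metric], source_b[metric]
        if a == b == 0:
            continue
        rel_diff = abs(a - b) / max(abs(a), abs(b))
        if rel_diff > tolerance:
            discrepancies[metric] = (a, b, rel_diff)
    return discrepancies

# Example: premises passed vs. connected, as reported by two systems.
design = {"premises_passed": 10_000, "premises_connected": 4_000}
billing = {"premises_passed": 10_050, "premises_connected": 3_600}

issues = reconcile(design, billing)
# premises_connected differs by 10% -> flagged; premises_passed (~0.5%) is not.
```

In practice the comparisons ran as Athena SQL over the data-lake snapshots, but the reconciliation logic is the same: flag only the metrics whose systems disagree beyond an agreed tolerance, then walk stakeholders through the discrepancies.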


Skills:

Systems: AWS, SQL, S3, Athena and Tableau


2023 - 2024: Localise global Data Vault implementation; consolidate the merged company onto Snowflake/Databricks


Role: Snowflake Data Engineer

Customer: Ayvens (formerly LeasePlan, taken over by ALD), Almere and Hoofddorp, Netherlands (hybrid)


Tasks:

Initially brought on to localize a global Data Vault 2.0 implementation. LeasePlan was a front-runner and needed to consume certain data available only from the global platform, but lacked trust in the data quality and the Data Vault modeling of the implementation. Worked closely with a Data Vault expert and delivered the localized solution in three months. Based on this successful delivery, I was retained to build the Leaselink and Robnet implementations on the Snowflake platform to combine the data of ALD and LeasePlan. The global Snowflake project was eventually canceled, and the NL branch decided to move to Databricks as its analytics platform; I then migrated the Robnet implementation to Databricks. This involved...

  • Localized the global Data Vault 2.0 model for the NL and created the data mart with business logic in Snowflake.
  • Developed a Snowflake pipeline using Snowpark UDTFs to retrieve raw data from a REST API for Leaselink. Leaselink is a platform for purchasing and delivering cars. The data is used for tracking the KPIs of vendors.
  • Created a PoC in Snowflake for Robnet, replicating and reconciling the existing Oracle solution. This included flattening a JSON document containing the end-to-end processing details of all repair, maintenance, and tire jobs, from request and approval through to invoicing.
  • The PoC was transitioned into a native Data Vault 2.0 model in Databricks, including replicating the reporting previously done in Oracle. The data is used for tracking KPIs of approvals and garages as well as bonus targets with tire vendors.
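The JSON flattening mentioned above can be sketched in pure Python as the analogue of Snowflake's LATERAL FLATTEN: one nested request document becomes one flat row per job step. The document structure and field names below are hypothetical, not Robnet's actual schema.

```python
# Pure-Python sketch of the JSON flattening described above -- the analogue of
# Snowflake's LATERAL FLATTEN. Document structure and field names are
# hypothetical, not the actual Robnet schema.

def flatten_jobs(doc: dict) -> list[dict]:
    """Turn one nested request document into one flat row per job step."""
    rows = []
    for job in doc.get("jobs", []):
        for step in job.get("steps", []):
            rows.append({
                "request_id": doc["request_id"],
                "job_type": job["type"],      # e.g. repair, maintenance, tires
                "step": step["name"],         # request, approval, invoice
                "timestamp": step["timestamp"],
            })
    return rows

doc = {
    "request_id": "R-1001",
    "jobs": [
        {"type": "tires", "steps": [
            {"name": "request", "timestamp": "2024-01-05T09:00"},
            {"name": "approval", "timestamp": "2024-01-05T11:30"},
            {"name": "invoice", "timestamp": "2024-01-12T08:15"},
        ]},
    ],
}

rows = flatten_jobs(doc)  # three flat rows, ready to load into a table
```

Flattened rows like these are what the PoC reconciled against the Oracle solution and what the later Data Vault 2.0 model in Databricks consumed.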


Skills:

Systems: Snowflake, Snowpark (Python), Databricks, Data Vault 2.0, GitLab, SQL


2022 - 2023: Analyze and mitigate impact of mass-migration of fixed customers


Role: Sr. AWS Data Analyst

Customer: VodafoneZiggo, Utrecht, Netherlands (Hybrid)


Tasks:

Use AWS Athena and Glue to analyze the impact on all data sources consumed by AAP prior to the mass migration of fixed customers from the fixed stack (Ziggo) to the mobile stack (Vodafone). This involved...

  • Analyzed the impact on over 100 data sources ingested by the AAP platform.
  • Redesigned and tested the new billing-topology data pipeline.


Skills:

Systems: AWS, SQL, pySpark, S3, Athena and Glue


2021 - 2021: Create point-in-time data for app activity using SQL in Snowflake


Role: Snowflake Data Engineer

Customer: Speakap.com, Amsterdam, NL (100% remote)


Tasks:

Use Snowflake to create history and usage profiles for the platform's users, in order to gain more insight into how users use the apps and how app usage grows over time. This involved...

  • Reverse-engineered the usage of the app from the database log that is fed into Snowflake via Kinesis.
  • Identified gaps in the architecture (at the data level) that could improve query times and hence throughput in Snowflake.
  • Produced around 40 near-real-time usage metrics shown to customers, and fed them to Postgres for direct consumption by their BI tool, which embeds the metrics in the app.
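One of the roughly 40 usage metrics described above can be sketched as follows: daily active users derived from raw event-log records. In production this was SQL over the Kinesis-fed log tables in Snowflake; the record fields here are hypothetical.

```python
# Sketch of one usage metric of the kind described above: daily active users
# derived from raw event-log records. Record fields are hypothetical; in
# production this was SQL over the Kinesis-fed log tables in Snowflake.
from collections import defaultdict

def daily_active_users(events: list[dict]) -> dict:
    """Count distinct users per day from (user_id, timestamp) event records."""
    users_per_day = defaultdict(set)
    for event in events:
        day = event["timestamp"][:10]       # ISO date prefix, e.g. "2021-06-01"
        users_per_day[day].add(event["user_id"])
    return {day: len(users) for day, users in users_per_day.items()}

events = [
    {"user_id": "u1", "timestamp": "2021-06-01T08:00:00"},
    {"user_id": "u2", "timestamp": "2021-06-01T09:30:00"},
    {"user_id": "u1", "timestamp": "2021-06-01T17:45:00"},  # same user, same day
    {"user_id": "u1", "timestamp": "2021-06-02T10:00:00"},
]

dau = daily_active_users(events)  # {"2021-06-01": 2, "2021-06-02": 1}
```

Metrics computed this way were materialized in Snowflake and pushed to Postgres for the embedded BI tool.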


Skills:

Systems: Snowflake, SQL, S3 (MariaDB, Kafka, Node.js, Postgres and BI application)


2020 - 2020: Testing of BO suite prior to go-live of upgrade


Role: Test Analyst (BO)

Customer: MVV Energie, Mannheim, DE


Tasks:

Scope of work: test the full functionality of Aligne BO in order to go live with the upgraded version. This involved...

  • Worked together with BO key users to reconcile the new version running in parallel with production.
  • Identified gaps in trade capture between the production and parallel versions.
  • Enabled them to go live as planned during the lockdown.


Skills:

Systems: Aligne | Markets: DE power, coal, gas and emissions.


2017 - 2018: Front office transition for ISEM


Role: Business Analyst

Customer: ESB, Dublin, Ireland


Tasks:

Scope of work to implement and enable the front office to operate in ISEM. This involved...

  • Designed a tool to capture and manage the capacity auction results.
  • Redesigned the tool to capture new CFD contracts in ISEM.
  • Based on an initial solution from Baringa, designed a tool to control and manage pumped-storage assets (Turlough Hill Power Station). Prior to ISEM the assets were operated by EirGrid; in ISEM, ESB would control the assets directly. Using Excel as the front end and nMarket as the back end, the tool incorporates all physical constraints for operating the four pumped-storage turbines and facilitates the optimization and control of the assets for the dispatcher.
  • Reviewed the design for the Aligne implementation.


Skills:

Systems: Aligne, ABB nMarket, Bid Manager, In-house development | Markets: ISEM


2016 - 2016: Implement futures portfolio for power, EUA and coal


Role: Aligne Consultant

Customer: AET, Bellinzona, CH


Tasks:

Scope of work: implement futures trading and enable the organization to trade futures. Prior to the implementation, AET did not trade any futures. In addition to the futures implementation, we also ensured that all physical power processes (EPEX Spot) would not be impacted. This involved...

  • Created cascading process for power futures
  • Validate valuations and settlement for futures
  • Create margin report for back office and risk
  • Modify all reports for front office, risk, back office and scheduling


Skills:

Systems: Aligne | Exchanges: EEX, EPEX spot and ICE


2015 - 2015: Upgrade from Zainet to Aligne


Role: Zainet Analyst

Customer: E.On, Coventry, UK


Tasks:

Scope of work: regression testing of the new version of Aligne. This involved...

  • Regression testing of all Aligne functionality.


Skills:

Systems: Zainet/Aligne


2014 - 2015: Proof of concept


Role: Zainet reporting architect

Customer: E.On, Coventry, UK


Tasks:

Scope of work: proof of concept for PnL reporting. This involved...

  • Successfully completed a proof of concept for PnL reporting of a large power portfolio.
  • The results informed decisions on future PnL reporting; in the end, another solution was selected.


Skills:

Systems: Zainet


2013 - 2014: Migrate power and gas futures portfolio


Role: Aligne Consultant

Customer: Shell Trading, London, UK


Tasks:

Scope of work: migrate the power and gas futures portfolio from the physical trade type to futures. This involved...

  • Create migration and reconciliation batches.
  • Validate valuations for futures.
  • Create an automatic report driven solution for splitting, cascading and physicalization of futures.
  • Modify all reports: front-office power/gas, risk, finance, derivops and gas operations.


Skills:

Systems: Aligne, STP (in-house) | Exchanges: EEX, Endex, ICE, OMIP and Powernext


More projects on request

Education and training

1992

Master of Science, Mechanical Engineering, Norwegian Institute of Technology, Trondheim, Norway


AWS CERTIFICATIONS (2019)

  • AWS Certified Solutions Architect - Associate (expired)
  • AWS Certified Cloud Practitioner (Expired)

Competencies

Top skills

Snowflake, AWS, ETRM

Products / standards / experience / methods

Profile

  • He is a senior data engineer specializing in AWS and Snowflake, with a strong track record of delivering data engineering, reporting, and analytics solutions across the energy, telecom, finance, and media sectors. He leverages deep experience in Energy Trading and Risk Management (ETRM) software to build scalable and robust analytics and data warehouse systems.
  • He excels both independently and as part of a team, with proven strengths in solution design, development, and testing. He has the ability to engage with high-level stakeholders and effectively coordinate tactical development teams.


KEY SKILLS

  • Liaison between the user community and the application developers
  • Able to define clear business requirements and to contribute to the implementation of the technical solution.
  • Provide in-depth analysis and come up with innovative solutions to complex problems.
  • Perform detailed business modeling and functional system design
  • Ability to take ownership of project/change deliverables from requirements definition through to testing and deployment
  • Diligent and dependable with strong integrity


EXPERIENCE SUMMARY

2023 - 2024

Role: Snowflake Data Engineer

Customer: Ayvens 


2022 - 2023

Role: Sr. AWS Data Analyst

Customer: VodafoneZiggo 


2021 - 2022

Role: AWS Data Architect

Customer: National Broadband Ireland 


2021 - 2021

Role: Snowflake Data Engineer

Customer: Speakap.com 


2020 - 2020

Role: Test analyst

Customer: MVV Energie 


2017 - 2018

Role: Business Analyst

Customer: ESB 


2016 - 2016

Role: Aligne Consultant

Customer: AET 


2015 - 2015

Role: Zainet Analyst

Customer: E.On 


2014 - 2015

Role: Zainet Reporting Architect

Customer: E.On 


2013 - 2014

Role: Aligne Consultant

Customer: Shell Trading 


2012 - 2013

Role: Implementation consultant

Customer: Enovos 


2012 - 2012

Role: Business Analyst

Customer: Noble Group 


2010 - 2011

Role: Business Analyst

Customer: Eneco Energy Trade 


2008 - 2009

Role: Business Analyst

Customer: Eneco Energy Trade 


2004 - 2007

Role: Team-leader

Customer: Delta Energy 


2002 - 2002

Role: Senior Support Analyst

Customer: Barclays Capital 


1999 - 2002

Role: Director of Quality Assurance, Europe

Customer: Caminus 


1996 - 1998

Role: Technical Manager

Customer: Zainet 


SIDE PROJECT: initially AWS (2017-2021), later Snowflake (from 2021) and GCP (from 2025)

  • I designed and built a fully automated, end-to-end real-time analytics and trading infrastructure leveraging AWS services including S3, Lambda, Aurora DB, SageMaker (Prophet, scikit-learn, and DeepAR+), SNS (SMS), and QuickSight. This architecture supports real-time data processing, machine-learning inference, visualization, and downstream alerting, all without any manual intervention.
  • Within this environment, I developed multiple end-of-day and intraday systematic trading models for U.S. equities, using both alternative and traditional datasets. These models operate on structured and unstructured data and are optimized to generate low-frequency, high-reward-to-risk or near risk-free signals across short- to medium-term horizons. All signals are delivered in real time through a fully serverless AWS pipeline. 
  • In addition to technical development, I recruited and managed a distributed team of three remote Python developers, data scientists, and DevOps specialists. As a team, we built and maintained the data lake architecture, which uses Aurora SQL databases integrated with S3 and Lambda for ingestion and transformation, SNS for downstream messaging, and QuickSight for visualization. NoSQL systems such as MongoDB and DynamoDB were incorporated for time-series data storage, with data pipelines orchestrated via AWS Glue, Data Pipeline, and Athena.
  • All data processing was implemented using Python within a serverless stack (Lambda, SQS, SNS, and CloudWatch), supported by libraries including NumPy, Pandas, scikit-learn, and Facebook Prophet. For machine-learning workloads, we used AWS Machine Learning services, AWS Forecast, and SageMaker to handle classification, regression, forecasting, and clustering workflows in both batch and real-time modes.
  • Additionally, EC2 instances were provisioned and maintained for legacy third-party applications, such as data vendors and brokers, with full configuration of VPC networking. These components were tightly integrated with S3 and Lambda to ensure consistent downstream data processing across the entire system.
  • Later, I migrated the entire project to Snowflake, enabling all data processing to run natively within the platform using SQL and Python. This included leveraging Snowpark with external access for Python workflows powered by Pandas, NumPy, and REST API integrations, as well as Snowflake Notebooks. The solution was organized into dedicated notebooks for system configuration and daily operational runs.
  • A significant portion of the workflow involved processing SEC filings and other complex datasets. This year, I also delivered two custom data-ingestion solutions using Python (Pandas, NumPy, REST APIs) to two family offices, one in the United States and one in Australia, for downloading and managing specialized datasets. The solutions run in notebooks on GCP and Relevance AI.
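A minimal sketch of the kind of end-of-day signal logic such a pipeline delivers is shown below. This is purely illustrative: the moving-average crossover rule and window lengths are hypothetical examples, not the actual trading models.

```python
# Illustrative sketch only: a simple end-of-day moving-average crossover signal
# of the kind a pipeline like the one above could deliver. The rule and window
# lengths are hypothetical and bear no relation to the actual models.

def sma(prices: list[float], window: int) -> float:
    """Simple moving average over the last `window` closes."""
    return sum(prices[-window:]) / window

def eod_signal(closes: list[float], fast: int = 5, slow: int = 20) -> str:
    """Emit BUY when the fast average crosses above the slow one, else HOLD."""
    if len(closes) < slow + 1:
        return "HOLD"                      # not enough history yet
    prev_fast, prev_slow = sma(closes[:-1], fast), sma(closes[:-1], slow)
    cur_fast, cur_slow = sma(closes, fast), sma(closes, slow)
    if prev_fast <= prev_slow and cur_fast > cur_slow:
        return "BUY"
    return "HOLD"

history = [100.0] * 20 + [110.0]
signal = eod_signal(history)   # crossover on the last close -> "BUY"
```

In the architecture described above, logic of this shape would run in Lambda (later Snowpark), with the resulting signals pushed out via SNS in real time.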

Industries

  • Snowflake and AWS - Since 2021, Snowflake and AWS for data analytics/engineering for Speakap, NBI Broadband, VodafoneZiggo and Ayvens.
  • Irish power - Since 2017 he has been involved in the transition from SEM to ISEM with ESB in Ireland.
  • UK power and gas trading organisations - Since 2012 he has been involved with Noble Group, Shell Trading and E.On.
  • European utilities - Since 2004 he has been involved with AET, Delta Energy, Eneco Energy Trade, MVV Energie and Enovos.
  • Software - Started in 1996 in Houston, working on system implementations alongside the founders of Zainet for their North and South American clients. Zainet later became part of Caminus, which was acquired by SunGard in 2003.
