2019 - 2019: Supported analysts with data management and timely data flows
Role: Data & MLOps Engineer
Customer: BRANDAID, Media
Tasks:
Skills:
Python, PySpark, Airflow, MLflow, GitLab, CI/CD, Docker, Terraform, GCP, BigQuery, Bigtable
2018 - 2019: Led migration from on-premise infrastructure to the cloud
Role: Data & Cloud Engineer
Customer: KASSANDRA, Media
Tasks:
Skills:
Python, PySpark, Airflow, Hortonworks, Hive, Bash, GitLab, CI/CD, Docker, Terraform, GCP, BigQuery, Bigtable
2017 - 2018: Trained and supported analysts
Role: Data Engineer
Customer: WLZ PROGNOSE, Media
Tasks:
Skills:
Python, PySpark, Airflow, Hortonworks, Hive, Bash, GitLab, Docker
2017 - 2017: Delivered crawlers for job postings and text mining to structure the collected data
Role: Data Analyst & Engineer
Customer: PROJECT FINDER, Consultancy
Tasks:
Skills:
Python, PySpark, MongoDB, Hortonworks, Hive, Bash, GitLab, Flask
2016 - 2017: Built extraction and processing of platform data
Role: Data Analyst & Engineer
Customer: ID MARKETING, Commercial Marketing
Tasks:
Skills:
Python, PySpark, REST API, MongoDB, Hortonworks, Hive, Bash, GitLab, Gephi
Education
03/2022
M.Sc. Big Data & Business Analytics, FOM Hochschule, Cologne, Germany (discontinued)
09/2015
B.Sc. Business Information Systems, TH Köln, Cologne, Germany (specialized in Lean Big Data)
Professional Education
Present
Databricks Data Engineer Professional, Databricks Academy, Cologne, Germany
Profile
With over 9 years of experience in data, ML, AI and leadership, I combine technical excellence with strategic thinking. I design and implement data platforms and AI systems, cloud architectures and data pipelines, and establish effective data governance approaches to create sustainable business value.
TECHNICAL COMPETENCES
Dev. Platform
Hortonworks Data Platform, Google Cloud Platform, Amazon Web Services
File System
Hadoop Distributed File System, Google Cloud Storage, AWS S3
Dev. Operations
Docker, Kubernetes, GitLab CI/CD, Terraform
Development tools
Visual Studio Code, PyCharm, GitLab
ML Engineering
MLflow, scikit-learn, NLTK
Data Processing
Airflow, Pig, Sqoop
METHODICAL COMPETENCES
Business Administration
Accounting, Finance & Investment, Marketing & Sales, Controlling & Management
Business Processes
Objectives and Key Results (OKR), Business Process Management, Service-oriented Architecture
Agile Development
Scrum, Lean, Design Thinking
STRENGTHS
PROFESSIONAL EXPERIENCE
01/2025 - Present
Role: Software Engineer
Customer: on request, Germany
Tasks:
Developing end-to-end data and AI solutions enabling SMEs and larger organizations to leverage modern AI, including LLMs, for smarter and more efficient decision-making.
Skills:
Data Engineering, LLM Engineering, RAG, MLOps, Data Architecture, Data Governance, Cloud (Azure), Terraform, Docker, API, WebApp
03/2021 - 07/2024
Role: Head of Data Engineering
Customer: RTL TECHNOLOGY GMBH, Germany
Tasks:
Led RTL's data engineering strategy, enabling scalable analytics infrastructure and team capability growth.
Skills:
SQL, GCP, Airflow, Terraform, GitLab, Confluence, Data Mesh, Agile Leadership, Team Management
11/2019 - 02/2021
Role: Data Engineer
Customer: RTL TECHNOLOGY GMBH, Germany
Tasks:
Contributed to the development of automated data solutions and cloud migration.
Skills:
SQL, Python, Airflow, BigQuery, GCP, Cloud Composer, Linux, CI/CD
12/2016 - 10/2019
Role: IT Consultant Data Analytics
Customer: ACT GRUPPE, Germany
Tasks:
Delivered end-to-end analytics and data engineering projects for public and private clients.
Skills:
SQL, Python, MongoDB, Scrapy, Hadoop, Hive, REST APIs, Data Visualization, Gephi