Senior AWS Cloud Data Engineer, Expert Kafka Developer
Updated on 25.06.2024
Profile
Freelancer / Self-employed
Remote work
Available from: 25.06.2024
Availability: 100%
of which on-site: 10%
Kafka
AWS
Python
Kotlin
Azure
DevOps
Git
SQL

Work Locations

Germany, Switzerland, Austria (possible)

Projects

2 years 9 months
2022-01 - present

Developed streaming and batch data pipelines

Senior Data Engineer
  • Developed streaming and batch data pipelines on AWS Fargate with Kotlin, Kafka Streams and Connect, Avro, Confluent Cloud, Lambda, DynamoDB, S3, Docker (a minimal Kafka Streams sketch follows this entry)
  • Developed REST APIs for mobile devices and internal data APIs with OpenAPI, API Gateway, Lambda, DynamoDB
  • Created reports from retail datasets of 2.5B+ records using S3, Glue, and Athena
  • Recreated existing SAP data feeds by converting ABAP to SQL with PostgreSQL, Airflow, Python
Technologies: CDK+TypeScript, Terraform, GitLab CI, Splunk, CloudWatch, PostgreSQL, Airflow, Python, APIs, OpenAPI, API Gateway, Lambda, DynamoDB, Kotlin, Kafka Streams and Connect, Avro, Confluent Cloud, S3, Docker
Largest retail chain
Remote, Finland
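
To illustrate the kind of Kafka Streams pipeline referenced above, here is a minimal Kotlin sketch. The topic names, transformation, and broker address are hypothetical placeholders, not details from the actual project (which used Avro and ran on Fargate against Confluent Cloud).

```kotlin
import org.apache.kafka.common.serialization.Serdes
import org.apache.kafka.streams.KafkaStreams
import org.apache.kafka.streams.StreamsBuilder
import org.apache.kafka.streams.StreamsConfig
import java.util.Properties

fun main() {
    // Hypothetical configuration; the real pipeline used Confluent Cloud brokers.
    val props = Properties().apply {
        put(StreamsConfig.APPLICATION_ID_CONFIG, "retail-pipeline-sketch")
        put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092")
        put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.StringSerde::class.java)
        put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.StringSerde::class.java)
    }

    val builder = StreamsBuilder()
    builder.stream<String, String>("purchases-raw")   // hypothetical input topic
        .filter { _, value -> value.isNotBlank() }    // drop empty records
        .mapValues { value -> value.trim() }          // placeholder transformation
        .to("purchases-clean")                        // hypothetical output topic

    KafkaStreams(builder.build(), props).start()
}
```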
1 year 1 month
2020-12 - 2021-12

HUS Datalake

Senior Data Engineer
  • Developed healthcare data integrations for HUS DataLake on Azure with Kubernetes, Data Lake Storage, Data Factory, Databricks, Scala
  • Built the BM-OR pipeline to send operating room records via HUS DataLake to a third party for analysis
  • Orchestration: Azure Data Factory; transformations: Databricks/Scala and Delta Table, with public key encryption for in-transit data (a sketch of the hybrid encryption pattern follows this entry)

Technologies: Terraform, Ansible, Azure DevOps, Jenkins, Grafana+Prometheus, Kubernetes, Data Lake Storage, Data Factory, Databricks, Scala
TietoEVRY Integration Team
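
As a rough illustration of the in-transit protection mentioned above, the Kotlin sketch below shows one common hybrid pattern: encrypt the payload with a one-time AES key, then wrap that key with the recipient's RSA public key. This is an assumption about the general technique only; the actual BM-OR implementation ran in Databricks/Scala and its key management is not described here.

```kotlin
import java.security.KeyPairGenerator
import javax.crypto.Cipher
import javax.crypto.KeyGenerator

fun main() {
    // Hypothetical standalone demo: the payload is encrypted with a fresh AES key,
    // and that key is wrapped with the recipient's RSA public key so only the
    // recipient can unwrap it.
    val rsaKeys = KeyPairGenerator.getInstance("RSA").apply { initialize(2048) }.generateKeyPair()
    val aesKey = KeyGenerator.getInstance("AES").apply { init(256) }.generateKey()

    // Encrypt the record with AES-GCM; the cipher generates a random IV for us.
    val payload = "operating-room record".toByteArray()
    val dataCipher = Cipher.getInstance("AES/GCM/NoPadding")
    dataCipher.init(Cipher.ENCRYPT_MODE, aesKey)
    val ciphertext = dataCipher.doFinal(payload)

    // Wrap the AES key with the RSA public key (OAEP padding).
    val keyCipher = Cipher.getInstance("RSA/ECB/OAEPWithSHA-256AndMGF1Padding")
    keyCipher.init(Cipher.WRAP_MODE, rsaKeys.public)
    val wrappedKey = keyCipher.wrap(aesKey)

    // Ship ciphertext + IV + wrapped key; the recipient unwraps with their private key.
    println("ciphertext=${ciphertext.size}B iv=${dataCipher.iv.size}B wrappedKey=${wrappedKey.size}B")
}
```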
1 year 4 months
2019-01 - 2020-04

Advance Auto Parts Price Execution Upgrade Project

Integration Architect/Developer
  • Designed and implemented a streaming data platform for customer product, pricing, and related data from enterprise sources to JDA Pricer
  • Modeled data with Apache Avro
  • Designed a complex streaming topology with cross-topic validation and enrichment (a join sketch follows this entry)
  • Implemented streaming pipelines as Kafka Streams Java applications
  • Fully responsible for Phase I of the project, with a scope of over 6,000 retail stores
  • Confluent Certified Developer for Apache Kafka, 5/2020
Remote / US East Coast
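
For context on the cross-topic validation and enrichment mentioned above, here is a minimal Kafka Streams join sketch in Kotlin (the project itself was implemented in Java). Topic names, record shapes, and the validation rule are hypothetical.

```kotlin
import org.apache.kafka.streams.StreamsBuilder

// Hypothetical sketch: enrich incoming price events with reference data from a
// second topic, then gate invalid results. Topic names and formats are invented.
fun buildTopology(builder: StreamsBuilder) {
    // Product reference data materialized as a table, keyed by SKU.
    val products = builder.table<String, String>("products")

    builder.stream<String, String>("price-updates")
        // Enrichment: join each price event against the product table by key.
        .join(products) { price, product -> "$product @ $price" }
        // Validation: drop events whose enrichment produced no usable value.
        .filter { _, enriched -> enriched.isNotBlank() }
        .to("prices-enriched")
}
```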
2 years 11 months
2016-05 - 2019-03

Pricing Tool Upgrade Project

Integration Architect/Developer
  • Fully responsible for data integrations between JDA Pricer, customer enterprise planning tools, and store point-of-sale systems
  • Built a SQL/Python database for inbound data with change data capture derived from full data feeds (a snapshot-diff sketch follows this entry)
  • Developed a price and promotion data pipeline to the store point-of-sale system and an online e-commerce tool
  • Customer was able to execute price changes and promotions in near real time (URL on request)
Guess Jeans U.S.A.
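
The change data capture mentioned above was derived from full feeds; the Kotlin sketch below shows the general snapshot-diff idea under assumed inputs (business keys mapped to record hashes). The actual implementation was SQL/Python, and its schema is not reproduced here.

```kotlin
// Hypothetical sketch of snapshot-based change data capture: diff the latest
// full feed against the previous one to emit only inserts, updates, and deletes.
fun diffFeeds(
    previous: Map<String, String>,  // business key -> record hash, previous feed
    current: Map<String, String>,   // business key -> record hash, latest feed
): Triple<Set<String>, Set<String>, Set<String>> {
    val inserts = current.keys - previous.keys
    val deletes = previous.keys - current.keys
    val updates = (current.keys intersect previous.keys)
        .filterTo(mutableSetOf()) { previous[it] != current[it] }
    return Triple(inserts, updates, deletes)
}

fun main() {
    val yesterday = mapOf("sku-1" to "aaa", "sku-2" to "bbb")
    val today = mapOf("sku-1" to "aaa2", "sku-3" to "ccc")
    val (inserts, updates, deletes) = diffFeeds(yesterday, today)
    println("inserts=$inserts updates=$updates deletes=$deletes")
    // inserts=[sku-3] updates=[sku-1] deletes=[sku-2]
}
```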

Education and Training

1992 - 1997

Electrical Engineering

M.Sc.

Tampere University of Technology

Position

Senior Data Engineer

Competencies

Top-Skills

Kafka, AWS, Python, Kotlin, Azure, DevOps, Git, SQL

Products / Standards / Experience / Methods

Profile

Senior Data Engineer with 20+ years of experience building data pipelines and 5+ years of experience with Apache Kafka, most recently on AWS with a modern cloud architecture. Highly experienced in batch data pipelines of various sizes and types, including pipelines handling sensitive healthcare data. Currently helping my team become the best source of shopping data within the company.


Data

  • Kafka
  • Data Pipelines 
  • Databricks/Spark
  • AWS Glue 
  • AWS Athena 
  • REST API
  • Azure Data Factory 
  • PostgreSQL 
  • Avro
  • Parquet 
  • Delta Table 
  • Airflow


Cloud/Infra

  • AWS 
  • Azure 
  • AWS CDK 
  • Terraform
  • Docker 
  • Kubernetes 
  • Git 
  • CI/CD

Programming Languages

  • Kotlin
  • Scala
  • Python
  • Java
  • SQL
  • Functional Programming

