Database/ETL Developer, Data Engineer, Database Architect, Cloud Data Engineer
Updated on 10.08.2025
Profile
Freelancer / Self-employed
Remote work
Available from: 10.08.2025
Availability: 100%
of which on-site: 25%
ETL
Database Development
Data Warehouse
AWS
SQL Programming
Python
Git
Talend
Oracle
MySQL
MS SQL Server
Jenkins
MS SQL Server Reporting Services
Amazon RDS
Amazon Redshift
Atlassian Confluence
Atlassian JIRA
Agile Software Development
Data Modeling
Tamil
Mother Tongue
English
Proficient User
German
Independent User (C1/B2)

Work Locations

Germany
possible

Projects

1 year 11 months
2024-01 - now

Various projects in the field of data engineering

Founder
  • Database migration/development, data engineering consultation
  • Data migration coaching program
Apache Iceberg, Python, SQL, ELT, Data Modelling, AWS Database Migration Services (DMS), Amazon Managed Streaming for Apache Kafka (MSK), AWS Aurora MySQL, AWS CDK (TypeScript), GitLab, Kinesis Data Firehose, Amazon OpenSearch Serverless, Amazon API Gateway, Amazon S3, Redshift, AWS Glue, Lambda, DynamoDB, RDS MariaDB, etc.
on request
Rastatt (Germany)
1 year
2024-04 - 2025-03

Data Engineering / Hybrid

Data Engineering Consultant (Freelance)
  • Architected and built a scalable Apache Iceberg-based data lake and analytics platform from scratch
  • Designed data replication solutions from RDS MariaDB to an S3-based data lake and Amazon Redshift using AWS DMS and MSK, improving data availability for reporting and reducing RDS performance bottlenecks by 30%
  • Implemented audit log ingestion pipelines using Amazon Kinesis Data Firehose, streaming to OpenSearch and S3 for advanced analytics and long-term storage (see the delivery-stream sketch after this list)
  • Developed secure, scalable RESTful APIs using AWS API Gateway and Amazon Cognito, enabling authorized access to data
  • Developed AWS Bedrock-based AI agents to enable top management to query data lake datasets for ad hoc analytics
  • Designed a fault-tolerant data processing architecture on ECS with auto-scaling task definitions, enabling seamless daily processing of records across distributed containerized services for charging station data synchronization and API integration
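
A minimal sketch of the audit-log ingestion pattern referenced above, assuming a boto3-based setup; the stream name, bucket, and IAM role are illustrative placeholders, not the client's actual resources:

```python
# Hypothetical sketch: a Kinesis Data Firehose delivery stream that buffers
# audit-log events and lands them compressed in S3. All names/ARNs are placeholders.
import boto3

firehose = boto3.client("firehose", region_name="eu-central-1")

firehose.create_delivery_stream(
    DeliveryStreamName="audit-log-ingest",          # placeholder stream name
    DeliveryStreamType="DirectPut",
    ExtendedS3DestinationConfiguration={
        "RoleARN": "arn:aws:iam::123456789012:role/firehose-audit-role",   # placeholder
        "BucketARN": "arn:aws:s3:::example-audit-log-archive",             # placeholder
        "Prefix": "audit/year=!{timestamp:yyyy}/month=!{timestamp:MM}/",
        "ErrorOutputPrefix": "audit-errors/",
        "BufferingHints": {"SizeInMBs": 64, "IntervalInSeconds": 300},
        "CompressionFormat": "GZIP",
    },
)

# Producers then push individual audit events to the stream:
firehose.put_record(
    DeliveryStreamName="audit-log-ingest",
    Record={"Data": b'{"action": "login", "user_id": 42}\n'},
)
```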
AWS CDK (TypeScript), GitLab CI/CD, Apache Iceberg, Python, SQL, Data Modelling, AWS DMS, MSK, Kinesis Data Firehose, Amazon OpenSearch Serverless, Amazon API Gateway, Amazon S3, Redshift, AWS Glue, Lambda, DynamoDB, RDS MariaDB, AWS Bedrock, Amazon ECS, Bash scripting
Chargecloud GmbH
Köln (Germany)
9 months
2023-04 - 2023-12

DataOps Helpdesk

DataOps Engineer

  • Delivered critical support to internal customers on AWS-hosted Data Science and Engineering projects, optimizing data utilization, resolving issues, and provisioning AWS infrastructure within SLAs
  • Engineered a robust architecture for integration testing of Terraform AWS modules using Python and Boto3, streamlining validation for 5+ internal teams (see the test sketch after this list)
  • Automated testing workflows via GitLab CI/CD, enabling continuous testing of customer AWS Infrastructure modules, reducing errors by 70% and enhancing operational reliability
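
A minimal sketch of the Terraform-module integration-testing approach described above, using pytest and boto3; the module path, output name, and region are assumptions for illustration:

```python
# Hypothetical sketch: apply a Terraform module in a sandbox account, verify the
# resulting AWS resources with boto3, then destroy them again.
import json
import subprocess

import boto3
import pytest

MODULE_DIR = "modules/s3_bucket_example"   # placeholder module under test


@pytest.fixture(scope="module")
def terraform_outputs():
    subprocess.run(["terraform", "init"], cwd=MODULE_DIR, check=True)
    subprocess.run(["terraform", "apply", "-auto-approve"], cwd=MODULE_DIR, check=True)
    raw = subprocess.run(
        ["terraform", "output", "-json"], cwd=MODULE_DIR, check=True, capture_output=True
    )
    yield json.loads(raw.stdout)
    # Always tear the sandbox infrastructure down after the tests.
    subprocess.run(["terraform", "destroy", "-auto-approve"], cwd=MODULE_DIR, check=True)


def test_bucket_exists_and_is_encrypted(terraform_outputs):
    bucket = terraform_outputs["bucket_name"]["value"]   # assumed module output
    s3 = boto3.client("s3", region_name="eu-central-1")
    s3.head_bucket(Bucket=bucket)                        # raises if the bucket is missing
    enc = s3.get_bucket_encryption(Bucket=bucket)
    assert enc["ServerSideEncryptionConfiguration"]["Rules"], \
        "bucket should have default encryption enabled"
```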

Python, SQL, ETL, Database Modelling, Terraform, AWS Database Migration Services (DMS), AWS Aurora MySQL, GitLab, Bitbucket, Ansible, Microsoft SQL Server, SQL Server Reporting Services (SSRS), Airflow, AWS CDK, Kafka, Flyway, GitLab CI/CD, Amazon S3, AWS Glue, Lambda, DynamoDB, Amazon EMR, Amazon CloudWatch, Amazon RDS, Grafana, Apache Spark, Bash scripting
Deutsche Bahn AG
Frankfurt (Germany)
2 years
2022-01 - 2023-12

NLP project

Cloud Data Engineer
  • Engineered scalable data pipelines using Apache Airflow and transient AWS EMR clusters, integrated with GitLab CI/CD, to process 500GB of daily data for an NLP project, achieving 99.9% uptime and reducing processing time by 40% compared to non-distributed Python and AWS Glue ETL (see the DAG sketch after this list)
  • Collaborated with data scientists to define and fulfill complex data requirements, delivering curated datasets for NLP model training and analysis, enabling improvement in model accuracy
  • Developed Grafana dashboards to visualize metrics from AWS CloudWatch logs and other AWS service logs (EMR, Airflow), enabling proactive monitoring of NLP pipeline health and ensuring 99.9% uptime
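
A minimal sketch of the transient-EMR orchestration pattern mentioned above, written against the Airflow Amazon provider; cluster sizing, the Spark script location, and the DAG schedule are illustrative assumptions:

```python
# Hypothetical sketch: an Airflow DAG creates an EMR cluster, runs a Spark step,
# waits for it to finish, and terminates the cluster. All settings are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.amazon.aws.operators.emr import (
    EmrAddStepsOperator,
    EmrCreateJobFlowOperator,
    EmrTerminateJobFlowOperator,
)
from airflow.providers.amazon.aws.sensors.emr import EmrStepSensor

SPARK_STEP = [{
    "Name": "nlp_preprocessing",
    "ActionOnFailure": "TERMINATE_CLUSTER",
    "HadoopJarStep": {
        "Jar": "command-runner.jar",
        "Args": ["spark-submit", "s3://example-bucket/jobs/preprocess.py"],  # placeholder
    },
}]

JOB_FLOW_OVERRIDES = {
    "Name": "nlp-transient-cluster",
    "ReleaseLabel": "emr-6.10.0",
    "Instances": {
        "InstanceGroups": [
            {"InstanceRole": "MASTER", "InstanceType": "m5.xlarge", "InstanceCount": 1},
            {"InstanceRole": "CORE", "InstanceType": "m5.xlarge", "InstanceCount": 2},
        ],
        "KeepJobFlowAliveWhenNoSteps": False,   # cluster is transient
    },
    "JobFlowRole": "EMR_EC2_DefaultRole",
    "ServiceRole": "EMR_DefaultRole",
}

with DAG("nlp_daily_pipeline", start_date=datetime(2022, 1, 1),
         schedule_interval="@daily", catchup=False) as dag:
    create = EmrCreateJobFlowOperator(task_id="create_cluster",
                                      job_flow_overrides=JOB_FLOW_OVERRIDES)
    add_step = EmrAddStepsOperator(task_id="run_spark_step",
                                   job_flow_id=create.output, steps=SPARK_STEP)
    wait = EmrStepSensor(
        task_id="wait_for_step",
        job_flow_id=create.output,
        step_id="{{ task_instance.xcom_pull(task_ids='run_spark_step')[0] }}",
    )
    terminate = EmrTerminateJobFlowOperator(task_id="terminate_cluster",
                                            job_flow_id=create.output)

    create >> add_step >> wait >> terminate
```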
Python, SQL, ETL, Database Modelling, Terraform, AWS Database Migration Services (DMS), AWS Aurora MySQL, GitLab, Bitbucket, Ansible, Microsoft SQL Server, SQL Server Reporting Services (SSRS), Airflow, CI/CD, Amazon S3, AWS Glue, Lambda, DynamoDB, Amazon EMR, Amazon CloudWatch
TruBridge GmbH
Mannheim (Germany)
1 year 6 months
2022-02 - 2023-07

Database Migration to AWS Cloud

Data Migration & Reporting Specialist
  • Orchestrated the migration of business-critical on-premises SQL Server databases (2TB) to AWS Aurora MySQL using AWS DMS, achieving zero data loss (see the DMS sketch after this list)
  • Automated AWS infrastructure provisioning using Terraform and Bitbucket pipelines, reducing deployment time and eliminating manual configuration errors
  • Designed a reporting database on Amazon RDS SQL Server, developing optimized stored procedures for data transformation, enabling faster report generation
  • Developed Power BI and SSRS reports, empowering 5 internal departments with actionable analytics for strategic decision-making
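
A minimal sketch of an AWS DMS full-load-plus-CDC task of the kind used in such a migration, based on boto3; all ARNs and the schema name are placeholders:

```python
# Hypothetical sketch: create and start a DMS replication task that copies an
# existing schema and then keeps replicating ongoing changes (CDC).
import json

import boto3

dms = boto3.client("dms", region_name="eu-central-1")

table_mappings = {
    "rules": [{
        "rule-type": "selection",
        "rule-id": "1",
        "rule-name": "include-app-schema",
        "object-locator": {"schema-name": "app", "table-name": "%"},  # placeholder schema
        "rule-action": "include",
    }]
}

dms.create_replication_task(
    ReplicationTaskIdentifier="sqlserver-to-aurora-mysql",
    SourceEndpointArn="arn:aws:dms:eu-central-1:123456789012:endpoint:SRC",   # placeholder
    TargetEndpointArn="arn:aws:dms:eu-central-1:123456789012:endpoint:TGT",   # placeholder
    ReplicationInstanceArn="arn:aws:dms:eu-central-1:123456789012:rep:INST",  # placeholder
    MigrationType="full-load-and-cdc",   # initial copy plus ongoing change capture
    TableMappings=json.dumps(table_mappings),
)

dms.start_replication_task(
    ReplicationTaskArn="arn:aws:dms:eu-central-1:123456789012:task:TASK",     # placeholder
    StartReplicationTaskType="start-replication",
)
```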
Python, SQL, ETL, Database Modelling, Terraform, AWS Database Migration Services (DMS), AWS Aurora MySQL, AWS CDK, GitLab, Ansible, Microsoft SQL Server, SQL Server Reporting Services (SSRS), Airflow, Kafka, Flyway, Bitbucket Pipelines, Amazon S3, Lambda, Amazon RDS for SQL Server
Swiss Life AG
Hannover (Germany)
1 year 9 months
2020-04 - 2021-12

Spearheaded the development and maintenance of ELT solutions

Business Intelligence Engineer
  • Spearheaded the development and maintenance of ELT solutions on the Oracle platform, using Oracle Data Integrator (ODI) to extract, load, and transform more than 50 GB of data daily
  • Architected a Jenkins-based CI/CD pipeline from the ground up for data warehouse rollouts, automating ELT package and SQL script releases across production and test environments, reducing deployment time by 60% and eliminating 90% of manual interventions
  • Developed Java and Groovy scripts to enhance Oracle Data Integrator (ODI) workflows and Jenkins CI/CD pipelines, automating data validation and reducing error rates by 80% across 10+ production releases

Python, ELT, Database Modelling, Oracle Data Integrator (ODI), Oracle Database, Jenkins, Groovy Scripting, PostgreSQL, Subversion, PL/SQL, Java
Riverty (formerly Arvato Financial Solutions)
Baden-Baden (Germany)
1 year 4 months
2018-12 - 2020-03

Developed and maintained high-performance ETL solutions

Big Data Engineer
  • Developed and maintained high-performance ETL solutions using Talend Open Studio (TOS) and MySQL, processing daily event data and delivering curated datasets to the Business Intelligence and Data Science teams for real-time analytics
  • Orchestrated the migration of legacy PHP and Python ETL scripts to Talend Open Studio, achieving zero downtime and reducing processing time by 40% for 10+ data workflows
  • Developed real-time data pipelines using Apache Kafka to process daily web events (clicks, user interactions) for ingestion into MySQL, enabling web analytics (see the consumer sketch after this list)
  • Served as Interim Product Owner for the data engineering team, prioritizing and ranking sprint backlogs and aligning 5+ projects with cross-functional teams and strategic objectives
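
A minimal sketch of the Kafka-to-MySQL ingestion path described above, assuming kafka-python and PyMySQL; the topic, table, and connection details are placeholders:

```python
# Hypothetical sketch: consume web events from Kafka and insert them into MySQL,
# committing the offset only after the row has been written (at-least-once).
import json

import pymysql
from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "web-events",                                  # placeholder topic
    bootstrap_servers=["localhost:9092"],
    group_id="web-analytics-loader",
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
    enable_auto_commit=False,
)

db = pymysql.connect(host="localhost", user="etl", password="***",
                     database="analytics", autocommit=False)

INSERT_SQL = """
    INSERT INTO web_events (event_time, user_id, event_type, page_url)
    VALUES (%s, %s, %s, %s)
"""

for message in consumer:
    event = message.value
    with db.cursor() as cur:
        cur.execute(INSERT_SQL, (event["timestamp"], event["user_id"],
                                 event["type"], event["url"]))
    db.commit()            # persist the row first ...
    consumer.commit()      # ... then the Kafka offset
```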

Python, SQL, Database Modelling, MySQL Database, Redis, Talend Open Studio, AWS, Kafka, PostgreSQL, ETL/ELT, PHP
Sovendus GmbH
Karlsruhe (Germany)
2 years
2016-12 - 2018-11

Designed and deployed high-performance data integration solutions

Database Developer
  • Designed and optimized complex database schemas and models in SQL Server, AWS Redshift, and Aurora
  • Architected and implemented high-performance ETL pipelines using SQL Server Integration Services (SSIS) to support enterprise data warehousing
  • Developed and automated dynamic, user-centric reports and dashboards using SSRS and Tableau, enabling real-time decision-making for cross-functional teams
  • Engineered Python-based web scraping scripts to extract and process data from internal sources, enhancing data availability for analytics and reporting (see the scraping sketch after this list)
  • Implemented data quality checks and validation processes within ETL workflows, ensuring high accuracy and reliability of data for downstream analytics
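
A minimal sketch of the kind of Python scraping script mentioned above, using requests and BeautifulSoup; the URL, table markup, and column names are illustrative assumptions:

```python
# Hypothetical sketch: fetch an internal HTML page, extract tabular rows, and
# write them to CSV for downstream loading into the warehouse.
import csv

import requests
from bs4 import BeautifulSoup

URL = "https://intranet.example.local/price-list"   # placeholder internal source

response = requests.get(URL, timeout=30)
response.raise_for_status()

soup = BeautifulSoup(response.text, "html.parser")

rows = []
for tr in soup.select("table#prices tbody tr"):      # assumed table markup
    cells = [td.get_text(strip=True) for td in tr.find_all("td")]
    if cells:
        rows.append(cells)

with open("prices.csv", "w", newline="", encoding="utf-8") as fh:
    writer = csv.writer(fh)
    writer.writerow(["article", "unit", "price"])     # assumed column names
    writer.writerows(rows)
```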
Python, SQL, ETL, Database Modelling, SQL Server Database, SQL Server Integration Services (SSIS), SQL Server Reporting Services (SSRS), AWS Redshift & Aurora, Tableau
Misumi Europa GmbH
Frankfurt (Germany)
9 months
2016-03 - 2016-11

Designed a comprehensive process from scratch within SQL Server

Database Developer/Internship and Master Thesis

  • Architected and implemented a comprehensive data quality assessment framework from scratch within SQL Server, enabling robust validation and quality assurance across diverse data sources
  • Designed and deployed an intuitive front-end interface using Microsoft Access, empowering users to efficiently manage and interact with data quality processes, improving operational efficiency
  • Authored my master's thesis on "Investigation and Evaluation of the Quality of Customer-Supplied Raw Data," developing methodologies to enhance data reliability and inform data quality best practices

SQL, ETL, Database Modelling, SQL Server Database, SQL Server Integration Services (SSIS), Microsoft Access
Management Services Helwig Schmitt GmbH
Hofgeismar (Germany)
2 years
2012-08 - 2014-07

Engaged in the project development phase

Data Warehouse Analyst
  • Engaged in the project development phase, ensuring the quality and compatibility of code in adherence to Verizon's quality standards and industry best practices
  • Developed Teradata SQL test scripts, translating test procedures into executable scripts and executing them using Teradata SQL Assistant and BTEQ
  • Analyzed business and functional requirements, deriving detailed test plans, test cases, and procedures to ensure thorough testing of UNIX/Teradata-based Data Warehouse applications
  • Coordinated integration testing activities among various Data Warehouse teams and Upstream/Downstream application test teams, ensuring comprehensive testing and quality assurance
  • Acted as a pivotal Point of Contact (POC) among the Production, Development, and System Integration Testing (SIT) teams, ensuring seamless communication and issue tracking within the Production box
  • Conducted daily status calls with the development team and clients, ensuring all stakeholders were aligned and updated regarding project status and any emerging issues
SQL, ETL, Database Modelling, Teradata Database, HP Quality Center, Database Testing
Verizon Data Services India Pvt. Ltd.
Chennai (India)

Education and Training

1 month
2025-05 - 2025-05

Azure Data Fundamentals (DP-900)


1 month
2021-09 - 2021-09

Individual lessons - Data engineering

Final Project: Batch & Streaming pipelines to analyze Credit Card Transactions (GitHub Repo)
2 years 3 months
2014-09 - 2016-11

Study - Communication and Media Engineering / Software Engineering

Master of Science, Offenburg University of Applied Sciences, Offenburg (Germany)
4 years 1 month
2008-06 - 2012-06

Study - Electrical and Electronics Engineering

Bachelor of Engineering, St. Joseph's College of Engineering (Affiliated to Anna University), Chennai (India)
1 year 11 months
2006-06 - 2008-04

Higher Secondary Certificate

Alpha Plus Matriculation Higher Secondary School, Trichy (India)


9 years 11 months
1996-06 - 2006-04

Secondary School Leaving Certificate

St. James Higher Secondary School, Palakurichy (India)


                                    

Position

Freelance Data Engineer

Competencies

Top Skills

ETL, Database Development, Data Warehouse, AWS, SQL Programming, Python, Git, Talend, Oracle, MySQL, MS SQL Server, Jenkins, MS SQL Server Reporting Services, Amazon RDS, Amazon Redshift, Atlassian Confluence, Atlassian JIRA, Agile Software Development, Data Modeling

Products / Standards / Experience / Methods

SQL
Data Modelling
ETL/ELT
Python
Pandas, PySpark, Boto3, etc.
AWS
RDS, Aurora, DMS, S3, Glue, Lambda, Kinesis, EMR, QuickSight, Redshift, CDK
Terraform
Airflow
Jenkins
GitLab
Ansible
AWS CDK
Microsoft SQL Server
MySQL
SSRS
SSIS
Talend Open Studio
MariaDB
Teradata
Oracle Database
Oracle Data Integrator
Kafka
NoSQL
Redis, DynamoDB
Power BI
Bitbucket
SVN
Groovy
Flyway
Apache Iceberg
PostgreSQL
AWS API Gateway
OpenSearch
REST APIs
Grafana
Amazon CloudWatch
Azure
Profile:
  • Results-driven Data Engineer with end-to-end expertise across the data engineering lifecycle, including infrastructure provisioning, Data Modelling, ETL/ELT pipelines, data warehousing, and cloud architecture
  • Proven ability to design and implement high-performance, scalable data platforms for diverse organizations


Data Modeling & Data Warehousing:

OLTP Database Design, Data Vault 2.0, Dimensional Modeling (Star/Snowflake Schema, Kimball Methodology), Data Lake


Big Data Technologies:

Apache Spark, Kafka, Iceberg, Amazon Kinesis, Data Firehose, EMR


Data Migration:

AWS Database Migration Service (DMS), Debezium


ETL/ELT & Orchestration:

SQL Server Integration Services (SSIS), Oracle Data Integrator (ODI), Talend Open Studio, AWS Glue, dbt, Apache Airflow


DevOps & CI/CD:

Docker, Jenkins, Terraform, Ansible, AWS CDK, Amazon ECS, Git (GitLab, Bitbucket, GitHub), Subversion (SVN)


Data Visualization & BI Tools:

Power BI, Amazon QuickSight, SQL Server Reporting Services

Programming Languages

Python
Java
SQL
Bash
Groovy
TypeScript

Databases

Microsoft SQL Server
MySQL
MariaDB
Oracle
Amazon RDS
Redshift
DynamoDB
Redis
