Data Science and Engineering Expert with knowledge of legacy and modern technology stacks.
Updated on 20.09.2024
Profile
Freelancer / self-employed
Remote work
Available from: 01.10.2024
Availability: 20%
of which on-site: 100%
Python
Big Data
Cloud
SQL
Data Warehouse
ETL
Docker
Git
Apache NiFi
MongoDB
MySQL
PySpark
English
Expert (C1 Level)
German
Intermediate (B1 level, integration course completed)

Work locations

Erlangen (D-91056, D-91058, D-91052, D-91054) (+50km)
possible

Projects

2 years 4 months
2022-06 - now

B2B Transformation

Data Architect Manager Python PySpark Docker
Data Architect Manager

  • Led a team of architects and developers to create a Delta Lakehouse on Databricks.
  • Helped the organization lower BI costs by implementing solutions based on Power BI and Databricks.
  • Evaluated the capabilities and limitations of Databricks and Power BI and created an architecture that scales and responds to user queries in seconds over large datasets (30-second SLA over a 40 GB fact table).
  • Made design decisions to create a solution that satisfies all technical, business, and process constraints.
  • Designed and created multiple data pipelines for data extraction, cleansing, and mapping of business logic using the Spark API on Databricks (see the sketch after this list).
  • Thorough understanding of the Databricks data science, ML, and BI offerings, compute resources, SQL Warehouse, and analytical engines (Spark, Catalyst, Photon), and able to create multi-cloud, hybrid BI solutions.
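
A minimal PySpark sketch of the kind of extraction/cleansing/business-logic pipeline described above; the table names, columns, and VAT rule are illustrative assumptions, not the actual project code.

  # Illustrative Databricks/PySpark pipeline sketch; raw.orders_raw,
  # curated.orders and the columns below are hypothetical names.
  from pyspark.sql import SparkSession, functions as F

  spark = SparkSession.builder.appName("b2b-curation").getOrCreate()

  # Extract: read the raw landing table registered in the metastore.
  raw = spark.table("raw.orders_raw")

  # Cleanse: de-duplicate, enforce types, drop invalid rows.
  clean = (
      raw.dropDuplicates(["order_id"])
         .withColumn("order_ts", F.to_timestamp("order_ts"))
         .filter(F.col("net_amount") > 0)
  )

  # Business logic: derive a gross amount (assumed 19% VAT) for reporting.
  curated = clean.withColumn("gross_amount", F.round(F.col("net_amount") * 1.19, 2))

  # Load: write a Delta table that Power BI / SQL Warehouse can query.
  curated.write.format("delta").mode("overwrite").saveAsTable("curated.orders")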

Databricks Power BI
Python PySpark Docker
Adidas AG, Germany
Herzogenaurach
2 years 1 month
2020-06 - 2022-06

Real-time Analytics Application

Data Engineer & Architect Python SQL Apache NiFi ...
Data Engineer & Architect

  • Designed and developed data pipelines using Google Cloud services (Docker, Python, GCS, GKE, Cloud Composer, Bigtable, BigQuery, Cloud Functions, and Cloud Scheduler).
  • Created a real-time vehicle-tracking application for route planning, passenger, and fleet management for the Nuremberg public transport association (VGN).
  • Independently created an analytics platform based on Apache NiFi, MongoDB, and MySQL that scales with demand.
  • The platform was capable of receiving real-time streaming data from 10,000 devices at 2,000 locations.
  • Used MongoDB to store over 4 TB of large JSON objects and MySQL to store facts, dimensions, and aggregated data for presentation.
  • Extensively used the MongoDB aggregation framework to reduce large JSON objects to smaller ones and flatten them for storage in the RDBMS (see the sketch after this list).
  • Independently created an Apache NiFi cluster on virtual machines as an alternative to Cloudera DataFlow (CDF).
  • Created an HA load balancer using Nginx to distribute load across the Apache NiFi nodes.
  • Managed NiFi internal and external security using self-signed and Let's Encrypt certificates.
  • Created an observability platform for infrastructure and application auditing, logging, monitoring, data quality, and alerting using Prometheus and Grafana.
  • Guided data scientists in architectural decisions for a video analytics framework.
  • Coordinated with the product owner, data scientists, and backend/frontend engineers for fast delivery of the data product.
  • Fast-paced, agile mindset with a hands-on approach to problem solving.
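
A minimal PyMongo sketch of the flattening step mentioned above; the database, collection, and field names are hypothetical assumptions, not the production schema.

  # Illustrative PyMongo aggregation that flattens nested telemetry documents
  # into rows for MySQL; "analytics.telemetry" and the fields are hypothetical.
  from pymongo import MongoClient

  client = MongoClient("mongodb://localhost:27017")
  coll = client["analytics"]["telemetry"]

  pipeline = [
      # One output document per element of the nested readings array.
      {"$unwind": "$readings"},
      # Keep only the flat fields needed for the relational fact table.
      {"$project": {
          "_id": 0,
          "device_id": 1,
          "location_id": 1,
          "ts": "$readings.ts",
          "value": "$readings.value",
      }},
  ]

  # Each result is a flat document that maps to one RDBMS row.
  rows = [(d["device_id"], d["location_id"], d["ts"], d["value"])
          for d in coll.aggregate(pipeline)]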

Apache NiFi MongoDB
Python SQL Apache NiFi MongoDB MySQL PostgreSQL Kubernetes Docker Git
Abl Solutions
Nuremberg
2 years
2018-07 - 2020-06

Enterprise Reporting, Operations Analytics

Scrum Master & Senior Data Engineer Python3 Java Shell Scripting ...
Scrum Master & Senior Data Engineer

  • Created a CI/CD pipeline for integration of changes in the operational analytics platform.
  • Used multiple technologies (Python, shell, Git, a Jenkins cluster, and Bitbucket repositories) to create data pipelines that move data from many sources (relational databases, REST APIs, weblogs, streaming platforms) to the Exasol analytics platform (see the sketch after this list).
  • Designed master and transactional data management layers to capture business logic using SQL and Lua scripting.
  • Utilized the Python-based, metadata-driven integration tool m3d-api for data ingestion into the data lake and EMR cluster, and used m3d-engine for data transformation.
  • Designed multiple tools to maintain data consistency and auditing (master data reconciliation, automated data quality, management of change data capture, implementation of global semantics across regions).
  • Created multiple Ansible playbooks to provision on-demand infrastructure in the cloud.
  • Proposed a scalable analytics platform using D3.js, Node.js, Docker, and Kubernetes.
  • Played the role of Scrum Master:
    • Preparing user stories with clear and concise descriptions.
    • Hosting daily scrum calls, updating JIRA, and keeping the PO up to date.
    • Planning and executing retrospective, review, and backlog grooming meetings, and following up on the feedback given by the team.
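
A minimal sketch of one REST-API-to-staging step such pipelines typically contain; the endpoint, authentication, and column names are hypothetical, and the real ingestion in this project was driven by m3d-api.

  # Illustrative extraction step: pull a paginated REST API and stage the
  # result as CSV for a bulk load; URL, token, and fields are hypothetical.
  import csv
  import requests

  API_URL = "https://api.example.com/v1/orders"
  FIELDS = ["id", "created_at", "amount"]

  def extract_to_csv(path: str, token: str) -> int:
      """Fetch all pages and write them to a CSV staging file."""
      rows, page = [], 1
      while True:
          resp = requests.get(
              API_URL,
              params={"page": page},
              headers={"Authorization": f"Bearer {token}"},
              timeout=30,
          )
          resp.raise_for_status()
          batch = resp.json()
          if not batch:          # an empty page marks the end of pagination
              break
          rows.extend(batch)
          page += 1
      with open(path, "w", newline="") as fh:
          writer = csv.DictWriter(fh, fieldnames=FIELDS)
          writer.writeheader()
          for row in rows:
              writer.writerow({k: row.get(k) for k in FIELDS})
      return len(rows)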

Analytics Database Red Hat Linux Bitbucket Jenkins AWS S3
Python3 Java Shell Scripting SQL Lua
Adidas AG, Germany
Herzogenaurach
2 years 6 months
2016-02 - 2018-07

Customer Intelligence Application

Senior Data Engineer Python Java SQL ...
Senior Data Engineer

  • Created data pipelines to pull data from Salesforce, Live-ops, LivePerson, a self-service portal, Lithium, and the Google Analytics API.
  • Designed a real-time streaming pipeline that receives telemetry data from game servers via Kinesis event streams and lands it in S3 buckets.
  • Utilized AWS Lambda to decompress, transform, and enrich raw data into JSON format (see the sketch after this list).
  • Created a data pipeline to move data from the S3 bucket to Redshift.
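
A minimal sketch of what such a Lambda decompress/enrich step can look like; the payload structure and the added fields are illustrative assumptions.

  # Illustrative AWS Lambda handler for gzip-compressed Kinesis records;
  # the decoded payload fields are hypothetical.
  import base64
  import gzip
  import json

  def lambda_handler(event, context):
      """Decode, decompress, and enrich each Kinesis record in the batch."""
      enriched = []
      for record in event.get("Records", []):
          # Kinesis delivers payloads base64-encoded; here they are also gzipped.
          raw = base64.b64decode(record["kinesis"]["data"])
          payload = json.loads(gzip.decompress(raw))
          # Enrich with the arrival timestamp before writing onward to S3.
          payload["ingest_ts"] = record["kinesis"]["approximateArrivalTimestamp"]
          enriched.append(payload)
      return {"records_processed": len(enriched)}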

AWS S3 Lambda Kinesis Data Pipeline Firehose Informatica
Python Java SQL shell scripting
Electronic Arts, USA
Offshore India
3 years 4 months
2013-07 - 2016-10

BI transformation

Senior Business Intelligence Developer SQL Shell Scripting
Senior Business Intelligence Developer

As a Senior Business Intelligence Developer at Virgin Media in the UK, I was responsible for leading a team of six in the design and development of four data warehouse applications. I used a range of tools and technologies, including Aginity, Toad, SQL Developer, PowerCenter Designer, and IBM Netezza, and programming languages such as SQL and Shell Scripting.

In my role, I worked closely with client SMEs, business users, and owners to plan and implement changes in the application. This involved actively participating in discussions and providing expert guidance on the technical aspects of the project. I also performed impact analysis, created High/Low level design documents, and planned for development and testing to ensure that changes were deployed effectively.

To translate business processes into Informatica mappings and load Star Schema, I utilized my technical expertise in data warehousing and ETL development. I also created automated scripts to perform data cleansing and loading, ensuring that the data was accurate and reliable.

Overall, my work was critical in ensuring that Virgin Media had reliable and effective data warehouse applications to support its business operations. I collaborated closely with other members of the team, including business analysts, designers, and testers, to ensure that the applications met the project requirements and were delivered on time.

Aginity Toad SQL Developer PowerCenter Designer IBM Netezza
SQL Shell Scripting
Virgin Media, UK
Hook, United Kingdom
2 years 8 months
2013-07 - 2016-02

BI applications

Senior Database Developer SQL Shell Scripting
Senior Database Developer
  • Worked as on-site team lead for BI applications for Virgin Media, UK.
  • Led a team of 6 to design and develop 4 data warehouse applications.
  • Actively involved in discussions with client SMEs, business users, and owners for planning and implementing changes in the application.
  • Performed impact analysis, created high/low-level design documents, and planned development, testing, and deployment of changes in the systems.
  • Translated business processes into Informatica mappings to load the star schema.
  • Created automated scripts to perform data cleansing and data loading.
Aginity Toad SQL Developer PowerCenter Designer IBM Netezza
SQL Shell Scripting
Virgin Media, UK
3 years 3 months
2010-05 - 2013-07

Activity-based Business Intelligence

Senior Database Developer SQL shell scripting
Senior Database Developer

  • Worked as team lead for BI applications for Outokumpu Oyj (steel manufacturer), Finland.
  • Led a team of 5 people to design and develop 3 BI applications.
  • Performed data cleansing and loading.
  • Designed fact, dimension, and aggregate tables, plus Oracle procedures and packages to load tables as Type 1/2 dimensions.
  • Created several KPIs in fact and aggregate tables for reporting.
  • Worked as on-site coordinator and interacted with the offshore team and the client to understand and explain requirements.
  • Performed impact analysis and created high/low-level design documents.

Informatica 9.1 Oracle 11g Cognos Oracle PL/SQL
SQL shell scripting
Outokumpu Oyj, Finland
Espoo, Finland
2 years 11 months
2007-07 - 2010-05

Openreach Business Information Toolkit (ORBIT)

Database Developer SQL Shell Scripting
Database Developer

  • Worked as a senior developer building ETL mappings based on the designs provided by the designer.
  • Extensively used various transformations in mappings and workflows, PL/SQL packages and procedures, materialized views, and UNIX scripts to load data from source to target.

Oracle 11g UNIX Informatica 9.1 APEX
SQL Shell Scripting
British Telecom
Offshore India

Education and training

3 years 1 month
2004-08 - 2007-08

Computer Application

Master of Computer Applications, Vellore Institute of Technology, India
Master of Computer Applications
Vellore Institute of Technology, India

Competencies

Top skills

Python Big Data Cloud SQL Data Warehouse ETL Docker Git Apache NiFi MongoDB MySQL PySpark

Products / Standards / Experience / Methods

Objective

Looking for a position where I can use my skills and over 15 years of IT experience to define enterprise data architecture, strategy, and roadmap; ensure data quality and governance; and lead a team that creates data products and implements data mesh and data fabric, helping organizations become truly data-driven.


Professional experience

  • A polyglot programmer with the ability to learn and use programming languages based on the need.
  • Have used Python 3, Java, Node.js, shell, Lua, Ansible, and SQL in multiple scenarios.
  • Created hybrid real-time analytics solutions based on microservices and distributed monolith architectures using:
    • GCP: Compute Engine, GCS, GKE, Cloud Functions, Cloud Composer, Bigtable, BigQuery
    • AWS: EC2, S3, Data Pipeline, Lambda, Kinesis (Streams, Firehose), Redshift
    • Queues: Confluent Kafka, RabbitMQ, and MQTT
    • ETL & databases: Apache NiFi, MongoDB, MySQL
  • Effectively used GitHub, Bitbucket, and GitLab for version control, with CI/CD pipelines using Jenkins and GitLab.
  • Used multiple SQL and NoSQL database technologies depending on the use case.
    • MongoDB: Created multi-shard, replicated databases and used the aggregation framework for JSON-based documents.
    • Redis: Created and used an HA Redis Sentinel cluster for real-time applications (see the sketch after this list).
    • RDBMS: Oracle, MySQL, Netezza, SQL Server, Exasol.
  • Thorough understanding of DWH concepts, including dimensional modeling, Data Mesh, Data Fabric, Delta Lake, and Lakehouse architecture.
  • Coordinated with the product owner, data scientists, and backend/frontend engineers for fast delivery of the data product.
  • Independently created a data governance framework, evaluated data catalog offerings, and implemented DataHub as the data catalog.
  • Used and administered multiple ETL tools: Informatica, Pentaho Spoon, and Airflow.
  • Worked in an Agile software delivery framework and played the role of Scrum Master, where I:
    • Prepared user stories with clear and concise descriptions.
    • Hosted daily scrum calls, updated JIRA, and kept the PO up to date.
    • Planned and executed retrospective, review, and backlog grooming meetings, following up on the feedback given by the team.
  • Very good knowledge of big data analytics tools: Databricks, Spark, Presto, Hadoop, Hive, Delta Lake.
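
A minimal redis-py sketch of connecting through a Sentinel-managed HA setup as mentioned in the Redis item above; the sentinel hosts, service name, and key are hypothetical.

  # Illustrative Redis Sentinel usage with redis-py; host names, the service
  # name "mymaster", and the key are hypothetical.
  from redis.sentinel import Sentinel

  sentinel = Sentinel(
      [("sentinel-1", 26379), ("sentinel-2", 26379), ("sentinel-3", 26379)],
      socket_timeout=0.5,
  )

  # Writes go to the current master; Sentinel handles failover discovery.
  master = sentinel.master_for("mymaster", socket_timeout=0.5)
  master.set("vehicle:4711:last_seen", "2024-09-20T10:15:00Z")

  # Reads can be served from a replica to spread the load.
  replica = sentinel.slave_for("mymaster", socket_timeout=0.5)
  print(replica.get("vehicle:4711:last_seen"))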


Technical Skills

  • Cloud - GCP: GCS, Cloud Composer, BigQuery; AWS: EC2, S3, Data Pipeline, Lambda, Kinesis (Streams, Firehose), Redshift
  • Queues - Confluent Kafka & RabbitMQ
  • Data catalogs - Amundsen, DataHub
  • CI/CD - Git, Bitbucket, GitLab, Jenkins, Docker, Kubernetes
  • ETL tools - Apache NiFi, Informatica, Lua scripting, Pentaho Spoon, Airflow
  • Data modeling - Erwin Data Modeler
  • Visualization - OBIEE, Tableau, QlikView, MicroStrategy, Cognos
  • Web technology - JavaScript, React, Node.js, Moleculer (microservices)
  • Machine-learning libraries - scikit-learn, TensorFlow, IBM Watson services; knowledge of popular ML libraries.
  • Statistics - probability, linear algebra, linear regression, Bayesian classification, logistic regression, support vector machines, artificial neural networks; knowledge of the math and the algorithms, can read, write, and customize implementations using Python or R.


Employer Details

2022

LTIMindtree Ltd., India & Germany


2020

Abl Social Federation GmbH, Germany. Abl Solutions is a managed wireless solution provider in Nuremberg.


2016 - 2020

Mindtree Ltd., India & Germany. Mindtree Ltd is an Indian multinational information technology and outsourcing company headquartered in Bangalore, India, with more than 307 active clients and 43 offices in over 18 countries.


2010 - 2016

Accenture Technology, India, Finland & UK. Accenture plc, based in Dublin, Ireland, is one of the world's largest service providers in the field of corporate and strategy consulting as well as technology and outsourcing.


2007 - 2010

Tata Consultancy Services, India. Tata Consultancy Services (TCS) is the third-largest IT services company globally, after Accenture and IBM. TCS serves 80 German and Austrian companies.

Programming languages

Python
Expert
SQL
Expert
Database
Expert
Data Warehouse
Expert
Big Data
Expert
Cloud
Expert
Agile
Expert
Python3
Shell Scripting
Lua
Ansible

Databases

Oracle
SQL Server
Netezza
Exasol (analytics database)

Industries

  • Banking
  • Insurance and Finance
  • Telecom
  • Retail
