Azure DevOps

Available for: Remote

About VINOD P

Senior Data Engineer with 9 years of experience.
Worked in the Healthcare, Life Sciences, and Marketing domains.
Hands-on experience with Microsoft Azure Databricks and Snowflake.
Experienced in ETL (Extract, Transform, Load), loading data from various sources into data warehouses and data marts using Informatica PowerCenter 9.6.
Good knowledge of data warehouse concepts such as slowly changing dimensions (SCDs), star and snowflake schemas, and fact and dimension tables (see the sketch below).
Used GitHub to store code and Jenkins to deploy it.
Worked on StreamSets (ETL) with Apache Kafka.
Hands-on with Python and Unix shell scripting.
Experienced in performance tuning of mappings, sessions, and workflows, and in designing mappings to business requirements.
Performed key roles throughout the design, coding, testing, debugging, implementation, and maintenance of ETL mappings and workflows using Informatica, Oracle, and shell scripts.
Good knowledge of Databricks.
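As a brief illustration of the SCD concept listed above: a minimal sketch of a Type 2 dimension update expressed as a Delta Lake merge in PySpark, assuming a Databricks notebook where `spark` is predefined. The table `dw.dim_customer`, the key `customer_id`, and the tracked column `address` are all hypothetical.

```python
from delta.tables import DeltaTable
from pyspark.sql import functions as F

# Sample incoming row; in practice this comes from the source extract.
updates = spark.createDataFrame(
    [(101, "12 New Street")], ["customer_id", "address"])

dim = DeltaTable.forName(spark, "dw.dim_customer")  # hypothetical dimension table

# SCD Type 2, step 1: expire the current version when a tracked attribute changes.
(dim.alias("t")
    .merge(updates.alias("s"),
           "t.customer_id = s.customer_id AND t.is_current = true")
    .whenMatchedUpdate(
        condition="t.address <> s.address",  # tracked attribute
        set={"is_current": "false", "end_date": "current_date()"})
    .execute())

# Step 2: append the incoming rows as the new current versions.
(updates.withColumn("is_current", F.lit(True))
        .withColumn("start_date", F.current_date())
        .write.format("delta").mode("append").saveAsTable("dw.dim_customer"))
```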
 

Work Experience


Azure Developer

  • January 2015 - April 2023 - 8 Years
  • India

Projects


ORG – Hierarchy Community

  • May 2022 - March 2023 - 11 Months
Role & Responsibility
    The primary purpose of this project is to move community data from Snowflake into
    Databricks tables through notebooks.

    Roles and responsibilities: 

     Actively involved in gathering requirements and acquiring application knowledge from the business. Involved in creating SWAG and WO documents for new projects.
     Performed effort estimation and initial assessment for new requirements in the project.
     Created the STT (source-to-target) document and prepared the logic accordingly.
     Created Azure Databricks notebooks with Scala scripting to load the data into Hive tables (see the PySpark sketch after this list).
     Scheduled the jobs through Apache Airflow to run at particular times (see the DAG sketch below).
     Tested and validated the code across all scenarios.
     Performed UAT and validated the data through all test scenarios.
     Used Jenkins to migrate the code to production and validated it.
     Prepared all the relevant documents and handed the code over to the production support team.
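A minimal sketch of the notebook load described above. The resume's notebooks were in Scala; this PySpark equivalent (kept in Python for consistency with the other examples) assumes the Spark-Snowflake connector bundled with Databricks runtimes, and every connection value and table name is a placeholder.

```python
# Databricks notebook cell; `spark` and `dbutils` are predefined there.
sf_options = {
    "sfURL": "<account>.snowflakecomputing.com",
    "sfUser": "<user>",
    "sfPassword": dbutils.secrets.get("etl-scope", "sf-password"),  # assumed secret scope
    "sfDatabase": "COMMUNITY_DB",  # placeholder database
    "sfSchema": "PUBLIC",
    "sfWarehouse": "ETL_WH",
}

# Read the source table from Snowflake.
df = (spark.read
           .format("snowflake")                 # connector bundled with Databricks
           .options(**sf_options)
           .option("dbtable", "ORG_HIERARCHY")  # hypothetical source table
           .load())

# Persist to a metastore-managed (Hive) table for downstream use.
df.write.mode("overwrite").saveAsTable("community.org_hierarchy")
```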

     
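And a companion sketch of the Airflow scheduling, assuming the apache-airflow-providers-databricks package and an existing Databricks job wrapping the notebook; the DAG id, cron schedule, and job id are hypothetical.

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.databricks.operators.databricks import DatabricksRunNowOperator

with DAG(
    dag_id="org_hierarchy_load",    # hypothetical DAG name
    start_date=datetime(2022, 5, 1),
    schedule_interval="0 2 * * *",  # run daily at 02:00
    catchup=False,
) as dag:
    # Trigger the existing Databricks job that runs the notebook.
    run_notebook = DatabricksRunNowOperator(
        task_id="run_snowflake_to_hive_notebook",
        databricks_conn_id="databricks_default",
        job_id=12345,               # placeholder Databricks job id
    )
```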

Client: John Muir Health

  • March 2021 - January 2022 - 11 Months
Role & Responsibility

    Migration of an on-premises RDBMS to the Snowflake cloud. Analyzing patients' preventive health screening test results, such as annual wellness visits, colorectal cancer screenings, mammograms, blood pressure checks, diabetic eye exams, and cardiac screenings, along with physician billing costs, and creating QlikView reports.

    Roles and responsibilities: 

     Created ETL pipelines and workflows to load the data into the database.
     Created Unix scripts to preprocess the data.
     Migrated the on-premises Oracle database to the Snowflake cloud data warehouse.
     Worked with Snowflake utilities such as Time Travel (see the sketch after this list).
     Implemented Snowflake stored procedures.
     Performed UAT, migration, and production support.
     Extensively used mapping variables and mapping parameters to execute complex business logic.
     Independently performed complex troubleshooting, root-cause analysis, and solution development.
     Worked effectively in an onsite/offshore model.
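A minimal sketch of the Snowflake Time Travel and stored-procedure usage mentioned above, via the Snowflake Python connector; the connection values, table, and procedure names are placeholders.

```python
import snowflake.connector

# Placeholder connection parameters.
conn = snowflake.connector.connect(
    account="<account>", user="<user>", password="<password>",
    warehouse="ANALYTICS_WH", database="JMH_DB", schema="SCREENING",
)
cur = conn.cursor()

# Time Travel: query the table as it existed one hour ago (offset in seconds).
cur.execute("""
    SELECT patient_id, screening_type, result
    FROM screening_results AT(OFFSET => -3600)
""")
previous_rows = cur.fetchall()

# Invoke a (hypothetical) stored procedure that merges staged screening rows.
cur.execute("CALL load_screening_results('STG_SCREENING_RESULTS')")

conn.close()
```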

PINE

  • August 2019 - December 2020 - 17 Months
Role & Responsibility
    To find correct, up-to-date contact information for customers interested in Dell products such as laptops, storage devices, and other accessories. This information is collected from customers who visited dell.com, Flipkart, Amazon, or social media. As a marketing team, we campaign to those customers with additional offers and inform them about new launches.

    Roles & responsibilities
    Created Unix shell scripts to identify the source layout as per the agreed format.
    Designed the ETL mappings and workflows to load the data into staging and main tables.
    Scheduled the jobs through Control-M.
    Used Kafka as the data broker so that other consumers can subscribe to the data in the future (see the sketch after this list).
    Implemented CDC to pick up recently transacted data.
    Performed UAT, migration, and production support.
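A minimal sketch of the Kafka publishing and a timestamp-based CDC pass described above, using the confluent-kafka client; the broker, topic, and source-query helper are hypothetical.

```python
import json
from datetime import datetime, timezone

from confluent_kafka import Producer

producer = Producer({"bootstrap.servers": "broker1:9092"})  # placeholder broker

def publish_contact(contact: dict) -> None:
    """Publish one contact record so future consumers can subscribe to it."""
    producer.produce(
        topic="pine.contacts",  # hypothetical topic
        key=contact["email"].encode("utf-8"),
        value=json.dumps(contact).encode("utf-8"),
    )

def fetch_rows_updated_after(watermark: datetime) -> list[dict]:
    """Hypothetical source query: return rows with updated_at > watermark."""
    return []  # replace with a real SELECT against the source system

# Timestamp-based CDC: pick up only rows changed since the last run.
last_watermark = datetime(2020, 12, 1, tzinfo=timezone.utc)  # persisted between runs
for row in fetch_rows_updated_after(last_watermark):
    publish_contact(row)

producer.flush()
```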
     

TRUVEN

  • March 2018 - April 2019 - 14 Months
Role & Responsibility

    The primary purpose of this project is to send outbound extracts to an external vendor partner (Truven), which include fields in the data layout that are currently available in the ODW, along with vendor partner data provided to the ODW. A historical data extract from January 2014 onward was also sent to Truven.

    Roles and Responsibilities:

    Created the ETL detailed design document and the ETL standards document.
    Developed an ETL mapping to generate the monthly Encounters file using Informatica PowerCenter and integrated the code with other extract code to create the complete workflow.
    Developed ETL routines using Informatica PowerCenter and created mappings involving transformations such as Lookup, Aggregator, Expression, Filter, and Joiner; SQL overrides in Lookups; source filters in Source Qualifiers; and data flow management into multiple targets using Routers.
    Extensively used mapping variables and mapping parameters to execute complex business logic.
    Developed UNIX shell scripts to create parameter files, rename files, and pre-balance the flat-file extracts (see the sketch after this list).
    Involved in historical file generation and post-production activities to resolve all issues.
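The parameter files above were generated by UNIX shell scripts; to keep all examples in one language, here is a minimal Python sketch that renders an Informatica PowerCenter parameter file. The folder, workflow, session, and parameter names are placeholders.

```python
from datetime import date
from pathlib import Path

def write_param_file(path: Path, run_month: date) -> None:
    """Render a PowerCenter parameter file for the monthly Encounters extract."""
    lines = [
        # Placeholder folder/workflow/session header.
        "[ODW.WF:wf_truven_encounters.ST:s_m_encounters_monthly]",
        f"$$EXTRACT_MONTH={run_month:%Y%m}",              # mapping parameter
        f"$OutputFile1=encounters_{run_month:%Y%m}.dat",  # session file name
    ]
    path.write_text("\n".join(lines) + "\n")

write_param_file(Path("truven_encounters.par"), date(2019, 3, 1))
```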
     


Education

Bachelor of Technology

Hyderabad University
  • June 2011 - June 2014
