Talent's Information
Location
Agra, India
Rate
$14.00 per Hour
Experience
8.10 Years
Languages Known
English, Hindi
Available for
Background Verification
40 Hr/Week
About Kriti J
Results-oriented Azure Data Engineer focused on solution design and implementation, with 8+ years of experience developing cutting-edge engineering solutions across a wide range of Azure Cloud services and good exposure to ETL technologies.
Tech Stack Expertise
Azure
Azure Cloud, Azure Synapse, Azure DevOps, Azure Synapse DB
8 Years
Work Experience
ADF Developer
- January 2014 - December 2022 - 9 Years
- India
Projects
Bat
- January 2018 - January 2019 - 13 Months
Technologies
Role & Responsibility
Built ADF pipelines to transform and process data from the storage layer to the analytical server.
Built Databricks notebooks to perform transformation, filtering, and auditing using PySpark.
Engaged in end-to-end testing before QA/Prod deployment.
Enhanced existing ADF pipelines using ADF and Databricks notebooks.
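The transform/filter-and-audit step described above could be sketched as follows. This is an illustrative sketch in plain Python so it runs without a Spark cluster; in the actual Databricks notebooks the same logic would use PySpark DataFrame operations (`filter`, `withColumn`). The record schema and the validation rule (a non-null `id`) are hypothetical.

```python
def transform_and_audit(records):
    """Filter out records with a missing id, normalize the region column,
    and return the cleaned rows plus an audit summary (row counts)."""
    cleaned, rejected = [], []
    for row in records:
        if row.get("id") is None:
            rejected.append(row)       # route bad rows aside for auditing
            continue
        # hypothetical derived-column step: uppercase the region value
        cleaned.append({**row, "region": row.get("region", "").upper()})
    audit = {"input": len(records), "passed": len(cleaned), "rejected": len(rejected)}
    return cleaned, audit
```

In a real pipeline the `audit` summary would be written to a logging table so each ADF run leaves a verifiable record of rows processed versus rejected.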
Trustmark
- January 2019 - January 2020 - 13 Months
Technologies
Role & Responsibility
Designed an end-to-end pipeline to pull records from databases/files and send them back to the client.
Performed data validation on files/tables using Mapping Data Flow activities such as Conditional Split, Derived Column, and Aggregate.
Deployed pipelines to QA/Prod using ARM templates.
Integrated pipelines with Logic Apps and Custom Activities.
Used REST APIs to extract and insert data.
Performed a POC for ETL operations on bulk data in Databricks using PySpark.
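The validation pattern above can be illustrated in plain Python. A Mapping Data Flow Conditional Split routes each row down a stream based on a predicate, and a Derived Column step can tag rejected rows; the sketch below mirrors that behavior with hypothetical column names and a caller-supplied rule.

```python
def conditional_split(rows, predicate):
    """Route each row to the 'valid' or 'invalid' stream, mirroring a
    Conditional Split; invalid rows get a derived reject_reason column."""
    valid, invalid = [], []
    for row in rows:
        if predicate(row):
            valid.append(row)
        else:
            invalid.append({**row, "reject_reason": "failed validation"})
    return valid, invalid
```

The valid stream would continue to the sink while the invalid stream is typically written to a rejects table for review.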
AT&T
- January 2020 - October 2020 - 10 Months
Technologies
Role & Responsibility
Created ETL pipelines using Azure Data Factory.
Performed data validation on files/tables using Mapping Data Flow activities such as Conditional Split, Derived Column, and Aggregate.
Performed auditing and logging using ADF pipelines.
Implemented Slowly Changing Dimensions with Azure Databricks using PySpark.
Integrated pipelines with Logic Apps and Custom Activities.
Performed a POC for ETL operations on bulk data in Databricks using PySpark.
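The Slowly Changing Dimension work mentioned above is most commonly a Type 2 merge: when a dimension member's attributes change, the current row is expired and a new current version is appended. A minimal sketch of that logic in plain Python (the Databricks version would use a PySpark/Delta `MERGE`; the `id`/`attrs`/`is_current` columns here are hypothetical):

```python
def scd2_merge(dimension, incoming, key="id"):
    """Type 2 merge: expire the current row when attributes change and
    append the new version; unseen keys become new current rows."""
    latest = {r[key]: r for r in incoming}
    out, changed_or_new, current_keys = [], [], set()
    for row in dimension:
        new = latest.get(row[key])
        if row["is_current"]:
            current_keys.add(row[key])
            if new is not None and new["attrs"] != row["attrs"]:
                out.append({**row, "is_current": False})  # expire old version
                changed_or_new.append(new)
                continue
        out.append(row)  # unchanged current rows and history pass through
    for k, new in latest.items():
        if k not in current_keys:
            changed_or_new.append(new)  # brand-new dimension member
    out.extend({**r, "is_current": True} for r in changed_or_new)
    return out
```

A production version would also stamp effective-from/effective-to dates on each version; that bookkeeping is omitted here for brevity.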
CIMCDR
- January 2021 - January 2022 - 13 Months
Technologies
Role & Responsibility
Migrated the ongoing DataStage application to Azure Cloud using Azure services such as Azure Data Lake/Blob Storage, Azure SQL/DWH, Cosmos DB, Azure Data Lake Analytics, Azure Monitor, Logic Apps, Azure Data Factory, Azure Databricks with PySpark/Python, and PolyBase, and created CI/CD pipelines for code deployment using Azure DevOps.
Soft Skills
Industry Expertise
Education
B.Tech
Uttar Pradesh - June 2010 - June 2013