$11.0 per Hour
Around 4 years of professional IT experience in Informatica PowerCenter and the Snowflake cloud data warehouse.
Having 1.8 years of professional experience working with the Snowflake cloud data warehouse and AWS S3.
Experience working across different areas of RDBMS, including data loading through SQL*Loader.
Experience with the Snowflake data warehouse and a deep understanding of Snowflake architecture and processing.
Experience in performance tuning of the Snowflake data warehouse using the Query Profile, caching, and virtual warehouse scaling.
Created clone objects using Snowflake's zero-copy cloning.
Used COPY INTO, INSERT, PUT, and GET commands to load data into Snowflake tables from various sources such as AWS S3.
Good exposure to Snowflake cloud architecture, SnowSQL, and Snowpipe for continuous data ingestion.
Experience in working with AWS S3 and Snowflake cloud data warehouse.
Hands-on experience in bulk loading and unloading data into Snowflake tables using the COPY command.
Created Snowflake TASK objects to schedule and automate jobs.
Good knowledge of ETL processing using the Informatica tool.
Good command of PowerCenter Designer, Workflow Manager, and Workflow Monitor.
Extensively worked with Informatica PowerCenter Designer tools, including Source Analyzer, Target Designer, Transformation Developer, Mapplet Designer, and Mapping Designer.
Experience in tuning source systems, target systems, mappings, and sessions.
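The loading and scheduling workflow described above can be sketched in Snowflake SQL. All stage, table, and warehouse names (my_stage, sales_raw, etl_wh, load_sales_task) are illustrative placeholders, not taken from an actual project:

```sql
-- Hedged sketch: object names below are assumptions for illustration only.

-- Upload a local file to a named internal stage, then bulk load it.
PUT file:///tmp/sales.csv @my_stage;

COPY INTO sales_raw
  FROM @my_stage/sales.csv
  FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1);

-- Automate the recurring load with a TASK.
CREATE OR REPLACE TASK load_sales_task
  WAREHOUSE = etl_wh
  SCHEDULE = '60 MINUTE'
AS
  COPY INTO sales_raw FROM @my_stage FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1);

-- Tasks are created suspended; RESUME starts the schedule.
ALTER TASK load_sales_task RESUME;
```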
Good knowledge in UNIX Environment.
Created mappings using Joiner, Sorter, Union, Filter, Router, Aggregator, and Lookup transformations.
Implemented Performance Tuning techniques on Mappings.
Implemented and Tested Slowly Changing Dimensions – SCD Type 2.
Worked in a UNIX environment for reading files and for the archival process.
Good Knowledge on SQL.
Preparing unit test cases as per business requirements was also one of my responsibilities.
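The SCD Type 2 pattern mentioned above keeps full history by expiring the current row and inserting a new version. A minimal SQL sketch of that logic, with hypothetical table and column names (dim_customer, stg_customer, eff_start, eff_end, is_current):

```sql
-- Hedged sketch of SCD Type 2: all names are illustrative assumptions.

-- 1. Expire current dimension rows whose tracked attribute changed.
UPDATE dim_customer d
SET    eff_end = CURRENT_DATE, is_current = FALSE
FROM   stg_customer s
WHERE  d.customer_id = s.customer_id
  AND  d.is_current = TRUE
  AND  d.address <> s.address;

-- 2. Insert a fresh current row for every customer that now lacks one
--    (newly arrived customers plus the rows expired in step 1).
INSERT INTO dim_customer (customer_id, address, eff_start, eff_end, is_current)
SELECT s.customer_id, s.address, CURRENT_DATE, NULL, TRUE
FROM   stg_customer s
LEFT JOIN dim_customer d
       ON d.customer_id = s.customer_id AND d.is_current = TRUE
WHERE  d.customer_id IS NULL;
```

In Informatica this same split is typically implemented with a Lookup on the dimension plus an Update Strategy transformation routing rows to update or insert.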
Tech Stack Expertise
Oracle 11g - 1 Year
- January 2019 - January 2023 - 4 Years
Sales Enhancement System
- January 2020 - January 2021 - 13 Months
Genuine Parts Company is a service organization engaged in the distribution of automobile replacement parts, industrial replacement parts, office products, and electrical/electronic materials. Its products and services are offered through a network of over 2,000 operations located across the United States, Canada, and Mexico. The main aim of this data warehouse is to help the senior and middle management of the organization improve sales and identify new business opportunities.
Extensively used Informatica Client Tools – Source Analyzer, Target Designer, Transformation Developer, Mapping Designer, and Mapplets Designer to develop mappings.
Used designer debugger to test the data flow and fix the mappings.
Performed Data Extractions, Data Transformations, Data Loading, Data Conversions and Data Analysis.
Worked on different transformations such as Source Qualifier, Lookup, Joiner, Aggregator, Expression, Filter, Router, Sequence Generator, and Update Strategy.
Involved in creating Sessions & Workflows.
Developed workflows and tasks using Informatica PowerCenter Workflow Manager.
Documented the purpose of each mapping to help personnel understand the process and incorporate changes when necessary.
Developed mappings to load data into slowly changing dimensions (SCD Type 1).
Involved in Unit Testing.
Prepared the process documents and handed them over to the production support team to monitor the production runs.
- January 2019 - January 2020 - 13 Months
Novartis is a full member of the European Federation of Pharmaceutical Industries and Associations (EFPIA), the International Federation of Pharmaceutical Manufacturers and Associations (IFPMA), and the Pharmaceutical Research and Manufacturers of America (PhRMA).
Developed, tested, and deployed various Informatica ETL mappings for data sources such as flat files, Salesforce, Postgres, and the Snowflake cloud DB.
Responsible for all activities related to the development, implementation, administration, and support of ETL processes for a large-scale Snowflake cloud data warehouse.
Bulk loaded data from an external stage (AWS S3) into Snowflake using the COPY command.
Loaded data into Snowflake tables from the internal stage and from the local machine.
Used COPY, LIST, PUT, and GET commands for validating internal and external stage files.
Used import and export from the internal stage (Snowflake) versus the external stage (S3 bucket).
Wrote complex Snowflake SQL scripts in the Snowflake cloud data warehouse for business analysis and reporting.
Responsible for task distribution among the team.
Perform troubleshooting analysis and resolution of critical issues.
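The external-stage load and validation steps above can be sketched as follows. The stage name, bucket URL, and table name (s3_stage, my-bucket, orders) are illustrative assumptions, and the credential values are placeholders:

```sql
-- Hedged sketch: names, URL, and credentials are illustrative placeholders.
CREATE OR REPLACE STAGE s3_stage
  URL = 's3://my-bucket/landing/'
  CREDENTIALS = (AWS_KEY_ID = '<key>' AWS_SECRET_KEY = '<secret>');

-- Inspect staged files before loading.
LIST @s3_stage;

-- Dry-run the load and report parse errors without writing any rows.
COPY INTO orders FROM @s3_stage
  FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
  VALIDATION_MODE = RETURN_ERRORS;

-- Perform the actual bulk load.
COPY INTO orders FROM @s3_stage
  FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1);
```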
BE, Karnataka University
- June 2015 - June 2017