Talent's Information
-
Location
Bengaluru, India
-
Rate
$10.0 per Hour
-
Experience
4.3 Years
-
Languages Known
English, Hindi
About Sravani
Around 4 years of professional IT experience in SQL and the Snowflake cloud data warehouse.
2 years of professional experience working with the Snowflake cloud data warehouse and AWS S3.
Created clone objects to maintain zero-copy clones in Snowflake.
Experience with the Snowflake data warehouse and a deep understanding of Snowflake architecture and processing.
Used COPY/INSERT, PUT, and GET commands for loading data into Snowflake tables from various sources such as AWS S3.
Experience working with AWS S3 and the Snowflake cloud data warehouse.
Hands-on experience in bulk loading and unloading data into Snowflake tables using the COPY command. Good exposure to Snowflake cloud architecture, SnowSQL, and Snowpipe for continuous data ingestion.
Experience with performance tuning of the Snowflake data warehouse using the query profiler, caching, and virtual warehouse scaling.
Experience in writing complex SQL scripts using statistical aggregate functions and analytical functions.
Well-acquainted with writing complex SQL queries with joins, functions, sub-queries, views, and set operators.
Handled large and complex data sets such as JSON, ORC, Parquet, and CSV files from sources like AWS S3.
Good knowledge of database management system implementation with SQL.
Good communication and problem-solving skills, with the ability to analyze a problem quickly and come up with an efficient, industry-standard solution.
Experience in creating tables, sequences, synonyms, and constraints for experimental data load capture.
Good exposure to object-oriented concepts and advanced SQL topics such as analytical functions, indexes, and partitioned tables.
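The loading and cloning features listed above (PUT, COPY, Snowpipe, zero-copy clones) could look roughly like the following SnowSQL sketch. All object names (my_int_stage, my_s3_stage, orders, orders_pipe) and the S3 URL are illustrative assumptions, not details from this profile:

```sql
-- Upload a local file to a Snowflake internal stage (run from SnowSQL).
PUT file:///tmp/orders.csv @my_int_stage;

-- Bulk load from the internal stage into a table
-- (PUT gzip-compresses staged files by default, hence the .gz suffix).
COPY INTO orders
  FROM @my_int_stage/orders.csv.gz
  FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1);

-- External stage over an S3 bucket (credentials / storage integration omitted).
CREATE OR REPLACE STAGE my_s3_stage
  URL = 's3://my-bucket/data/'
  FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1);

-- Bulk load directly from the external stage.
COPY INTO orders FROM @my_s3_stage;

-- Zero-copy clone: a metadata-only copy that shares storage with the source.
CREATE TABLE orders_clone CLONE orders;

-- Snowpipe for continuous ingestion of new files arriving in the S3 stage.
CREATE OR REPLACE PIPE orders_pipe AUTO_INGEST = TRUE AS
  COPY INTO orders FROM @my_s3_stage;
```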
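As an illustration of the analytical-function style of SQL described above, a hypothetical query (table and column names assumed) might combine a join, a sub-query, and window functions:

```sql
-- Rank each customer's orders by amount and compute a per-customer total,
-- keeping only orders above the overall average.
SELECT c.customer_name,
       o.order_id,
       o.amount,
       RANK() OVER (PARTITION BY o.customer_id ORDER BY o.amount DESC) AS amount_rank,
       SUM(o.amount) OVER (PARTITION BY o.customer_id)                 AS customer_total
FROM   orders o
JOIN   customers c
  ON   c.customer_id = o.customer_id
WHERE  o.amount > (SELECT AVG(amount) FROM orders);
```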
Tech Stack Expertise
-
AWS
AWS S3, AWS EC2
4 Years
-
Oracle
Oracle 11g
2 Years
Work Experience
Snowflake Developer
- January 2018 - January 2023 - 5 Years
- India
Projects
SDW_Migration
- January 2020 - January 2022 - 25 Months
-
Optus is an Australian telecommunications company headquartered in New South Wales, Australia. It gained the second general carrier licence in January 1991. It is the second-largest wireless carrier after Telstra in Australia, with 10.5 million subscribers as of 2019.
Optus owns and operates its own network infrastructure. It provides services both directly to end users and as a wholesaler to other service providers. Optus is divided into four major business areas: Mobile, Wholesale, Consumer & Multimedia.
Initially, Optus could only offer local and long-distance calls to residential customers through Telstra's local phone network. It designed its cable network to provide telephony services in addition to broadcast television.
Roles and Responsibilities:
Responsible for all activities related to the development and implementation of a large-scale Snowflake cloud data warehouse.
Responsible for task distribution among the team.
Imported and exported data between the internal stage (Snowflake) and the external stage (S3 bucket).
Wrote complex Snowflake SQL scripts in the Snowflake cloud data warehouse for business analysis and reporting.
Loaded data into Snowflake tables from internal stages and the local machine.
Used COPY, LIST, PUT, and GET commands for validating internal and external stage files.
Bulk loaded data from the external stage (AWS S3) into Snowflake tables using the COPY command.
Performed troubleshooting, analysis, and resolution of critical issues.
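The stage validation and unload steps above could be sketched as follows; stage, table, and path names (my_int_stage, my_s3_stage, orders) are assumptions for illustration:

```sql
-- Inspect the files currently sitting in the internal and external stages.
LIST @my_int_stage;
LIST @my_s3_stage;

-- Dry-run validation: parse the staged files and report errors
-- without loading any rows into the table.
COPY INTO orders
  FROM @my_s3_stage
  VALIDATION_MODE = 'RETURN_ERRORS';

-- Unload query results to the internal stage, then download them locally.
COPY INTO @my_int_stage/export/
  FROM (SELECT * FROM orders WHERE order_date >= '2021-01-01')
  FILE_FORMAT = (TYPE = CSV)
  OVERWRITE = TRUE;

GET @my_int_stage/export/ file:///tmp/exports/;
```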
Soft Skills
Industry Expertise
Education
BE
Karnataka University - June 2015 - June 2017