$13.00 per hour
Overall, 8.5 years of professional experience across industries including Travel & Hospitality (T&H), Aviation, and Healthcare.
4.5 years of work experience in Data Warehousing, ETL, and Visualization, 2 years in Python, and 1.5 years of hands-on experience on the AWS cloud platform.
Hands-on experience with AWS Lambda, Step Functions, EventBridge, S3, EMR, CloudFormation (CFN), and CloudWatch.
Strong programming skills in Python, working at a high level of abstraction
Extensive experience developing applications that perform data-processing tasks on Teradata, Greenplum, Oracle, and SQL Server databases
Experience writing complex SQL queries, views, functions, triggers, etc.
Hands-on experience with visualization tools such as Qlik Sense, QuickSight, Spotfire, and Tableau.
Used Informatica PowerCenter (9.1) for data extraction, transformation, and loading into various target systems.
Experience in OLTP and OLAP design, development, testing, and support of enterprise data warehouses.
Independently perform complex troubleshooting, root cause analysis and solution development.
Ability to meet deadlines and handle multiple tasks; decisive, with strong leadership qualities and good communication skills.
Team player, motivated, and quick to grasp new concepts, with strong analytical, troubleshooting, and problem-solving skills.
Experience developing applications embracing Agile principles
Worked with a Scrum team to deliver agreed user stories on time every sprint.
Responsible for supporting and leading project tasks
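As an illustration of the SQL skills listed above (complex queries, views, and triggers), here is a minimal runnable sketch; sqlite3 stands in for the production databases (Teradata, Greenplum, Oracle, SQL Server), and every table, column, and value is hypothetical.

```python
# Minimal sketch of the view/trigger pattern described above.
# sqlite3 stands in for the production databases; all names are illustrative.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE bookings (id INTEGER PRIMARY KEY, amount REAL, created_at TEXT);
CREATE TABLE audit_log (booking_id INTEGER, note TEXT);

-- A view that summarises revenue per day.
CREATE VIEW daily_revenue AS
    SELECT created_at AS day, SUM(amount) AS revenue
    FROM bookings
    GROUP BY created_at;

-- A trigger that records an audit entry for every insert.
CREATE TRIGGER log_booking AFTER INSERT ON bookings
BEGIN
    INSERT INTO audit_log VALUES (NEW.id, 'booked');
END;
""")
conn.execute("INSERT INTO bookings (amount, created_at) VALUES (100.0, '2024-01-01')")
conn.execute("INSERT INTO bookings (amount, created_at) VALUES (50.0, '2024-01-01')")
print(conn.execute("SELECT * FROM daily_revenue").fetchall())       # daily totals
print(conn.execute("SELECT COUNT(*) FROM audit_log").fetchone()[0]) # audit rows
```

The view keeps the aggregation logic in the database, and the trigger audits writes without any application-side code.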
AWS Cloud Engineer
- January 2015 - November 2022 - 7 Years
- April 2022 - November 2022 - 8 Months
- Working as a data analyst with Philips on the data analytics team for asset management
- Responsible for exploring the data, recommending actionable insights, and building visualizations/dashboards that provide the information that matters to the business.
- Generate business insights through BI dashboards that give management and customers clear directions/actions to support decision-making
- Current use cases include analytics reporting, workflow optimization, advice on use cases, and analysis of raw data.
- All extraction and pre-processing was implemented in Python, and the KPIs were built in Qlik Sense.
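A minimal, hedged sketch of the kind of Python extraction/pre-processing step described above; the data, field names, and utilisation KPI are invented for the example (the real pipeline fed Qlik Sense dashboards).

```python
# Illustrative pre-processing step: read raw asset records from a CSV export
# and compute a utilisation KPI per asset. All names and figures are invented.
import csv
import io

RAW = """asset_id,hours_used,hours_available
MRI-01,120,168
CT-02,80,168
"""

def utilisation_kpis(raw_csv):
    """Return {asset_id: utilisation %} from a raw CSV export."""
    reader = csv.DictReader(io.StringIO(raw_csv))
    return {
        row["asset_id"]: round(
            100 * float(row["hours_used"]) / float(row["hours_available"]), 1
        )
        for row in reader
    }

print(utilisation_kpis(RAW))  # → {'MRI-01': 71.4, 'CT-02': 47.6}
```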
Data Engineer (AWS)
- May 2019 - March 2022 - 35 Months
- Worked as a data engineer on migrating the platform that runs and validates data science algorithms for big data analytics from Hortonworks to the cloud (AWS). This platform enables deployment, orchestration, execution, and monitoring of big data analytics so that data and insights are easily accessible to applications to drive business outcomes.
- The platform was coded to take full advantage of AWS services.
- Features built as part of the platform include metadata management, intuitive log management, monitoring and visualization, a cost-optimized data strategy, and workflow-failure management.
- Worked in an agile environment to quickly analyze, develop, and test potential use cases for the business.
- Experience writing and deploying AWS Lambda functions to write entries to the database
- Proficient in writing CloudFormation templates (CFTs) in YAML and JSON to provision AWS services such as Lambda, Step Functions, and EventBridge, following the Infrastructure-as-Code paradigm.
- Experience deploying Hadoop applications on EMR clusters
- Took the lead in creating Step Functions state machines, using different states to orchestrate workflows
- Responsible for refactoring the analytics scheduling strategy to use EventBridge, which offers high availability.
- Involved in designing and implementing the archival strategy, which transfers data to S3 once an analytic has executed.
- Undertook the role of Scrum Master to align the team with Agile methodology and best practices
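The Lambda-to-database pattern mentioned above can be sketched as a small handler. This is an illustrative example only: sqlite3 stands in for the real database, the event fields and table name are hypothetical, and a production handler would take its connection from configuration rather than creating one per call.

```python
# Sketch of a Lambda-style handler that records one workflow-run entry.
# sqlite3 stands in for the real database; all names are illustrative.
import json
import sqlite3

def lambda_handler(event, context=None, conn=None):
    """Insert a workflow-run record taken from the triggering event."""
    # In a real deployment the connection would be created once from
    # environment configuration; here we accept an injected one for testing.
    conn = conn or sqlite3.connect(":memory:")
    conn.execute(
        "CREATE TABLE IF NOT EXISTS workflow_runs (run_id TEXT, status TEXT)"
    )
    conn.execute(
        "INSERT INTO workflow_runs VALUES (?, ?)",
        (event["run_id"], event["status"]),
    )
    conn.commit()
    rows = conn.execute("SELECT COUNT(*) FROM workflow_runs").fetchone()[0]
    return {"statusCode": 200, "body": json.dumps({"rows": rows})}
```

In the actual platform such a function would be deployed via a CloudFormation template and invoked by Step Functions or an EventBridge rule.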
Data Engineering Specialist
- March 2017 - April 2019 - 26 Months
- Worked as a data engineering specialist building the data warehouse that supports building and validating the next-generation diagnostic system for aircraft engine health monitoring and service opportunities.
- Primarily involved in building processes to extract, transform, and load flight, alert, and asset data into the Greenplum warehouse.
- Created dashboards in Spotfire to monitor the analytics and measure each analytic's effectiveness, i.e., to verify that every analytic produces the necessary parameters.
- Performed query performance tuning and optimization to ensure there was no adverse impact on critical applications and that all jobs ran smoothly.
- Devised a smarter data-retrieval approach that cut retrieval time from 5 days to 5 hours after a server migration
- November 2013 - November 2016 - 37 Months
- Worked as a Programmer Analyst in the Travel and Hospitality domain, involved in Teradata/ETL development and enhancement.
- Responsible for gathering requirements, technical design, SQL coding, and creating test cases; also maintained existing ETL workflows and data-query components
- Developed ETL programs supporting data extraction, transformation, and loading into target tables using Informatica PowerCenter, and later built the warehouse on Teradata.
- Built reports in Tableau for the revenue generated by each restaurant.
B.E. in Computer Engineering, Karnataka Institute And Technology
- May 2011 - June 2013