$13.0 per Hour
A good team worker, pragmatic and optimistic.
Self-motivated and confident.
Adapts well to changing circumstances.
Possesses strong problem-solving, troubleshooting and analytical skills, clearly demonstrated in past projects.
Intend to build a career with a leading corporation in a hi-tech global environment, alongside committed and dedicated people, which will help me explore and realize my full potential.
Service Commitment Award
(S)Miles Award
Learning Achievement Award
Fresco Play Miles Award
Feather in My Cap, awarded twice by WIPRO Technologies: once for effective effort in driving the Testing Project to successful completion, and once for good aptitude in the field of Development.
Tech Stack Expertise
Spring Boot, Spring MVC, Java 8 - 8 Years
Oracle 10g - 0 Years
Microsoft SQL Server
Manual Testing - 2 Years
Full Stack Developer
- January 2011 - January 2023 - 12 Years
WESTERN UNION – 1P INQ
- January 2018 - January 2019 - 13 Months
Western Union had legacy code in TAL language that was required to be migrated to 1P-Platform. 1P-Platform was based on AWS-Java technology.
The purposes of the 1P-Platform were as follows:
1) Shift technology spends to support revenue generation.
2) Mainframe technology would be stopped mid-year.
3) Less legacy maintenance.
4) Elastic capacity
4.a : Respond to transactional peaks
4.b : Optimize valleys
INQ is the brain of WESTERN UNION. It acts as an orchestration layer that handles requests from POS channels, performs initial business validations, and routes to the appropriate internal and external interfaces to fulfil the request. It returns success/failure to the POS channels.
The primary aim of this assignment was to re-architect/design the legacy INQ orchestration layer sitting in the HPNS system so that it could be deployed to the AWS Cloud, leveraging Java/Spring Boot with Camel as the orchestration engine. As part of the project scope, a list of in-scope INQ transactions had to be migrated to the Cloud.
I was involved both in development and in helping other members of the team. Before starting the development activity, I did a PoC on Apache Camel routing and the externalization of routes. I then sat with the Reverse Engineering team to understand the TAL functionality of the Procs and Sub-Procs, going through the TAL code and translating it line by line into Java.
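The externalization PoC amounted to keeping the route definitions outside the compiled code so they can change without a redeploy. Camel's actual RouteBuilder DSL is not reproduced here; the stdlib-only sketch below only illustrates the idea, and every name in it (route name, step names) is hypothetical.

```java
import java.util.*;
import java.util.function.UnaryOperator;

// Minimal sketch of externalized routing: route steps live in config text,
// not code, so they can be edited without redeploying. Step functions are
// stand-ins for real processors; all names are hypothetical.
public class RouteSketch {
    // step name -> processing function
    static final Map<String, UnaryOperator<String>> STEPS = Map.of(
        "validate", msg -> msg + "|validated",
        "enrich",   msg -> msg + "|enriched",
        "reply",    msg -> msg + "|replied");

    // parse a route definition like "inq-send=validate,enrich,reply"
    static List<String> parseRoute(String def, String routeName) {
        for (String line : def.split("\n")) {
            String[] kv = line.split("=", 2);
            if (kv[0].trim().equals(routeName))
                return Arrays.asList(kv[1].trim().split("\\s*,\\s*"));
        }
        throw new IllegalArgumentException("no route: " + routeName);
    }

    // run a message through every step of the named route, in order
    static String run(String def, String routeName, String msg) {
        for (String step : parseRoute(def, routeName))
            msg = STEPS.get(step).apply(msg);
        return msg;
    }

    public static void main(String[] args) {
        String config = "inq-send=validate,enrich,reply";
        System.out.println(run(config, "inq-send", "MSG"));
        // -> MSG|validated|enriched|replied
    }
}
```

In Camel the same separation is achieved by loading route definitions from external files rather than hard-coding them in a RouteBuilder.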
There was a daily morning meeting for status updates on the development activity. Besides, since INQ interacted with external applications, clarifications were sought from the separate teams owning those applications regarding the JSON request/response structures.
After committing the code to GitLab, it ran through stages in Jenkins, including SonarQube scanning, where code coverage and major blockers were taken care of. Once the code passed all the Jenkins stages, it was deployed to AWS Fargate.
- February 2012 - January 2013 - 12 Months
Citi-group has four geographical Regional Consumer Banking (RCB) businesses that provide traditional banking services to retail customers through retail banking, commercial banking, Citi-branded cards and Citi retail services.
Citi Branded Cards provides payment and credit solutions to consumers and small businesses around the world. This project operates from Texas.
The business model operates on a microservices architecture maintained by the developers, after which profiling is done. Two types of microservices are maintained: Domain and CRUD. Domain relates to the downstream services and CRUD refers to the upstream services. Profiling is done on the newly incorporated features of both kinds of microservice applications.
RESPONSIBILITIES: I was involved in profiling the microservices. To carry out this activity, I followed up with the developers to get working SOAP requests. The target URI, residing in a particular environment, was then hit with the SOAP request, and snapshots were captured in the YourKit tool.
Three snapshots were taken per API, and these were used to generate a report. The report showed the time taken by each function and included a Recommendation tab.
The execution time of each function was investigated through code analysis, done with the aim of saving time. Observations were incorporated into the Recommendation tab, and the report was shared with the developer.
After the developer made the changes as per the recommendations, the changes were reviewed and a go-ahead was given for LAB testing.
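The per-function timings in those reports came from YourKit snapshots, not from code changes. Purely as a stdlib illustration of the kind of measurement the profiler reports (the method and names below are hypothetical, not part of the project):

```java
import java.util.function.Supplier;

// Stdlib-only sketch of per-function timing, the kind of number a profiler
// such as YourKit reports per snapshot; the real workflow used snapshots,
// not inline timers. All names here are hypothetical.
public class TimingSketch {
    // run the task, print elapsed milliseconds, return the task's result
    static <T> T timed(String name, Supplier<T> task) {
        long start = System.nanoTime();
        T result = task.get();
        long elapsedMs = (System.nanoTime() - start) / 1_000_000;
        System.out.println(name + " took " + elapsedMs + " ms");
        return result;
    }

    public static void main(String[] args) {
        int sum = timed("sum-loop", () -> {
            int s = 0;
            for (int i = 0; i < 1_000_000; i++) s += i % 7;
            return s;
        });
        System.out.println("result=" + sum);
    }
}
```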
MRD (Machine Readable Design)
- March 2016 - January 2017 - 11 Months
Ally Financial is a bank holding company organized in Delaware and headquartered in Detroit, Michigan. The company provides financial services including car finance, online banking via a direct bank, corporate lending, vehicle insurance, mortgages, credit cards, and an electronic trading platform to trade financial assets.
Ally is the largest car finance company in the U.S. by volume and serves over 6 million customers. The company was known as GMAC, an acronym for General Motors Acceptance Corporation, until 2010.
RESPONSIBILITIES: I was involved in creating the CRUD operations in the MRD project. An entity file in a remote location had to be picked up, and based on that entity file the project created the CRUD layers automatically. To achieve this, two projects were created: mrd-crud and mrd-microservice. The entity file itself was generated from a YAML file by a separate project developed by another developer.
mrd-crud served as a utility jar, developed and placed in the Nexus repository.
mrd-microservice, on the other hand, acted as an archetype from which any number of microservices could be generated. These microservices had a "pom.xml" file in the root directory that included the dependency on "mrd-crud".
The microservices generated from the mrd-microservice archetype were operational: they could be booted by the embedded Tomcat server in Spring Boot, and database tables could be created on the fly in the in-memory "h2" database.
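The generated CRUD layer in the real project was Spring-based and backed by H2; as a rough, self-contained stand-in for what such a layer exposes, here is a minimal in-memory repository (all names hypothetical, no Spring or H2 involved):

```java
import java.util.*;
import java.util.concurrent.ConcurrentHashMap;

// Minimal in-memory stand-in for a generated CRUD layer. The actual
// mrd-crud jar sat in front of the in-memory H2 database; this sketch
// only shows the shape of the create/read/update/delete operations.
public class CrudSketch<T> {
    private final Map<Long, T> store = new ConcurrentHashMap<>();
    private long nextId = 1;

    public synchronized long create(T entity) {
        long id = nextId++;
        store.put(id, entity);
        return id;
    }
    public Optional<T> read(long id) { return Optional.ofNullable(store.get(id)); }
    public boolean update(long id, T entity) { return store.replace(id, entity) != null; }
    public boolean delete(long id) { return store.remove(id) != null; }

    public static void main(String[] args) {
        CrudSketch<String> repo = new CrudSketch<>();
        long id = repo.create("first");
        repo.update(id, "second");
        System.out.println(repo.read(id).orElse("?")); // -> second
    }
}
```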
KYC Remediation Project
- January 2012 - January 2013 - 13 Months
The KYC tool was a standalone installation run concurrently with the current GWM onboarding systems. It was implemented in three separate instances: one in New York for PB US/LATAM and JPMS, a second in Geneva for PB Geneva, and a third in Geneva for PB EMEA/APAC. Each instance was implemented as a standalone system with no integration with current JPMC systems (no real-time) aside from single sign-on (SSO) for authentication purposes.
KYC information was loaded using a non-GUI based functionality that mapped data fields directly from the existing (legacy) KYC systems to the KYC tool backend. This process might need to be run more than once during the remediation.
The KYC tool's reporting capabilities were covered by four reports: a region-wide KYC remediation progress report, a completed KYC remediation record report, a KYC remediation record overlap report in Excel format, and a report specifically with information for the GS&I group.
The KYC tool accepted a batch feed of existing client data. Similarly, it produced a report/file to facilitate manual data transfer back to the existing KYC systems (this may only have been required for JPMS/AMS and Geneva/ICR).
The KYC remediation process was completed in the KYC tool using a workflow process. Each step of the workflow belonged to a different role in the KYC tool. The KYC tool ensured that two roles do not simultaneously edit the same KYC record. The movement of the KYC record within each step of the workflow was determined on the basis of completion of certain conditions.
As a Developer, I implemented a RESTful web service for the first time in the project. For security, in order to avoid Cross-Site Scripting (XSS), I ensured that the query-string variables received by the Controllers were validated against a particular pattern through ESAPI (OWASP Enterprise Security API). Secondly, the application listed the HTTP verbs it could handle but failed to block verbs that were not listed; so the HTTP verbs under the security constraints in web.xml were removed, and a separate Java class was coded to take care of the authorization policy by making the security constraint applicable to all HTTP verbs.
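The pattern validation described here boils down to whitelisting: accept a query-string value only if it matches a known-safe pattern. The production code used ESAPI's validator; the stdlib sketch below shows the same idea with a plain regex, and the pattern itself is hypothetical.

```java
import java.util.regex.Pattern;

// Whitelist validation sketch: accept a query-string value only if it
// matches a known-safe pattern, rejecting anything that could carry a
// script payload. Production code used ESAPI; this pattern is hypothetical.
public class InputWhitelist {
    private static final Pattern SAFE_ID =
        Pattern.compile("^[A-Za-z0-9_-]{1,64}$");

    static String requireSafe(String value) {
        if (value == null || !SAFE_ID.matcher(value).matches())
            throw new IllegalArgumentException("rejected input");
        return value;
    }

    public static void main(String[] args) {
        System.out.println(requireSafe("record_42"));   // accepted as-is
        try {
            requireSafe("<script>alert(1)</script>");   // rejected
        } catch (IllegalArgumentException e) {
            System.out.println("blocked XSS attempt");
        }
    }
}
```

Whitelisting (define what is allowed) is preferred over blacklisting (enumerate what is forbidden) for XSS defence, which is also the approach ESAPI's input validators take.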
- April 2013 - January 2014 - 10 Months
When a company first comes on board, its facility (location) details are registered, after which a company profile is created. Once the profile exists, Quotation begins, which depends on the kind of certification the company selects: half-yearly or annual. In the Quotation process, a Quote is first created, secondly built, thirdly approved, fourthly accepted, and finally an Audit is generated, which in turn is associated with Jobs. One Audit can have more than one Job. Every Job is in one of two states: Schedule Pending or Schedule Confirmed. A Job is assigned to a Lead Auditor, chosen based on the kind of certification he is assigned to, after which a time span is allotted to the Auditor. Once the Auditor approves the Job, a Certificate is generated for the company with an Effective Certification Date and an Expiry Date.
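The quote lifecycle above is a strictly ordered workflow, which can be sketched as a small state machine. The state names follow the description; the enforcement code itself is hypothetical, not from the project.

```java
import java.util.*;

// Sketch of the quote workflow as a linear state machine: each state has
// exactly one allowed successor. State names follow the description above;
// the code is an illustrative sketch, not the project's implementation.
public class QuoteWorkflow {
    enum State { CREATED, BUILT, APPROVED, ACCEPTED, AUDIT_GENERATED }

    private static final Map<State, State> NEXT = Map.of(
        State.CREATED,  State.BUILT,
        State.BUILT,    State.APPROVED,
        State.APPROVED, State.ACCEPTED,
        State.ACCEPTED, State.AUDIT_GENERATED);

    static State advance(State current) {
        State next = NEXT.get(current);
        if (next == null)
            throw new IllegalStateException("workflow already complete");
        return next;
    }

    public static void main(String[] args) {
        State s = State.CREATED;
        while (NEXT.containsKey(s)) s = advance(s);   // walk the whole workflow
        System.out.println(s); // -> AUDIT_GENERATED
    }
}
```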
As a Developer, I was responsible for making enhancements, which used to come as Requirements in the Project. Sometimes, I also had to fix issues in the application. The application had three layers – User Interface Layer, Service Layer and DAO Layer. GWT was used for the front-end of the application which made asynchronous AJAX calls to the Service Layer through Service Interface and Service Layer in turn made calls to DAO Layer through DAO interface. There were application-context files for service and DAO layers where respective beans were registered.
I developed certain features of the application with the given functionality and tested thoroughly before releasing for the final delivery.
XML Parser Project
- January 2017 - January 2018 - 13 Months
This Project had 3 applications – Orion, LexConnect and OnDemand.
The Orion system sent manufacturing test data that was to be stored in the Teradata database. Test data was sent in the form of a SOAP request over HTTPS. The Orion application parsed and validated the SOAP envelope, then parsed and validated the SOAP body, and finally loaded the content into the Teradata database as per the mapping rules.
LexConnect reported supplies-usage information about Lexmark products. Only if the customer agreed to participate in the data-collection program was the XML post from printers/computers collected. The application supported HTTP POST. The XML file was parsed, validated, and finally loaded into the Teradata database as per the mapping rules.
OnDemand was developed to provide better customer service: the customer triggered the sending of data for support and diagnostic purposes. The application supported HTTP POST, through which the XML file was received, parsed, and validated, and the content was loaded into the Teradata database as per the mapping rules.
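All three applications shared the same parse-and-validate step before the database load. A minimal JDK-only sketch of that step, parsing posted XML and checking one required element before handing the value onward (the element and method names are hypothetical; the real mapping rules and Teradata load are not shown):

```java
import javax.xml.parsers.DocumentBuilderFactory;
import java.io.ByteArrayInputStream;
import java.nio.charset.StandardCharsets;
import org.w3c.dom.Document;
import org.w3c.dom.NodeList;

// Sketch of the shared parse-and-validate step: parse the posted XML,
// reject it if a required element is missing, and extract the value that
// would go to the database-load step. Element names are hypothetical.
public class XmlIngestSketch {
    static String extractSerial(String xml) {
        try {
            Document doc = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder()
                .parse(new ByteArrayInputStream(xml.getBytes(StandardCharsets.UTF_8)));
            NodeList nodes = doc.getElementsByTagName("serialNumber");
            if (nodes.getLength() == 0)
                throw new IllegalArgumentException("missing <serialNumber>");
            return nodes.item(0).getTextContent().trim();
        } catch (IllegalArgumentException e) {
            throw e;
        } catch (Exception e) {   // malformed XML, parser configuration, etc.
            throw new IllegalArgumentException("invalid XML: " + e.getMessage(), e);
        }
    }

    public static void main(String[] args) {
        String xml = "<device><serialNumber> LXK-001 </serialNumber></device>";
        System.out.println(extractSerial(xml)); // -> LXK-001
    }
}
```

Rejecting malformed XML at this step is exactly the failure mode described below, where bad files had to be reported back to the customer.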
As a Developer, I was responsible for setting up the code on the local system and had to understand the application's code flow. Initially, the application was unstable, and fixes were needed for it to execute properly in the environment where the applications were deployed.
Sometimes malformed XML files were generated which the applications were unable to parse, so I had to communicate with the customer about it. In addition, I sometimes had to explain the working of the application to them from a technical point of view.
- January 2018 - January 2019 - 13 Months
The Marketing CRM Upgrade Project under RCCL was an upgrade project in which Siebel CRM and OBIEE (Oracle Business Intelligence Enterprise Edition) were upgraded: Siebel CRM from version 7 to version 8, and OBIEE from version 10g to version 11g. Siebel and OBIEE were integrated to serve the business need of performing the requisite activities on the database.
As a Software Test Engineer, I had to understand the functionalities of the Siebel CRM Marketing and OBIEE applications. The SIT test cases were then drafted and communicated to the client-side QA team for approval. After receiving approval, I uploaded those test cases into HP-QC and executed them. While executing the test cases, I located bugs and logged them as defects in HP-QC accordingly. Once the developer fixed them and gave clarifications, the fixes were tested and closed with proper comments. In addition, on-site calls and meetings also had to be attended.
- January 2019 - January 2020 - 13 Months
The Lines of Business in this project were as follows:
> DFS (Direct Financial Services)
> Leads and Campaign Management
I was part of the DFS team. I performed Sanity Testing, Regression Testing, and Ad-hoc Testing as per the project needs. The project ran in Iterations, and every Iteration had 3 cycles. Group discussions, brainstorming sessions, and onsite clarifications were regular practice whenever a fresh Iteration came up.
As a Software Test Engineer, my responsibilities included studying the Microsoft Dynamics CRM application, designing test scenarios and test cases, executing them, finding bugs, and reporting them. I also had to attend sessions whenever new Iterations came up, log queries in the Clarification Log, and send it on-site for clarification. On-site calls and meetings required attention, and putting in extra hours at work was also important at times. I got an opportunity to explore Badboy, the automation tool selected for the project. Exploring different applications while executing test scenarios and test cases interested me a lot.
Besides, searching for underlying architecture and other
B.Tech, Gujarat University
- June 2006 - June 2008