About me
13+ years of experience in the analysis, design, development, testing, documentation, production deployment, and support of various ETL and automation projects.
Well-versed in multiple programming languages, including Unix/Linux shell scripting, Python, PySpark, C, C++, and Core Java.
Strong experience in SQL query building and query optimization.
Good experience in designing and performance-tuning data pipelines using Informatica 10 and IDMC.
Good experience with Hadoop ecosystem components such as Hive, HDFS, and Spark.
Hands-on experience debugging in Linux environments using logs, core dump analysis, KDB, and IDEs.
Good experience in multithreading, inter-process communication, message queues, and TCP/IP client/server socket programming.
Good experience working with data structures and algorithms, OOP concepts, and data encryption/decryption.
Worked as an onsite coordinator at the client site, leading a team of 5. Involved in business meetings with clients, task estimation, task allocation, and management of the offshore team.
Worked on product-specific frameworks designed by the client.
Education
-
2010 - 2012
Birla Institute of Technology & Science, Pilani, Goa Campus
Master's Degree
India
Master's in Software Systems
-
2004 - 2008
Birla Institute of Technology & Science, Pilani, Goa Campus
Bachelor's Degree
India
Bachelor's Degree in Electronics & Instrumentation Engineering
Work Experience
-
2024 - Present
National University of Singapore
Data Engineer
– Develop an MDM data quality tool in PySpark that generates data quality reports (see the minimal PySpark sketch below).
– Design and develop Informatica workflows/pipelines that supply data for senior management's Procurement and Contract Analysis Power BI dashboards.
– Developed Data Models using Erwin Data Modeler.
– Develop data pipelines using Informatica 10 and IDMC.
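A minimal sketch of the kind of PySpark data quality check described above, assuming the reports cover per-column completeness and duplicate keys; the table schema, column names, and sample rows are hypothetical placeholders, not the actual MDM data.

```python
# Minimal PySpark data-quality sketch (hypothetical schema and data).
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("dq-report-sketch").getOrCreate()

# Stand-in for an MDM source table; column names are illustrative only.
df = spark.createDataFrame(
    [(1, "a@x.com"), (2, None), (2, "b@x.com")],
    ["customer_id", "email"],
)

# Completeness: share of non-null values per column.
completeness = df.select(
    *[(F.count(F.col(c)) / F.count(F.lit(1))).alias(f"{c}_completeness")
      for c in df.columns]
)

# Uniqueness: key values that appear more than once.
duplicates = df.groupBy("customer_id").count().filter(F.col("count") > 1)

completeness.show()
duplicates.show()
```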
-
2022 - 2023
Great Eastern Life, Singapore
Data Engineer
– Transformed data from different sources into Hive using Hive queries, Python, PySpark, and Unix/Linux shell scripting.
– Designed and developed Informatica workflows/pipelines to generate reports for the data quality data mart.
– Developed Informatica workflows for data quality, profiling, and analysis.
– Worked with business analysts to gather requirements and prepare technical design documents.
– Responsible for production deployment and environment configurations.
– Responsible for monitoring production batch runs.
– Responsible for design, code implementation, testing, production deployment, and documentation.
-
2016 - 2020
Standard Chartered Bank, Singapore
Application Consultant
– Involved in the full project implementation cycle: gathering requirements with users, design, development, testing, training, documentation, and handover.
– Developed complex Hive queries to extract data from the data lake, transformed the data, and implemented stored procedures in Oracle to provide data to the UI.
– Created workflows using Dataiku DSS and Paxata to analyze data and generate reports.
– Developed Unix shell and PySpark scripts to automate data extraction from different source systems, report generation, data encryption/decryption, and housekeeping of data in Hive/HDFS.
– Experienced with DevOps/CI-CD practices and tools such as Bitbucket and Jenkins.
– Responsible for the development of Control-M job schedules that integrate the upstream and downstream systems.
– Led an offshore team of 5 members. Responsible for requirement analysis, effort estimation, and task allocation for the offshore team.
– Responsible for presenting production change requests to the change management team.
– Responsible for production deployment and environment configurations.
– Responsible for monitoring production batch runs.
-
2014 - 2015
Port of Singapore Authority, Singapore
Application Consultant
– Implemented new features and fixed issues in an application developed in Python.
– Responsible for design, code implementation, unit testing and documentation.
– Created UNIX shell scripts to verify the engine's performance improvements after new features were implemented.
-
2012 - 2013
L&T Infotech Ltd, India
Senior Software Developer
– Developed a tool in C/C++ to verify the functionality of all drivers (display, sensors, vibrator, keys, camera, etc.) and hardware of an Android smartphone.
– Responsible for requirement analysis, design, code implementation, and unit testing of the
tool.
– Implemented features and fixed issues in the phone software, from device drivers up to the application level.
– Involved in porting iTap (predictive text technology) features.
-
2008 - 2012
Cognizant Technology Solutions, India
Software Developer
– Developed (in C/C++) an in-built compiler, the CDL compiler (CDL stands for "Control Description Language"). Application engineers customize iVIU operation using a CDL program; the CDL compiler converts the program text into a binary format for the unit to execute.
– Developed the CDL engine module using C/C++ APIs. The CDL engine executes the instructions produced by the CDL compiler; an instruction can send an alarm, change an output, read an input, change an LED, etc.
– Developed the configuration manager, a module that provides a public interface for all other modules to update configuration data in the database using SQLite.
– Involved in the development of the Hardware Acceptance Test Program.
– Hands-on experience in multithreading and socket programming.
– Took part in code reviews during the course of development.
-
2007 - 2007
CA India Technologies Pvt. Ltd, India
Intern
– Developed a tool for dimensionality reduction and information visualisation, including the visualisation of groups of similar items.
– Used Self-Organizing Map (SOM) techniques to project high-dimensional data onto a two-dimensional map (a minimal sketch follows).
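A minimal sketch of an SOM-based 2-D projection as described in the last bullet, written from scratch with NumPy; the grid size, learning rate, neighbourhood radius, and randomly generated input data are illustrative assumptions, not the original tool's implementation.

```python
# Minimal self-organizing map (SOM) sketch: project 10-D data onto an 8x8 grid.
import numpy as np

rng = np.random.default_rng(0)
data = rng.normal(size=(200, 10))            # toy input: 200 samples, 10 dimensions
grid_h, grid_w = 8, 8                        # 2-D map of 8x8 nodes
weights = rng.normal(size=(grid_h, grid_w, data.shape[1]))

n_iter, lr0, sigma0 = 1000, 0.5, 3.0
for t in range(n_iter):
    x = data[rng.integers(len(data))]
    # Best-matching unit (BMU): node whose weight vector is closest to x.
    bmu = np.unravel_index(
        np.argmin(np.linalg.norm(weights - x, axis=2)), (grid_h, grid_w)
    )
    # Learning rate and neighbourhood radius decay over time.
    lr = lr0 * np.exp(-t / n_iter)
    sigma = sigma0 * np.exp(-t / n_iter)
    # Gaussian neighbourhood around the BMU pulls nearby nodes toward x.
    rows, cols = np.indices((grid_h, grid_w))
    influence = np.exp(-((rows - bmu[0]) ** 2 + (cols - bmu[1]) ** 2) / (2 * sigma**2))
    weights += lr * influence[..., None] * (x - weights)

# Each sample projects to the grid coordinates of its best-matching unit.
projection = np.array([
    np.unravel_index(
        np.argmin(np.linalg.norm(weights - x, axis=2)), (grid_h, grid_w)
    )
    for x in data
])
print(projection[:5])
```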