About me

Experienced Data Scientist/Engineer specializing in data engineering solutions, time series analysis, and
forecasting, seeking a challenging role. Dedicated to leveraging large-scale digital data to drive
forecasting initiatives, inform strategic decisions, and contribute to digital growth.

Education

  • December 2019 - December 2022
    JNU Jaipur

    Degree

    India

  • August 2015 - August 2017
    GPC Jodhpur

    Diploma

    India

Key Skills and Competencies

  • Programming: Python, R, Bash, JavaScript, Java, SQL, Node.js
  • Machine Learning: scikit-learn, BigQuery ML, Anomaly Detection, Time Series Forecasting
  • Big Data: PySpark, Apache Beam, ETL
  • Statistics: pandas, NumPy, dplyr, Statistical Modeling
  • Visualization: Power BI, Looker Studio, Seaborn, Matplotlib, Tableau, Plotly, ggplot2
  • Databases: MySQL, MongoDB
  • Cloud: Google Cloud Platform, Amazon Web Services
  • GCP: BigQuery, Dataflow, Composer, Vertex AI, Pub/Sub, Compute Engine, Cloud SQL, Firestore, GCS, Dataproc, Cloud Functions, Cloud Scheduler, Cloud Deploy, Cloud Run, Cloud Spanner
  • AWS: S3, Lambda, Redshift, Glue, EC2
  • DevOps: Jenkins, CI/CD, ELK, Grafana, Zabbix
  • Project Management: Agile, Scrum, Jira

Work Experience

  • May 2024 - Present
    Tredence Analytics, India

    Associate Manager

    Project 1 – BFL Group (24 May 2024 – present)
    Role – Associate Manager (Data Engineering)
    Description -> BFL Group is a UAE-based retail organization. Below are my day-to-day responsibilities.
    Technology -> Python, SQL, PySpark, BigQuery, Composer, Dataflow, Batch, scikit-learn, CI/CD, POCs,
    Looker Studio, Cloud Run, Cloud Build, Pub/Sub, Power BI
    • Working on data modeling, batch and real-time ETL pipelines, and CI/CD using Cloud Run and Cloud Build.
    • Created an automation framework to automate ETL pipelines.
    • Created factory code for source-to-BigQuery and BigQuery-to-BigQuery data ingestion.
    • Working on BigQuery ML and Spark ML regression models, presented in Power BI dashboards.
    • Collaborate with key stakeholders to analyze and forecast digital business trends using time series
    techniques.
    • Work closely with data engineering, analytics engineering, and business intelligence teams to ensure
    seamless data integration and analysis.
    • Translate business needs from various teams across web, streaming, and social into data-driven solutions.
    • Monitor and validate model performance continuously, ensuring accuracy and relevance in a dynamic
    digital landscape.
    • Present findings and insights to stakeholders, providing actionable recommendations based on thorough
    analysis.

  • July 2022 - May 2024
    Telus International, India

    Lead Data Engineer

    Project – Telus Communications
    Role – Lead Data Engineer
    Description -> Telus Communications is a Canadian telecommunications organization. Below are my
    day-to-day responsibilities.
    Technology -> Python, SQL, PySpark, BigQuery, Composer, Dataflow, Batch, scikit-learn, CI/CD, POCs,
    Looker Studio, Cloud Run, Cloud Build, Pub/Sub, Kafka, Glue
    • Working on data modeling, batch and real-time ETL pipelines, and CI/CD using Cloud Run and Cloud Build.
    • Worked on AWS Glue, creating crawlers and ETL jobs.
    • Created an automation framework to automate ETL and SRE pipelines.
    • Created factory code for GCS-to-BigQuery, BigQuery-to-BigQuery, SRE framework, and BigQuery-to-on-prem data ingestion.
    • Working on BigQuery ML and Spark ML regression models, presented in Tableau dashboards.
    • Creating statistical models using pandas and scikit-learn, and visualizing the results in Looker Studio.
    • Collaborate with key stakeholders to analyze and forecast digital business trends using time series
    techniques.
    • Work closely with data engineering, analytics engineering, and business intelligence teams to ensure
    seamless data integration and analysis.
    • Translate business needs from various teams across web, streaming, and social into data-driven solutions.
    • Monitor and validate model performance continuously, ensuring accuracy and relevance in a dynamic
    digital landscape.
    • Present findings and insights to stakeholders, providing actionable recommendations based on thorough
    analysis.

  • October 2021 - July 2022
    Capgemini, India

    Software Engineer

    Project – Loreal Space, Hong Kong
    Role – GCP Data Engineer
    Description -> Loreal Space is a Hong Kong-based beauty product organization. Below are my day-to-day
    responsibilities.
    Technology -> Python, SQL, PySpark, BigQuery, Composer, Dataflow, Batch, BigQuery ML, CI/CD, POCs,
    Looker Studio, Cloud Run, Cloud Build, Cloud Spanner, Cloud SQL, Pub/Sub, Kafka, Power BI
    • Worked on data modeling, batch and real-time ETL pipelines, and CI/CD using Cloud Run and Cloud Build.
    • Worked on BigQuery ML and Spark ML regression models, presented in Python-based dashboards.
    • Created Python/Flask APIs with CI/CD using Cloud Run and Cloud Build.
    • Created machine learning models using regression and classification algorithms.
    • Collaborated with the client analytics team (Loreal Space, Hong Kong) to build ETL solutions.
    • Set up real-time monitoring and alerting for the solutions I created.

  • June 2020 - October 2021
    Bharti Airtel, India

    Software Engineer

    Project – Airtel Thanks for Business
    Role – GCP DevOps & Data Engineer
    Description -> Bharti Airtel is an Indian telecommunications organization, and Airtel Thanks for
    Business is an e-commerce-like platform where Airtel customers can manage and monitor all of their
    profiles. Below are my day-to-day responsibilities.
    Technology -> Python, SQL, BigQuery, Looker Studio, Composer, Cloud Run, Cloud Build, PySpark, Jenkins,
    Zabbix, Grafana, ELK, NLTK, Seaborn, Plotly, Pub/Sub, Kafka, MySQL, MongoDB
    • Created automation tooling and a Telegram chatbot using NLTK
    • Dashboarding and monitoring using Zabbix, Grafana, and ELK
    • CI/CD using Jenkins
    • Creating real-time ETL pipelines
    • Deployment on AWS
    • Data warehouse – Redshift; data lake – S3
    • Data analysis using NumPy and pandas, visualized with Seaborn and Plotly
    • Automation scripts for zero customer issues and chatbot functionality
    • Visualizations and dashboards for anomaly detection and for monitoring server and application metrics
    • Alerting, crons, DB backups, log purging, RCA, and CAPA
    • Implemented an anomaly detection algorithm to capture application issues

  • August 2017 - May 2020
    Jindal Motors Pvt Ltd, India

    Software Engineer

    Project 1 – Intruder Detection
    Role – Data Scientist
    Description -> We installed on-premises CCTV cameras, and I built the software that detected any
    intruders.
    Technology -> Python, SQL, Kafka, scikit-image, OpenCV, Cloud Spanner
    Project 2 – Trivago Analysis
    Role – Data Analyst
    Description -> We collected data from the Trivago website, analyzed it, and presented the results on
    maps and corresponding dashboards.
    Technology -> Python, SQL, Kafka, Power BI, Flask, Django, Plotly
    Project 3 – Discord Bots
    Role – Chatbot Developer
    Description -> I created gaming bots for Discord and integrated GitHub to receive updates such as
    CI/CD failures/successes and data pushes/pulls.
    Technology -> Python, Discord, NLTK, GitHub, MongoDB
    Day-to-Day Activities ->
    • Built a stock market website showing current trends, their analysis, and future predictions
    • Collaborate with key stakeholders to analyze and forecast digital business trends using advanced time
    series techniques.
    • Work closely with data engineering, analytics engineering, and business intelligence teams to ensure
    seamless data integration and analysis.
    • Translate business needs from various teams across web, streaming, and social into data-driven solutions.
    • Develop and refine predictive models, enhancing decision-making processes within the organization.
    • Monitor and validate model performance continuously, ensuring accuracy and relevance in a dynamic
    digital landscape.
    • Present findings and insights to stakeholders, providing actionable recommendations based on thorough
    analysis.

Languages

English
Professional
Hindi
Professional