- Designing and developing a dynamic ingestion pipeline in which configuration details received through FastAPI are used to automatically provision and execute new data pipelines in Databricks.
- Storing and managing processed metadata in the Databricks Catalog and ensuring seamless integration across Azure cloud infrastructure.
- Additionally, implemented workflow automation, monitoring, and notification mechanisms to support pipeline reliability, observability, and operational efficiency.
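The dynamic-provisioning flow above could be sketched roughly as follows; the config fields, notebook path, and cluster settings are illustrative assumptions, not the actual implementation. In the real service a FastAPI endpoint would POST this payload to the Databricks Jobs API (`/api/2.1/jobs/create`).

```python
# Sketch: translate an incoming pipeline config into a Databricks
# Jobs API 2.1 job-creation payload. Field names in `config`, the
# notebook path, and the node type are hypothetical placeholders.

def build_job_payload(config: dict) -> dict:
    """Build a Databricks job definition from a pipeline config."""
    return {
        "name": f"ingest-{config['source_name']}",
        "tasks": [
            {
                "task_key": "ingest",
                "notebook_task": {
                    "notebook_path": "/pipelines/ingest",  # assumed path
                    "base_parameters": {
                        "source_type": config["source_type"],
                        "target_table": config["target_table"],
                    },
                },
                "new_cluster": {
                    "spark_version": "14.3.x-scala2.12",
                    "node_type_id": "Standard_DS3_v2",  # Azure VM type
                    "num_workers": config.get("workers", 2),
                },
            }
        ],
    }

payload = build_job_payload(
    {"source_name": "orders", "source_type": "jdbc", "target_table": "bronze.orders"}
)
```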
Senior Software Engineer
Sep 2023 - Sep 2025
Vibrent Health
- Led and scaled data engineering initiatives, building robust data pipelines using DBT, Dagster, Meltano, Snowflake, Python, and AWS to support analytics and business intelligence needs.
- Owned data orchestration and architecture, managing tool upgrades, schema evolution, and data modeling decisions, while contributing to performance and reliability improvements across the stack.
- Optimized cost and performance, reducing Snowflake usage costs via DuckDB-based in-memory computation and cutting query times by 50% through DBT model and table design optimization.
- Contributed to the open-source community by enhancing a PipelineWise extractor used in Meltano, improving ingestion reliability and integration capabilities.
Senior Engineer Cloud Services and Software
Mar 2022 - Sep 2023
Larsen & Toubro Infotech Limited
- As a Senior Data Engineer, designed, implemented, and optimized data pipelines and led data migration efforts.
- Worked as a module lead, managing a team of 4-6 data engineers and analysts.
- Utilized AWS services including Athena, Glue, EMR, SageMaker, CloudFormation, S3, Lambda, DynamoDB, IAM, AWS SAM, SNS, and API Gateway.
- Developed code and performed peer reviews using Python, PySpark, Pandas, Docker, and SQL.
- Wrote unit test cases using pytest.
- Created product reports and collaborated with cross-functional teams.
- Created documentation for data pipeline architecture and Standard Operating Procedures.
- Implemented data transformation processes, improving data quality.
- Optimized and tuned database performance to achieve faster query response times and reduced latency.
- Created an SDK for the user-event logger (IIFE, TypeScript, Lambda, S3, CDN, npm package).
- Used MongoDB for user-event storage.
- Used Kafka (AWS MSK) for user-event records.
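The user-event record that the logger SDK ships toward Kafka (AWS MSK) and MongoDB can be sketched as a JSON envelope; the field names here are illustrative assumptions, not the SDK's actual schema.

```python
# Sketch: serialize a user event into a JSON envelope for the Kafka
# producer. The envelope fields are hypothetical, not the real schema.
import json
import uuid
from datetime import datetime, timezone

def make_event(user_id: str, event_type: str, properties: dict) -> str:
    """Return a JSON-encoded user-event record."""
    envelope = {
        "event_id": str(uuid.uuid4()),   # unique per event
        "user_id": user_id,
        "event_type": event_type,
        "ts": datetime.now(timezone.utc).isoformat(),
        "properties": properties,
    }
    return json.dumps(envelope)

record = make_event("u-123", "page_view", {"path": "/home"})
```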
Software Engineer
Jun 2020 - Mar 2022
Clearmind Consultancy
- As a Data Engineer, managed the transaction ledger and data transformations per business needs; wrote stock-market (ticker) scripts and analyses; and designed cloud architecture for data pipelines, ETL, web apps, servers, and data migration.
- Encrypted and decrypted the transaction ledger (Python, DynamoDB).
- Used Kafka for share-price updates.
- Automated invoice and billing generation.
- Integrated Razorpay for payments.
- Designed algorithms for stock-market price analysis based on ticker values using Pine Script.
- Created a serverless portfolio handler using CCXT (deployed with AWS SAM).
- Created an SDK for Aadhaar offline eKYC (Lambda, S3, API Gateway, JavaScript, AWS SAM).
- Implemented React server-side rendering (SSR).
- Designed, developed, and productionized Android and iOS apps and a web app.
- Developed the server using Node.js, MongoDB, REST APIs, MySQL, and Firebase.
- Designed the database for the cycle app and created GraphQL API-based access.
- Deployed web app and database pipeline pods on AWS.
- Created an nginx reverse proxy for the cycle server and cycle web app.
- Created an API for scraping NSE/BSE data and feeding it into ticker analysis.
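The nginx reverse proxy fronting the cycle server and web app could look roughly like this; the hostnames, ports, and upstream layout are placeholders, not the deployed configuration.

```nginx
# Sketch: route /api to the Node.js cycle server and everything else
# to the web app. Server name and ports are placeholder assumptions.
server {
    listen 80;
    server_name cycle.example.com;

    location /api/ {
        proxy_pass http://127.0.0.1:3000;   # cycle server (Node.js)
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
    }

    location / {
        proxy_pass http://127.0.0.1:8080;   # cycle web app
        proxy_set_header Host $host;
    }
}
```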
Associate Engineer
Jun 2018 - Aug 2019
Solabot Technologies
- As an Engineer, wrote automation scripts for sensor data, analyzed sensor data based on bot activity, and created the layout and design of the bots.
- Tracked bots' real-time location and working status.
- Built APIs using AWS Lambda.
- Generated crash-analytics reports and used Python for the report analysis.
- Hosted a static web page in S3.
- Used Python and Embedded C for Arduino programming.
- Created the layout design for a solar panel cleaning bot using SolidWorks and AutoCAD.
- Wrote automation scripts to reduce manual and redundant tasks.
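A sensor-data automation script of the kind described above can be sketched as a simple check over bot telemetry; the reading format, field names, and stall threshold are illustrative assumptions.

```python
# Sketch: flag cleaning bots whose brush RPM indicates a stall.
# The reading schema and threshold are hypothetical placeholders.

def flag_stalled(readings: list[dict], min_rpm: float = 5.0) -> list[str]:
    """Return IDs of bots whose brush RPM fell below the stall threshold."""
    return [r["bot_id"] for r in readings if r["brush_rpm"] < min_rpm]

readings = [
    {"bot_id": "bot-1", "brush_rpm": 42.0},
    {"bot_id": "bot-2", "brush_rpm": 0.0},   # stalled
    {"bot_id": "bot-3", "brush_rpm": 3.5},   # below threshold
]
stalled = flag_stalled(readings)
```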
Education
Post Graduate Diploma in Internet of Things
2019 - 2020
Centre for Development of Advanced Computing, Pune
Bachelor of Technology, Mechatronics
2014 - 2018
Kurukshetra University
Paper Published
Quantum Computing for Data Integration and Optimization