Experience is a hard teacher because she gives the test first, the lesson afterward.
Data Engineer at Schlumberger, Pune
July 2018 - Present
- Author and maintainer of the Source to Hub project, which loads data directly from source systems into Google BigQuery; the project eliminates intermediate loading in native Hadoop clusters for greater efficiency, reliability, and speed
- Implemented and manage an end-to-end CI/CD pipeline with custom validations for Informatica migrations, cutting migration time from 9 hours to 1.5 hours with no manual intervention
- Enhance, audit, and maintain a custom data ingestion framework that ingests around 1 TB of data per day for over 70 business units
- Work with the L3 developer team to ensure agreed Scrum PBIs for data ingestion are delivered on time
- Plan and execute QA and production release cycle activities
Full Stack Developer Intern at Truso, Pune
June 2018 - July 2018
- Created RESTful APIs
- Gained hands-on experience with Angular 5/6
- Handled Django backend development
Data Engineering Intern at Propeluss, Pune
October 2017 - January 2018
- Wrote automation scripts to scrape data from multiple websites
- Applied Natural Language Processing to articles scraped from the internet, using entity extraction algorithms to identify entities and Machine Learning to classify the articles
- Applied KNN with LSA to extract relevant tags for startups based on their work
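A minimal sketch of what a KNN-with-LSA tagging pipeline like the one above can look like, using scikit-learn. The data, tags, and pipeline parameters here are invented for illustration and are assumptions, not the actual Propeluss implementation: descriptions are vectorized with TF-IDF, reduced via LSA (truncated SVD), and tags are propagated from the nearest labelled neighbours.

```python
# Hypothetical sketch of KNN + LSA tag extraction (not the original code):
# TF-IDF vectors -> LSA (TruncatedSVD) -> nearest neighbours -> tag vote.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD
from sklearn.neighbors import NearestNeighbors

# Toy startup descriptions and tags, invented for illustration
descriptions = [
    "mobile payments wallet for online merchants",
    "machine learning platform for predictive analytics",
    "online payments gateway and fraud detection",
    "deep learning models for image recognition",
]
tags = [["fintech"], ["ai"], ["fintech"], ["ai"]]

# TF-IDF term-document matrix, reduced to a low-rank LSA space
vectorizer = TfidfVectorizer()
X = vectorizer.fit_transform(descriptions)
lsa = TruncatedSVD(n_components=2, random_state=0)
X_lsa = lsa.fit_transform(X)

# Fit KNN in the LSA space; tag a new startup from its neighbours' tags
knn = NearestNeighbors(n_neighbors=2, metric="cosine").fit(X_lsa)
new_vec = lsa.transform(vectorizer.transform(["payments api for e-commerce sites"]))
_, idx = knn.kneighbors(new_vec)
suggested = {t for i in idx[0] for t in tags[i]}
print(suggested)
```

The LSA step matters because raw TF-IDF vectors are sparse and treat related terms (e.g. "payments" and "wallet") as unrelated dimensions; the low-rank projection groups co-occurring terms so cosine neighbours reflect topical similarity rather than exact word overlap.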
Technical Content Writer at GeeksForGeeks
July 2017 - September 2017
- Published 4 articles on topics including Data Structures and Algorithms, and Python
Web Developer Intern at Softtestlab Technologies, Pune
June 2017 - July 2017
- Built an internal project for the company using PHP and Laravel for testing purposes
- Worked on a live project to generate closure reports using PHP and Excel