You can find more about me by scrolling down or by clicking on the menu bar.
You can connect with me through some of the links listed below.
These are the top skills I have gained through research, internships, and classes.
The following are some projects I have worked on through my internships and classes.
In this project, I was given a large real-world dataset and asked to classify emails as spam or ham. I performed feature engineering on the text data, used scikit-learn for data processing, model fitting, and validation, and achieved 91% accuracy on the test set. I also generated and analyzed precision-recall curves to assess the model's performance and guide improvements.
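As a rough illustration, hand-crafted features for a spam classifier might look like the sketch below. The feature names and the keyword list are invented for illustration; the actual project used scikit-learn pipelines for vectorization, fitting, and validation.

```python
import re

# Hypothetical keyword list; a real classifier would learn weights from data.
SPAM_WORDS = {"free", "winner", "offer", "urgent"}

def email_features(text: str) -> dict:
    """Turn a raw email body into a small numeric feature vector."""
    words = re.findall(r"[a-z']+", text.lower())
    return {
        "num_words": len(words),
        "num_exclamations": text.count("!"),
        "spam_word_hits": sum(w in SPAM_WORDS for w in words),
        "has_reply_prefix": text.lstrip().lower().startswith("re:"),
    }

feats = email_features("URGENT!! You are a winner, claim your FREE offer!")
```

Features like these would then be fed into a scikit-learn model, with precision-recall curves used to tune the decision threshold.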
In this project, I conducted an in-depth investigation of the Cook County property tax system, analyzing property valuation practices across a dataset of over 500,000 records. I used Pandas, feature engineering, machine learning techniques such as one-hot encoding, and linear algebra to identify disparities, highlighting reduced taxation for wealthier homeowners and increased taxation for low-income households.
This project was the capstone of CS61C, where I used Logisim-evolution to implement a 32-bit two-cycle processor based on RISC-V. During the CPU design process, I built an Arithmetic Logic Unit (ALU) to handle calculations and logical operations and developed register files to store and manage data efficiently. I also designed the datapath, ensuring smooth data flow within the processor, and implemented instruction decoding so the processor could interpret and execute instructions correctly. Finally, I integrated memory modules, allowing the processor to read data from memory and handle write-back operations so that processed data was correctly stored.
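The ALU's role can be sketched in software: an operation selector picks the computation, and results wrap to 32 bits just as hardware registers do. This is a toy Python model for illustration, not the Logisim circuit itself, and the operation names are a subset chosen here.

```python
MASK32 = 0xFFFFFFFF  # results wrap to 32 bits, as in a hardware register

def alu(op: str, a: int, b: int) -> int:
    """Toy model of a RISC-V-style ALU operation."""
    ops = {
        "add": lambda: a + b,
        "sub": lambda: a - b,
        "and": lambda: a & b,
        "or":  lambda: a | b,
        "xor": lambda: a ^ b,
        "sll": lambda: a << (b & 0x1F),  # RV32 shifts use the low 5 bits
    }
    return ops[op]() & MASK32

result = alu("sub", 0, 1)  # two's-complement wrap-around: 0xFFFFFFFF
```

In the actual CPU, the decoded instruction supplies the operation selector and the register file supplies the two operands.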
Implemented a Java-based browser tool, inspired by Google's Ngram Viewer, to analyze the historical usage of words in English texts. Added a data visualization feature that lets users interpret word usage trends through interactive charts and graphs.
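The core computation behind an Ngram-style viewer is per-year relative word frequency. The original tool is in Java; this is a hedged Python sketch of that computation with an invented toy corpus.

```python
from collections import Counter

def word_frequency(corpus_by_year: dict, word: str) -> dict:
    """Relative frequency of `word` in each year's text."""
    freqs = {}
    for year, text in corpus_by_year.items():
        tokens = text.lower().split()
        counts = Counter(tokens)
        freqs[year] = counts[word.lower()] / len(tokens) if tokens else 0.0
    return freqs

f = word_frequency({1900: "the cat sat", 2000: "the the dog"}, "the")
```

The viewer then plots these per-year frequencies as an interactive trend line.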
During my internship as a software engineer at CereVox AI, I joined a small team of three developers to build a scalable custom parser capable of efficiently handling millions of customer requests. The parser's key task was to convert unstructured files into CereVox's internal format, keeping the data consistent and standardized; it could also convert data from the CereVox format into any format a customer might need.
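A core step in such a parser is normalizing loosely structured input into one canonical record shape. The sketch below is hypothetical (the field names and the CereVox format are not public); it only illustrates the kind of key/value normalization involved.

```python
def parse_record(raw: str) -> dict:
    """Map loose 'Key: value' lines into one canonical record format."""
    record = {}
    for line in raw.splitlines():
        if ":" in line:
            key, _, value = line.partition(":")
            # normalize keys so downstream consumers see consistent names
            record[key.strip().lower().replace(" ", "_")] = value.strip()
    return record

rec = parse_record("Customer Name: Ada\nOrder ID: 42")
# -> {"customer_name": "Ada", "order_id": "42"}
```

A production parser would add schema validation and error handling around this normalization step.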
Led a group capstone project creating a video game engine that uses a randomized algorithm in Java to generate unique interactive 2D worlds that the user explores through a GUI. Features include random map generation, keyboard input, save and load functionality, random encounters, various avatar skins, and unique seeds that map to unique worlds. Implemented a robust engine enabling realistic object interactions, collision detection, and dynamic simulations for enhanced gameplay.
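The seed-to-world idea can be shown in a few lines: the same seed always produces the same tile map, so a world can be shared or reloaded from its seed alone. The engine is in Java; this is a simplified Python sketch with invented tile characters.

```python
import random

def generate_world(seed: int, width: int, height: int):
    """Deterministically generate a 2D tile map from a seed."""
    rng = random.Random(seed)  # seeded RNG, independent of global state
    return [[rng.choice(".#") for _ in range(width)] for _ in range(height)]

world_a = generate_world(1234, 8, 4)
world_b = generate_world(1234, 8, 4)
# identical seeds yield identical worlds
```

Save/load then only needs to persist the seed plus the player's state, not the whole map.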
As a Data Science intern, I worked alongside a team of six data scientists on a predictive model to identify lead water service lines across the nation. My role involved collecting data, scraping websites, building predictive models, and designing both the front end and back end of the user interface. As the user interface team lead, I used the Mapbox API to develop an informative, searchable parcel map. At the end of this endeavor, we were honored with the Data Science Insights Award from UC Berkeley, a recognition that highlighted the importance of what we had accomplished.
I'm currently immersed in a side project/startup with a mission to build tailor-friendly software that simplifies managing customer clothing sizes and styles. This involves crafting a user-friendly interface that lets tailors effortlessly manage customer data, including measurements and clothing types such as the Punjabi, the Thobe, and more. To bring this vision to life, I'm using PostgreSQL (psql) for data storage and exploring software tools for both the back end and the front end. This initiative is driven by my deep commitment to using my Data Science and Software Engineering skills to serve my community and make a positive impact.