Right after coming back from spring break, I had a mentor visit at SMU, which was helpful for setting up the next steps for my model. I had planned to set up cross-validation and hyperparameter tuning for my XGBoost models, and I thankfully figured out how to do that. I also learned that hyperparameter tuning isn't necessary for a benchmark run: cross-validation exists mainly to tune the model effectively, so for a benchmark you can train on the whole training/validation set. When running benchmarks on more tasks, I got very promising results on the first two datasets I tried, although my model performed more poorly on classification tasks. Still, I'm hoping I can at least submit some of my results.

I also need to begin the presentation aspect of my final product soon: the scientific poster. Looking at examples from undergraduate students has helped, but I'm still somewhat confused about framing my work as an experiment. Hopefully, talking with my mentor will clear some things up.

Aside from my final product, I plan to work on my digital portfolio and invitations. Plus, I have my Chemistry Olympiad local exam this Saturday. I wish I had studied more, but I feel more prepared than last year. Most of all, I'm looking forward to my rescheduled birthday party on Saturday.
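Going back to the modeling for a moment, the cross-validation takeaway can be sketched in code. This is just an illustrative sketch, not my actual pipeline: it uses scikit-learn's GradientBoostingClassifier on synthetic data as a stand-in for XGBoost and my benchmark datasets (XGBoost's estimators expose the same sklearn-style interface, so the pattern carries over). The idea is that cross-validation only picks the hyperparameters; the selected model is then refit on the whole training set.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import GridSearchCV, train_test_split

# Synthetic data standing in for a benchmark dataset.
X, y = make_classification(n_samples=500, n_features=10, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0
)

# Cross-validation is used only to choose hyperparameters
# (grid values here are illustrative).
param_grid = {"n_estimators": [50, 100], "max_depth": [2, 3]}
search = GridSearchCV(
    GradientBoostingClassifier(random_state=0), param_grid, cv=5
)
search.fit(X_train, y_train)

# With refit=True (the default), GridSearchCV retrains the best
# configuration on ALL of X_train -- the whole training set is
# used for the final model, and CV was just for tuning.
best = search.best_estimator_
print(search.best_params_)
print(best.score(X_test, y_test))
```

For a pure benchmark run, where the hyperparameters are already fixed, the `GridSearchCV` step drops out entirely and the model is simply fit on the full training/validation set.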
Rajas Ketkar