Last week, I tried to set up another interview and made progress on my original work. I’ve been emailing local professors to get another perspective on my research and find another potential mentor. Of course, some of them didn’t respond or declined, but one so far has said he would be willing, and I’m looking forward to the interview.
This past week, I also kept trying to improve the results of my machine learning models. Although not directly related to my original work so far, I learned how to code a generative adversarial network (GAN) using the MNIST database of handwritten digits. After nearly 22 continuous hours of training the model to replicate the digits and still not getting very good results, I stopped going down that path. Machine learning can be very resource-intensive, but the good news is that my chemical models have not been that demanding so far. I was able to train two models to predict solubility on the Delaney dataset: a graph convolutional network and a multitask regressor. The hardest part was the hyperparameter tuning, which is necessary to obtain better results, but I made progress on that as well. Once I can tune the models so they each have a good R^2 value, I should be able to compare them.
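Since the model comparison above hinges on the R^2 (coefficient of determination) score, here is a minimal sketch of how that metric is computed. This is a plain-Python illustration only, not the actual training pipeline or dataset from the post; the example values are made up.

```python
def r_squared(y_true, y_pred):
    """Coefficient of determination: R^2 = 1 - SS_res / SS_tot."""
    mean_y = sum(y_true) / len(y_true)
    ss_res = sum((t - p) ** 2 for t, p in zip(y_true, y_pred))  # residual sum of squares
    ss_tot = sum((t - mean_y) ** 2 for t in y_true)             # total sum of squares
    return 1 - ss_res / ss_tot

# Hypothetical solubility values, just for illustration.
y_true = [0.5, 1.0, 2.0, 3.5]

# Perfect predictions give R^2 = 1.
print(r_squared(y_true, y_true))  # → 1.0

# Always predicting the mean gives R^2 = 0; worse models go negative.
mean = sum(y_true) / len(y_true)
print(r_squared(y_true, [mean] * len(y_true)))  # → 0.0
```

An R^2 near 1 means the model explains most of the variance in the measured solubilities, which is what the tuning is working toward before the two models can be compared fairly.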
This week, I plan on continuing my original work and starting my research speech, as well as preparing for my computational chemistry interview.