While I do hate Java, I have to thank it for getting me back into Python and, more importantly, machine learning.
During my Java class, I was introduced to the idea of a sentiment value generator: a way to find out whether a word has a positive or negative connotation. In this class we were not, in fact, using Java to create one of these generators ourselves, but rather using a list generated by researchers at Stanford to rate different reviews.
The idea was that we would loop through all of the words in a review…
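To give a concrete picture of that approach, here is a toy sketch with a made-up word list standing in for the Stanford lexicon (this isn’t my class code, just an illustration of summing per-word sentiment values):

```python
# Toy sentiment lexicon standing in for the Stanford word list (placeholder values).
sentiment_lexicon = {"great": 2.0, "fun": 1.5, "boring": -1.5, "awful": -2.0}

def score_review(review: str) -> float:
    """Sum the sentiment values of every known word in a review."""
    total = 0.0
    for word in review.lower().split():
        total += sentiment_lexicon.get(word, 0.0)  # unknown words count as neutral
    return total

print(score_review("great fun but an awful ending"))  # 1.5
```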
The 19th of April marked the end of Ludum Dare’s 46th game jam, a competition in which developers are challenged to create a video game in under 72 hours. Even though my team spent the entire weekend frantically coding, animating, and designing, the competition ended up being a ton of fun and, overall, an experience I would highly recommend to anyone interested in game design. Although we were all rather new to the craft, we were still able to create a product we were proud of.
This isn’t going to be an inspiring story nor will it…
Recently I had the not-so-original idea of using the tracking built into Apple’s ARKit to create a 6DOF mobile VR experience. After some research, paired with a healthy dose of trial and error, I was able to develop a method that allowed me to do this using Unity. While there are still problems with it, mainly that the experience is somewhat laggy and the tracking is buggy, I wanted to share the procedure in the hopes that others could replicate it and improve upon it. I personally think that the ability to walk around in mobile VR will help people…
I’ve been writing code for a decent period of time, around seven years, and even though I’m not the best at it, it’s changed the way I look at problems in life. As I’ve gotten better at coding, and at solving the problems that come with it, I’ve started to notice some common patterns in the ways I solve problems on the screen and in real life.
This is by far the biggest thing that I learned while attempting to write my first program, and it’s something I don’t see talked about a lot. When I first started out…
Using some of my free time this week, I began to delve into creating a transformer with the Keras functional API. Now, my goal for this week is to actually understand what I am doing. I could try to write a bit more, but for now, the goal is just to keep up the habit.
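In case it helps anyone picture what I mean, here is a minimal sketch of a single transformer encoder block written with the Keras functional API; the layer sizes below are arbitrary placeholders, not settings I’ve actually committed to:

```python
import tensorflow as tf
from tensorflow.keras import layers

# Arbitrary placeholder dimensions for the sketch.
seq_len, d_model, num_heads, ff_dim = 64, 128, 4, 256

inputs = layers.Input(shape=(seq_len, d_model))

# Multi-head self-attention, followed by a residual connection and layer norm.
attn = layers.MultiHeadAttention(num_heads=num_heads, key_dim=d_model // num_heads)(inputs, inputs)
x = layers.LayerNormalization()(layers.Add()([inputs, attn]))

# Position-wise feed-forward network with its own residual connection.
ff = layers.Dense(ff_dim, activation="relu")(x)
ff = layers.Dense(d_model)(ff)
outputs = layers.LayerNormalization()(layers.Add()([x, ff]))

encoder_block = tf.keras.Model(inputs, outputs)
encoder_block.summary()
```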
This week was actually pretty amazing. I felt relaxed and productive at the same time, which is an improvement over usual. The main focus of this week for me was examining BERT transformers. While I had looked at using a transformer architecture for NLP and sentiment classification, it wasn’t clear how I would go about doing this. However, after a bit of research, I discovered that BERT is designed to be fine-tuned for exactly this kind of task. To be honest, I don’t quite understand it yet, but I will definitely report back when I do.
Adios.
This week was really busy, so I didn’t manage to do much. I had hoped to come back with a big update on a personal project, but I wasn’t able to. It doesn’t look great or feel great to keep posting nothing-updates, but I think it’s good to be able to keep admitting when you’ve failed. Anyway, hopefully this week is better.
Adios.
As a result of some unexpected changes in my schedule, I was able to work a little bit on my transformer! Progress has been incredibly slow, but I am proud that I have something to write about. After deciding last week that I was going to write my model using the Hugging Face transformers library, I started this week off by researching how I was going to do that.
First off, I needed to figure out what sort of model I was going to use, as Hugging Face had a variety of different model types available. After a little bit…
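To show roughly where my head is at, loading a pretrained sequence-classification model with the library looks something like this; the checkpoint name and label count here are placeholders, not decisions I’ve actually made:

```python
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Placeholder checkpoint and label count; not a final choice.
checkpoint = "bert-base-uncased"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForSequenceClassification.from_pretrained(checkpoint, num_labels=6)

# Quick smoke test: tokenize one sentence and run it through the model.
inputs = tokenizer("I can't believe how well this went!", return_tensors="pt")
logits = model(**inputs).logits
print(logits.shape)  # torch.Size([1, 6])
```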
This week, I actually managed to make a small amount of progress on my transformer! A bit of better time management let me spend valuable time writing code and doing research for my emotion classification transformer. Overall, I completed two main tasks: first, I started managing my data for the transformer, and second, I figured out how I am going to go about creating my NN.
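By “managing my data” I basically mean turning labeled text into tensors a transformer can consume. A rough sketch of that, with invented example rows and a placeholder label scheme, looks like this:

```python
import torch
from transformers import AutoTokenizer

# Invented example rows and a placeholder label scheme, just for illustration.
texts = ["I can't stop smiling today", "Everything about this is frustrating"]
labels = [0, 1]  # e.g. 0 = joy, 1 = anger

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
encodings = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")

# Bundle token IDs, attention masks, and labels into one training batch.
batch = {**encodings, "labels": torch.tensor(labels)}
print(batch["input_ids"].shape)  # (2, padded_length)
```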
👋 My musings here are basically a public stream-of-consciousness journal, so please forgive my writing: it’s getting better, I promise.