Creating hundreds or thousands of solutions for a problem might sometimes sound like overkill, but what if a method inherently creates thousands of diverse solutions while searching for the best-performing ones along the way? Not only could we find the one solution we were looking for all along, but we would also find hundreds of solutions that solve the problem in different ways.
This is a short introduction about me. I decided to start this blog to share my thoughts with everybody. I will mainly be posting about Machine Learning and related topics (since I work on this every day). Currently I split my time between my PhD lab and IBM, which will probably influence my writing in the future. Continue reading “Hello World.”
What is semi-supervised learning? Every machine learning algorithm needs data to learn from. But even with tons of data in the world, including texts, images, time series, and more, only a small fraction is actually labeled, whether algorithmically or by hand. Most of the time, we need labeled data to do supervised machine learning. In particular, Continue reading “Introduction to semi-supervised learning and adversarial training”
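The idea behind semi-supervised learning can be sketched with self-training (pseudo-labeling), one common approach: fit a model on the few labeled points, then adopt its most confident predictions on unlabeled points as new labels. The toy 1-D nearest-class-mean classifier and the confidence margin below are my own illustrative assumptions, not from the post:

```python
# Minimal self-training (pseudo-labeling) sketch for semi-supervised learning.
# The 1-D threshold classifier and the fixed confidence margin are
# simplifying assumptions chosen to keep the example self-contained.

def fit_threshold(xs, ys):
    """Fit a 1-D classifier: threshold = midpoint of the two class means."""
    m0 = sum(x for x, y in zip(xs, ys) if y == 0) / ys.count(0)
    m1 = sum(x for x, y in zip(xs, ys) if y == 1) / ys.count(1)
    return (m0 + m1) / 2

def self_train(xs, ys, unlabeled, margin=1.0, rounds=3):
    """Repeatedly pseudo-label unlabeled points far from the decision boundary."""
    xs, ys = list(xs), list(ys)
    pool = list(unlabeled)
    for _ in range(rounds):
        t = fit_threshold(xs, ys)
        confident = [x for x in pool if abs(x - t) > margin]
        if not confident:
            break
        for x in confident:  # adopt confident predictions as training labels
            xs.append(x)
            ys.append(1 if x > t else 0)
        pool = [x for x in pool if abs(x - t) <= margin]
    return fit_threshold(xs, ys)

# Four labeled points, four unlabeled ones that sharpen the class means.
t = self_train([0.0, 1.0, 9.0, 10.0], [0, 0, 1, 1],
               unlabeled=[2.0, 3.0, 7.0, 8.0])
```

Here the unlabeled points never get hand labels, yet they still contribute to the final decision boundary, which is the essence of the semi-supervised setting.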
An Introduction to Transformers and Sequence-to-Sequence Learning for Machine Learning. New deep learning models are introduced at an increasing rate, and sometimes it’s hard to keep track of all the novelties. That said, one particular neural network model has proven to be especially effective for common natural language processing tasks. The model is called a Transformer. Continue reading “What is a Transformer?”
Everyone who’s done a Deep Learning project knows how long it takes to train a single model, not to mention optimize it. As a current student, I don’t have access to any GPUs, which sometimes leads to frustration because I can spend most of my time waiting for the model to see all the training Continue reading “Leveraging Watson’s Machine Learning GPUs to accelerate your Deep Learning project in Python”