Training a machine learning model with sparse data sometimes requires data augmentation: tricking the model into treating the same image as something new, so that it learns to recognize patterns from different perspectives of that image. There are a number of different types of image augmentation, including rotation, shearing, zooming, cropping, flipping, changing the brightness level, and swapping the background or other layers. You can also enhance images with more advanced techniques, such as principal component analysis, or with manual operations on individual pixels to shape the image to your taste.
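A few of these augmentations can be sketched with plain NumPy. This is a minimal illustration (not code from the article): a random horizontal flip, a pad-and-crop, and a brightness shift, applied to an image stored as an H x W x C array of 8-bit pixels.

```python
import numpy as np

def augment(image, rng):
    """Return a randomly augmented copy of an H x W x C uint8 image."""
    out = image.copy()
    if rng.random() < 0.5:                      # horizontal flip, half the time
        out = out[:, ::-1, :]
    pad = 4                                      # pad, then crop back to size
    padded = np.pad(out, ((pad, pad), (pad, pad), (0, 0)), mode="reflect")
    y = rng.integers(0, 2 * pad + 1)
    x = rng.integers(0, 2 * pad + 1)
    out = padded[y:y + image.shape[0], x:x + image.shape[1], :]
    # random brightness shift, clipped back to the valid pixel range
    shift = rng.integers(-30, 31)
    out = np.clip(out.astype(np.int16) + shift, 0, 255).astype(np.uint8)
    return out

rng = np.random.default_rng(0)
img = rng.integers(0, 256, size=(32, 32, 3), dtype=np.uint8)
aug = augment(img, rng)
```

The augmented copy keeps the original shape and dtype, so it can be fed to the model exactly like the source image.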



Write a Python script that optimizes a machine learning model of your choice using GPyOpt:

  • Your script should optimize at least 5 different hyperparameters. E.g. learning rate, number of units in a layer, dropout rate, L2 regularization weight, batch size
  • Your model should be optimized on a single satisficing metric
  • Your model should save a checkpoint of its best iteration during each training session
  • The filename of the checkpoint should specify the values of the hyperparameters being tuned
  • Your model should perform early stopping
  • Bayesian optimization should run for a maximum of 30 iterations
  • Once optimization has been performed, your…
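Two of the smaller requirements above — encoding the tuned hyperparameter values into the checkpoint filename, and early stopping — can be sketched framework-free. The names and filename format here are illustrative, not prescribed by the task:

```python
def checkpoint_name(hparams):
    """Encode the tuned hyperparameter values into the checkpoint filename
    (the key names and ".h5" extension are illustrative assumptions)."""
    parts = "_".join(f"{k}-{v}" for k, v in sorted(hparams.items()))
    return f"model_{parts}.h5"

class EarlyStopper:
    """Stop training when the monitored loss fails to improve for
    `patience` consecutive epochs."""
    def __init__(self, patience=5, min_delta=1e-4):
        self.patience, self.min_delta = patience, min_delta
        self.best, self.wait = float("inf"), 0

    def step(self, val_loss):
        if val_loss < self.best - self.min_delta:
            self.best, self.wait = val_loss, 0
            return False                 # improved: keep training
        self.wait += 1
        return self.wait >= self.patience  # True means stop

name = checkpoint_name({"lr": 0.01, "units": 128, "dropout": 0.3,
                        "l2": 0.0001, "batch": 64})
```

In the full script, `checkpoint_name` would feed the filename of whatever checkpoint callback your framework provides, and `EarlyStopper.step` would be called once per epoch with the validation loss.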

Abstract

This is my first attempt to build a successful classifier on the CIFAR-10 dataset through experimentation with various models and optimization techniques; I have never before tackled a machine learning problem without at least a little guidance. The goal is to take a pretrained model and, using transfer learning, apply it to a different dataset, hoping to achieve an evaluated accuracy of at least 87% on the new dataset. I will document the steps, along with my confusion and insights, throughout this writeup.

Introduction

I’m given instructions to train on the CIFAR10…


Researchers at the University of Toronto improved on state-of-the-art models for classification on the ImageNet dataset. Krizhevsky et al.'s 2012 paper ImageNet Classification with Deep Convolutional Neural Networks describes their research. After submitting to the ILSVRC-2010 and ILSVRC-2012 competitions, their model improved considerably on the previous state-of-the-art deep neural networks for both top-1 and top-5 error rates.

ImageNet is a database of millions of images spanning 22,000 different categories. Humans have contributed to classifying and labeling the test data. At the time, the best models were computationally expensive and not very efficient. …


  • L1 regularization
  • L2 regularization
  • Dropout
  • Data Augmentation
  • Early Stopping
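The L2 item in the list above is easy to make concrete. This is a generic sketch, not the article's code: for a squared-error objective, L2 regularization adds a penalty proportional to the squared weight norm, which contributes an extra term to the gradient that shrinks weights toward zero.

```python
import numpy as np

def l2_loss_and_grad(w, X, y, lam):
    """Mean squared error plus an L2 penalty lam * ||w||^2,
    and the gradient of that regularized loss with respect to w."""
    n = len(y)
    err = X @ w - y
    loss = (err @ err) / n + lam * (w @ w)
    grad = 2 * (X.T @ err) / n + 2 * lam * w
    return loss, grad

X = np.eye(2)
y = np.array([1.0, 2.0])
loss, grad = l2_loss_and_grad(np.zeros(2), X, y, lam=0.1)
```

The `2 * lam * w` term is what distinguishes this from unregularized gradient descent: larger weights are penalized harder, which discourages the model from fitting noise.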

Neural networks use activation functions to determine whether each node in the network should be activated, that is, whether it passes a non-zero value on as the layer's output. Given a set of inputs, a neuron's output is activated or not depending on the type of activation function and on whether the input meets the criterion for returning a non-zero value. This is what lets the network find the combination of weights that best reduces the error in its final predicted output.

Non-linear activation functions are often better suited to neural networks than linear ones…
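Two of the most common non-linear activations can be written in a few lines of NumPy (a generic sketch, not tied to the article's model):

```python
import numpy as np

def relu(z):
    """Non-linear: passes positive inputs through, zeroes out negatives."""
    return np.maximum(0.0, z)

def sigmoid(z):
    """Non-linear: squashes any real input into the open interval (0, 1)."""
    return 1.0 / (1.0 + np.exp(-z))

z = np.array([-2.0, 0.0, 3.0])
relu(z)       # -> [0., 0., 3.]
sigmoid(0.0)  # -> 0.5
```

A purely linear activation would make a stack of layers collapse into a single linear map; the bends introduced by functions like these are what give depth its expressive power.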


Gradient descent is the process of updating the weights of the nodes in each layer of a neural network during backpropagation, reducing the error in the predicted values and thereby improving the accuracy of the model. This is the “learning” in supervised machine learning: you train a model by giving it a dataset to learn from. Oftentimes these datasets are extremely large, and generally, the larger the dataset, the more accurate the model becomes after training. However, regular ole batch gradient descent can be extremely slow. …
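Full-batch gradient descent is easy to see on a toy problem. This sketch (my own, not from the article) fits a single weight to noiseless data generated from y = 2x, updating the weight against the gradient of the mean squared error over the whole batch each step:

```python
import numpy as np

# Toy problem: learn w in y = w * x, where the true w is 2.0.
rng = np.random.default_rng(42)
x = rng.normal(size=100)
y = 2.0 * x

w = 0.0     # initial weight
lr = 0.1    # learning rate

for _ in range(200):
    # Gradient of mean((w*x - y)^2) with respect to w, over the full batch.
    grad = 2.0 * np.mean((w * x - y) * x)
    w -= lr * grad   # step downhill
```

Each step uses every example in the dataset, which is exactly why batch gradient descent gets slow on large datasets; stochastic and mini-batch variants trade gradient exactness for much cheaper steps.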

