Week 5 Neural Networks

Cards (15)

  • Artificial Neural Networks:
    mathematical functions that simulate the brain's biological neurons, represented as 'units' connected in a graph
  • Activation Function:
    function that determines at what point a unit's weighted input counts as an 'active' output, e.g. a step, sigmoid, or ReLU function (sketch after the card list)
  • Most basic structure of an ANN:
    a single unit computing a weighted sum of its inputs plus a bias, passed through an activation function: g(w1*x1 + w2*x2 + b)
  • Structure of an ANN with multiple outputs:
    each output unit has its own set of weights connecting it to every input unit
  • Gradient Descent:
    algorithm to minimise loss when training an ANN: compute the gradient of the loss with respect to the weights, then move the weights a small step in the opposite direction (sketch after the card list)
  • Multilayer Neural Network:
    an ANN with one or more hidden layers between input and output, allowing it to solve problems that are not linearly separable
  • Backpropagation:
    in a deep NN, output errors are propagated backwards to estimate the error at each node in the hidden layer(s), so the weights throughout the network can be updated
  • Deep Neural Network:
    an ANN with multiple hidden layers
  • TensorFlow:
    popular library used for building and training ANNs (minimal sketch after the card list)
  • Computer Vision:
    computational methods for analysing/understanding digital images (an image can be treated as a matrix of coloured pixels, and thus represented numerically)
  • Image Convolution:
    method to extract useful information from an image by filtering each pixel's value based on its neighbours, weighted according to a kernel matrix (sketch after the card list)
    Different kernels (filters) can be used depending on what is being identified
  • Pooling:
    sampling regions of an input (e.g. taking the maximum value in each region) to reduce the size of the data to be analysed (sketch after the card list)
  • Convolutional NN:
    convolving an image produces feature maps, which are pooled to decrease their size and then flattened into a traditional ANN (sketch after the card list)
    the convolution and pooling steps may be repeated: the first pass finds low-level features (curves, edges, etc.), later passes find high-level features (objects)
  • Feed-Forward ANN:
    connections run in one direction only, from input towards output; particularly useful for classification
  • Recurrent ANN:
    the output of a previous calculation is fed back into the network as part of the next input (sketch after the card list)
    its one-to-many relation between inputs and outputs enables variable-sized output (useful in text generation, where the output length is not known ahead of time)
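
Activation function sketch. A minimal Python example of the weighted-sum-plus-activation structure from the cards above; step, sigmoid, and ReLU are standard examples, and the specific weights and inputs are made up for illustration.

```python
import numpy as np

def step(x, threshold=0.0):
    """Hard threshold: output is 'active' (1) once the weighted sum passes the threshold."""
    return np.where(x >= threshold, 1.0, 0.0)

def sigmoid(x):
    """Smooth activation mapping any real input into (0, 1)."""
    return 1.0 / (1.0 + np.exp(-x))

def relu(x):
    """Rectified linear unit: passes positive values, zeroes out negatives."""
    return np.maximum(0.0, x)

# A single unit: weighted sum of inputs plus a bias, passed through an activation.
weights, bias = np.array([0.5, -0.3]), 0.1   # illustrative values
inputs = np.array([1.0, 2.0])
print(sigmoid(weights @ inputs + bias))      # 0.5 for this particular input
```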
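Gradient descent sketch. A minimal example on a one-weight linear model: the gradient of the loss indicates the uphill direction, so moving the weight against it reduces the loss. The learning rate, data, and iteration count are arbitrary illustrative choices.

```python
import numpy as np

# Toy data: y = 3x plus noise; gradient descent should recover a weight near 3.
rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, size=100)
y = 3.0 * x + rng.normal(scale=0.1, size=100)

w = 0.0                 # initial weight
learning_rate = 0.1
for _ in range(200):
    predictions = w * x
    # Gradient of mean squared error with respect to w.
    grad = np.mean(2 * (predictions - y) * x)
    w -= learning_rate * grad   # step in the opposite (downhill) direction

print(w)                # close to 3.0
```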
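TensorFlow sketch. A minimal feed-forward binary classifier using TensorFlow's Keras API; the input shape and layer sizes are placeholders, and the commented fit call assumes training arrays that are not defined here.

```python
import tensorflow as tf

# Feed-forward network: one hidden layer, sigmoid output for binary classification.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(4,)),                    # 4 input features (placeholder)
    tf.keras.layers.Dense(8, activation="relu"),   # hidden layer
    tf.keras.layers.Dense(1, activation="sigmoid") # output unit
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

# model.fit(X_train, y_train, epochs=20)   # X_train, y_train assumed to exist
```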
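Image convolution sketch. A hand-rolled convolution with a common edge-detection kernel; the loop-based implementation is written for clarity rather than speed, and the tiny synthetic "image" is made up to show a vertical edge being highlighted.

```python
import numpy as np

def convolve2d(image, kernel):
    """Slide the kernel over the image; each output pixel is a weighted sum of its neighbours."""
    kh, kw = kernel.shape
    oh, ow = image.shape[0] - kh + 1, image.shape[1] - kw + 1
    out = np.zeros((oh, ow))
    for i in range(oh):
        for j in range(ow):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

# A common edge-detection kernel: strong response where pixel values change sharply.
kernel = np.array([[-1, -1, -1],
                   [-1,  8, -1],
                   [-1, -1, -1]])

image = np.zeros((6, 6))
image[:, 3:] = 1.0                  # left half dark, right half bright: a vertical edge
print(convolve2d(image, kernel))    # large values along the edge, ~0 elsewhere
```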
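Pooling sketch. Max-pooling is one common form: each non-overlapping 2x2 region of a feature map is reduced to its largest value, shrinking the data while keeping the strongest responses.

```python
import numpy as np

def max_pool(feature_map, size=2):
    """Keep only the largest value in each non-overlapping size x size region."""
    h, w = feature_map.shape
    h, w = h - h % size, w - w % size   # trim so the map divides evenly
    regions = feature_map[:h, :w].reshape(h // size, size, w // size, size)
    return regions.max(axis=(1, 3))

feature_map = np.arange(16).reshape(4, 4)
print(max_pool(feature_map))
# [[ 5  7]
#  [13 15]]
```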
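Convolutional NN sketch. A Keras model following the convolve, pool, repeat, then flatten pattern from the card above, assuming 28x28 grayscale inputs and 10 output classes; these shapes are illustrative choices, not from the notes.

```python
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.Input(shape=(28, 28, 1)),
    tf.keras.layers.Conv2D(32, (3, 3), activation="relu"),   # convolution -> feature maps
    tf.keras.layers.MaxPooling2D((2, 2)),                    # pooling -> smaller maps
    tf.keras.layers.Conv2D(64, (3, 3), activation="relu"),   # repeat for higher-level features
    tf.keras.layers.MaxPooling2D((2, 2)),
    tf.keras.layers.Flatten(),                               # flatten into a traditional ANN
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam", loss="categorical_crossentropy", metrics=["accuracy"])
model.summary()
```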
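Recurrent ANN sketch. For simplicity this is a many-to-one setup (a whole sequence in, one output out); one-to-many generation, as in text generation, typically feeds each predicted output back in as the next input. The sequence length and feature count are placeholders.

```python
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.Input(shape=(10, 5)),            # 10 timesteps, 5 features each (placeholders)
    tf.keras.layers.SimpleRNN(16),            # hidden state is fed back in at every timestep
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy")
```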