Thursday, January 14, 2016

Here is an article I found in my Time Magazine about AI
December 24, 2015-January 4, 2016
Pg. 44

Progress Presentation

Here is my progress presentation.

Monday, January 4, 2016

Perceptron Algorithm Notes

Introduction

  • Perceptron models a neuron
  • It receives n inputs that correspond to features
  • It takes a weighted sum of the inputs, then produces an output by checking the result against a threshold (a small code sketch follows this list)
  • Its main use is to classify groups that are linearly separable
  • It is often used for binary classification
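
As a concrete illustration, here is a minimal sketch in Python of a perceptron's forward pass: a weighted sum of the inputs plus a bias, followed by a unit-step threshold. The weights, bias, and example input are made-up values for illustration, not taken from any real dataset.

def perceptron_output(inputs, weights, bias, threshold=0.0):
    # Weighted sum of the n inputs plus the bias term
    total = sum(x * w for x, w in zip(inputs, weights)) + bias
    # Unit step activation: 1 if the sum is greater than the threshold, 0 if it isn't
    return 1 if total > threshold else 0

# Example with 3 features; the weights and bias here are arbitrary
print(perceptron_output([1.0, 0.5, -0.2], [0.4, 0.6, 0.9], bias=-0.3))  # prints 1
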
Components
  • It is composed of a summation processor, an activation function (which does the thresholding), and weights.
  • It takes a weighted sum of the inputs and produces an output
    • e.g. 1 if the sum is greater than the threshold and 0 if it isn't
    • Usually the inputs and the weights are real numbers
  • Perceptron can also have an input called the bias
    • The bias allows the transfer function curve to be shifted horizontally along the input axis while the shape of the curve is left unaltered.
  • Weight determines the slope 
    • e.g. in the eq. y = mx + b, m is the weight and b is the bias
  • Activation Function
    • Also known as the transfer function
    • It translates input signals into output signals 
    • The output is produced using a threshold
    • The 4 types of activation functions most commonly used are: unit step, sigmoid, piecewise linear, and Gaussian (a code sketch of all four appears at the end of these notes)
      • The unit step is a simple threshold: the output is one of two levels, depending on whether the input is greater than or less than the threshold value
      • The sigmoid family contains 2 functions: the hyperbolic tangent and the logistic. Hyperbolic tangent values range from -1 to 1, and logistic values range from 0 to 1.
      • In the piecewise linear function, the output is proportional to the total weighted input within a range and saturates at its minimum and maximum outside that range.
      • Gaussian functions are continuous, bell-shaped curves. The node output, high or low, is interpreted as class membership (1 or 0) and depends on how close the net input is to a chosen value or average.
  • Learning Rate
    • Used when updating the weights and bias to get a smaller error
    • Helps control how much the weights and bias are changed
  • Learning In Perceptrons 
    • When trying to find a line that separates two classes or groups, the perceptron is trained on input vectors paired with a target value of 0 or 1 (see the training sketch after these notes)
    • If the classes are linearly separable, a solution will be found in a finite number of steps.
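
To make the learning rate and the weight/bias updates above concrete, here is a small Python sketch of the classic perceptron learning rule. The AND-style training data, the learning rate of 0.1, and the epoch count are arbitrary illustrative choices.

def predict(x, w, b):
    # Weighted sum plus bias, followed by a unit-step threshold
    total = sum(xi * wi for xi, wi in zip(x, w)) + b
    return 1 if total > 0 else 0

def train(data, targets, learning_rate=0.1, epochs=20):
    w = [0.0] * len(data[0])
    b = 0.0
    for _ in range(epochs):
        for x, t in zip(data, targets):
            error = t - predict(x, w, b)  # 0 if correct, +1 or -1 if wrong
            # The learning rate controls how much the weights and bias change
            w = [wi + learning_rate * error * xi for wi, xi in zip(w, x)]
            b = b + learning_rate * error
    return w, b

# Logical AND is linearly separable, so training finds a solution
data = [[0, 0], [0, 1], [1, 0], [1, 1]]
targets = [0, 0, 0, 1]
w, b = train(data, targets)
print([predict(x, w, b) for x in data])  # prints [0, 0, 0, 1]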
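
The four activation functions listed in the notes above can also be written out directly. Here is a Python sketch; the Gaussian's mean and width and the piecewise linear ramp's slope and limits are arbitrary illustrative choices, not fixed parts of the definitions.

import math

def unit_step(x, threshold=0.0):
    # Output is one of two levels depending on the threshold
    return 1 if x > threshold else 0

def logistic(x):
    # Logistic sigmoid: values range from 0 to 1
    return 1.0 / (1.0 + math.exp(-x))

def tanh_sigmoid(x):
    # Hyperbolic tangent: values range from -1 to 1
    return math.tanh(x)

def piecewise_linear(x, slope=1.0):
    # Proportional to the input within a range, saturating at 0 and 1
    return max(0.0, min(1.0, 0.5 + slope * x))

def gaussian(x, mean=0.0, width=1.0):
    # Bell-shaped curve: output is high when the input is close to the chosen mean
    return math.exp(-((x - mean) ** 2) / (2 * width ** 2))

for f in (unit_step, logistic, tanh_sigmoid, piecewise_linear, gaussian):
    print(f.__name__, [round(f(v), 3) for v in (-2.0, 0.0, 2.0)])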