
Multi-Layer Perceptron for Moon Data Classification

  • Separation of two classes of data points that resemble two interleaving moons
  • Summary: This project is part of EE456 (2022) at Penn State University. The main objective was to implement a multi-layer perceptron with backpropagation in MATLAB to classify two classes of moon-shaped data, and to analyze how different weight-initialization methods and the momentum rule affect training performance.

File Descriptions

  1. MLPBackProbv3.m: The MATLAB script containing the implementation of the multi-layer perceptron with the backpropagation algorithm. It includes routines for the forward and backward passes, weight updates, and training the network; a minimal sketch of this training loop is shown after this list.

  2. feedforwardApplication.m: A script for applying the trained perceptron to new data, demonstrating the model's ability to classify unseen moon data; a sketch of this forward-pass step appears after the Project Insights note below.

  3. tanhD.m: A MATLAB function that computes the derivative of the hyperbolic tangent function, used in the backpropagation step of the neural network.

  4. DataSet1_MP1.mat and DataSet2_MP1.mat: These files contain the moon-shaped data sets used for training and testing the perceptron model.

  5. MiniProjectReport.pdf: A comprehensive report that details the project's goals, methodology, and findings. It includes a thorough explanation of the perceptron model, its architecture, and the results of classifying the moon data.
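To make the training procedure concrete, here is a minimal MATLAB sketch of a 2-H-1 network trained with backpropagation and the momentum rule, in the spirit of MLPBackProbv3.m and tanhD.m. The variable names, architecture, and hyperparameters below are illustrative assumptions, not the actual contents of the scripts or the .mat files.

```matlab
% Minimal sketch of a 2-H-1 MLP trained with backpropagation and momentum.
% All names and hyperparameters are assumptions; the variables stored in
% DataSet1_MP1.mat and the architecture in MLPBackProbv3.m may differ.
% Assumed inputs: X is N-by-2 (moon coordinates), y is N-by-1 with labels in {-1, +1}.

H       = 10;    % number of hidden units (assumption)
eta     = 0.01;  % learning rate (assumption)
alpha   = 0.9;   % momentum coefficient (assumption)
nEpochs = 500;

N = size(X, 1);

% Weight initialization (small random values; the project compares several schemes)
W1 = 0.1 * randn(H, 2);  b1 = zeros(H, 1);
W2 = 0.1 * randn(1, H);  b2 = 0;

% Previous weight updates, needed for the momentum rule
dW1_prev = zeros(size(W1));  db1_prev = zeros(size(b1));
dW2_prev = zeros(size(W2));  db2_prev = 0;

for epoch = 1:nEpochs
    idx = randperm(N);                 % shuffle samples each epoch
    for n = idx
        x = X(n, :)';  t = y(n);

        % Forward pass (tanh activations in both layers)
        v1 = W1 * x + b1;   h = tanh(v1);
        v2 = W2 * h + b2;   o = tanh(v2);

        % Backward pass: local gradients using the tanh derivative
        delta2 = (t - o) .* tanhD(v2);
        delta1 = (W2' * delta2) .* tanhD(v1);

        % Momentum rule: current update = eta * gradient + alpha * previous update
        dW2 = eta * delta2 * h' + alpha * dW2_prev;
        db2 = eta * delta2      + alpha * db2_prev;
        dW1 = eta * delta1 * x' + alpha * dW1_prev;
        db1 = eta * delta1      + alpha * db1_prev;

        W2 = W2 + dW2;  b2 = b2 + db2;
        W1 = W1 + dW1;  b1 = b1 + db1;

        dW2_prev = dW2;  db2_prev = db2;
        dW1_prev = dW1;  db1_prev = db1;
    end
end

% Hypothetical stand-in for the project's tanhD.m:
% the derivative of tanh(v) is 1 - tanh(v).^2.
function d = tanhD(v)
    d = 1 - tanh(v).^2;
end
```

The momentum rule adds a fraction alpha of the previous weight update to the current one, which smooths the weight trajectory and typically speeds up convergence on this kind of problem.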

  • Project Insights: The project successfully demonstrates the application of a multi-layer perceptron in classifying data that is not linearly separable. Key insights include the effectiveness of backpropagation in neural networks and the practical challenges in tuning and applying these models.
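Applying the trained model reduces to a single forward pass with the learned weights. Below is a minimal sketch in the spirit of feedforwardApplication.m; Xnew and the weight names continue the assumptions of the training sketch above and are not the script's actual identifiers.

```matlab
% Minimal sketch of classifying new points with the trained weights (assumed
% names, not the actual API of feedforwardApplication.m).
% Xnew is M-by-2; yhat gives a +1 / -1 class decision per point.
hidden = tanh(W1 * Xnew' + b1);    % H-by-M hidden activations
out    = tanh(W2 * hidden + b2);   % 1-by-M network outputs in (-1, 1)
yhat   = sign(out)';               % threshold at zero to get class labels

% Quick visual check of the predicted classes on the two moons
scatter(Xnew(:,1), Xnew(:,2), 20, yhat, 'filled');
```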

This project highlights the use of neural networks, particularly multi-layer perceptrons, in solving classification problems that linear models cannot, and illustrates the balance between theory and practical application in machine learning.

Here are links to my other work in EE456

  1. Implemented with PyTorch: auto-encoder-maze
  2. Implemented with PyTorch: cifar-cnn
