
Introduction to Intelligent Systems

  • Notes from the Introduction to Intelligent Systems course, 2019 summer session at Kyungpook National University.
  • We study the basic mathematical theory of machine learning and implement it in Python.
  • The goal is to implement everything using only the numpy and matplotlib.pyplot modules,
    without deep-learning libraries such as tensorflow or pytorch.

Lab 1 : Non-Regularized Linear Regression

notebook link

Tasks

  • Batch Gradient Descent
  • Stochastic Gradient Descent
  • Closed-form solution (Ordinary Least Squares)

Linear Regression

Hypothesis Function

(image)

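A sketch of the linear hypothesis in standard notation (θ is the parameter vector and x is augmented with a bias component x₀ = 1; both are assumptions, not the notebook's exact symbols):

```latex
h_\theta(x) = \theta_0 + \theta_1 x_1 + \dots + \theta_d x_d = \theta^\top x
```
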
Definition of Problem : Cost Minimization

(image)

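A minimal reconstruction of the least-squares cost, assuming N training pairs (x⁽ⁱ⁾, y⁽ⁱ⁾); conventions differ on the 1/2N factor, and this sketch includes it so the gradient comes out as an average over samples:

```latex
J(\theta) = \frac{1}{2N} \sum_{i=1}^{N} \left( h_\theta(x^{(i)}) - y^{(i)} \right)^2
```
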
Batch Gradient Descent

(image)

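A minimal numpy sketch of batch gradient descent for this cost; the names X, y, lr, and n_iters are illustrative, not taken from the original notebook:

```python
import numpy as np

def batch_gradient_descent(X, y, lr=0.01, n_iters=1000):
    """X: (N, d) design matrix (first column all ones for the bias),
    y: (N,) targets. Returns the fitted parameter vector theta."""
    N, d = X.shape
    theta = np.zeros(d)
    for _ in range(n_iters):
        residual = X @ theta - y      # h_theta(x) - y for every sample
        grad = X.T @ residual / N     # gradient of J(theta)
        theta -= lr * grad            # step against the gradient
    return theta
```
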
Stochastic Gradient Descent

(image)

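The stochastic variant updates the parameters on one randomly drawn sample at a time; again a sketch with assumed names:

```python
import numpy as np

def stochastic_gradient_descent(X, y, lr=0.01, n_epochs=50, seed=0):
    """Same inputs as the batch version, but the parameters are
    updated after every individual (shuffled) training sample."""
    rng = np.random.default_rng(seed)
    N, d = X.shape
    theta = np.zeros(d)
    for _ in range(n_epochs):
        for i in rng.permutation(N):
            residual = X[i] @ theta - y[i]   # error on a single sample
            theta -= lr * residual * X[i]    # single-sample gradient step
    return theta
```
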
Closed-form Solution (Ordinary Least Squares)

(image)

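Setting the gradient of J(θ) to zero gives the normal equations XᵀXθ = Xᵀy; a sketch that solves them directly (np.linalg.solve is preferred over an explicit matrix inverse for numerical stability):

```python
import numpy as np

def ols(X, y):
    """Ordinary least squares: theta = (X^T X)^(-1) X^T y,
    computed by solving the normal equations X^T X theta = X^T y."""
    return np.linalg.solve(X.T @ X, X.T @ y)
```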

Lab 2 : Regularized Regression

notebook link

Tasks

  • Compute and compare solutions for
    1. unregularized linear
    2. unregularized parabolic
    3. unregularized 5th-order polynomial
    4. regularized 5th-order polynomial (RIDGE)

RIDGE and LASSO

(image)

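The comparison image is not recoverable; in standard notation (with λ ≥ 0 a regularization weight, an assumed symbol), RIDGE penalizes the squared L2 norm of the parameters while LASSO penalizes the L1 norm:

```latex
J_{\text{ridge}}(\theta) = \sum_{i=1}^{N} \left( h_\theta(x^{(i)}) - y^{(i)} \right)^2 + \lambda \lVert \theta \rVert_2^2,
\qquad
J_{\text{lasso}}(\theta) = \sum_{i=1}^{N} \left( h_\theta(x^{(i)}) - y^{(i)} \right)^2 + \lambda \lVert \theta \rVert_1
```
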
Problem Definition of Regularized Regression

(image)

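As a sketch, the regularized problem is usually first posed in constrained form: minimize the residual subject to a budget t on the parameter norm (t is assumed notation):

```latex
\min_{\theta} \; \sum_{i=1}^{N} \left( h_\theta(x^{(i)}) - y^{(i)} \right)^2
\quad \text{subject to} \quad \lVert \theta \rVert_2^2 \le t
```
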
Unconstrained Version of the Problem

(image)

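Via a Lagrange multiplier λ, the constraint folds into the objective, giving the unconstrained penalized form that the lab actually solves:

```latex
\min_{\theta} \; \sum_{i=1}^{N} \left( h_\theta(x^{(i)}) - y^{(i)} \right)^2 + \lambda \lVert \theta \rVert_2^2
```
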
Closed-form Solution of the RIDGE Problem

(image)

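A minimal numpy sketch of the RIDGE closed form θ = (XᵀX + λI)⁻¹Xᵀy; the parameter name lam is illustrative, and for simplicity this version penalizes the bias term as well, which implementations often avoid:

```python
import numpy as np

def ridge(X, y, lam=1.0):
    """Closed-form RIDGE solution: solve (X^T X + lam * I) theta = X^T y.
    The added lam * I keeps the system well conditioned (and invertible)."""
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)
```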

Lab 3 : Feed Forward Neural Network

notebook link

Tasks

  • Implementing an FFNN for a classification problem
  • Back Propagation with Gradient Descent

About Training

(image)

Model of 2-Layered FFNN

(image)

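The model image isn't recoverable; a sketch consistent with the gradient headings below, assuming V maps the input to the hidden layer, W maps the hidden layer to the output, and σ is a sigmoid:

```latex
z = \sigma(Vx), \qquad \hat{y} = \sigma(Wz)
```
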
Gradient Descent of 2-Layered FFNN

Update Rule of FFNN

(image)

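A sketch of the standard gradient-descent update with learning rate α and training error E (symbols assumed):

```latex
W \leftarrow W - \alpha\,\frac{\partial E}{\partial W},
\qquad
V \leftarrow V - \alpha\,\frac{\partial E}{\partial V}
```
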
Gradient of W

(image)

Gradient of V

(image)

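Combining the model, the update rule, and the two gradients above, a compact numpy sketch of one training step for a 2-layered sigmoid FFNN with squared error; all names and shapes are illustrative:

```python
import numpy as np

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

def ffnn_step(x, y, V, W, lr=0.1):
    """One gradient-descent step for a 2-layered FFNN.
    x: (d,) input, y: (k,) target, V: (h, d), W: (k, h)."""
    # Forward pass
    z = sigmoid(V @ x)                 # hidden activations
    y_hat = sigmoid(W @ z)             # output activations
    # Backward pass (squared error E = 0.5 * ||y_hat - y||^2)
    delta_out = (y_hat - y) * y_hat * (1 - y_hat)   # output-layer error
    grad_W = np.outer(delta_out, z)                 # dE/dW
    delta_hid = (W.T @ delta_out) * z * (1 - z)     # error pushed back to hidden
    grad_V = np.outer(delta_hid, x)                 # dE/dV
    # Parameter update
    return V - lr * grad_V, W - lr * grad_W
```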

Lab 4 : Feed Back Neural Network (Recurrent Neural Network)

notebook link

Tasks

  • Back Propagation
  • Resilient Propagation
  • Gradient Clipping

Elman Model of RNN

(image)

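The model image isn't recoverable; a sketch of the Elman recurrence consistent with the weight names used below (Vx maps the current input, Vf feeds the previous hidden state f_{t−1} back in; σ is the activation, and the output layer is omitted since its notation isn't recoverable):

```latex
f_t = \sigma\left(V_x x_t + V_f f_{t-1}\right)
```
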
Gradient Descent of RNN

Update Rule of RNN

(image)

Gradient of Vx

(image)

Gradient of Vf

(image)

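The two gradient images above aren't recoverable, but their essential structure is worth stating: backpropagation through time unrolls the recurrence, so the gradients with respect to Vf and Vx accumulate products of Jacobians across time steps. A sketch, where a_j = V_x x_j + V_f f_{j−1} denotes the pre-activation at step j:

```latex
\frac{\partial f_t}{\partial f_{t-k}}
= \prod_{j=t-k+1}^{t} \operatorname{diag}\!\left(\sigma'(a_j)\right) V_f
```

These repeated factors of V_f are what shrink or blow up the gradient over long sequences, which is exactly the issue addressed next.
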
Issue : Gradient Vanishing / Explosion

(image)

Resilient Propagation : Accelerate / Slow down steps

(image)

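A minimal sketch of resilient propagation in one common variant (iRprop−): only the sign of the gradient is used, and each weight's step size grows while the sign is stable and shrinks when it flips. The hyperparameter values are the commonly cited defaults, an assumption here:

```python
import numpy as np

def rprop_update(w, grad, prev_grad, step,
                 eta_plus=1.2, eta_minus=0.5,
                 step_min=1e-6, step_max=50.0):
    """One Rprop update. w, grad, prev_grad, step are arrays of equal shape.
    Returns the updated weights, the gradient to remember, and the new steps."""
    sign_change = grad * prev_grad
    # Same sign: accelerate. Opposite sign: slow down and skip this update.
    step = np.where(sign_change > 0, np.minimum(step * eta_plus, step_max), step)
    step = np.where(sign_change < 0, np.maximum(step * eta_minus, step_min), step)
    grad = np.where(sign_change < 0, 0.0, grad)   # suppress update after a flip
    w = w - np.sign(grad) * step
    return w, grad, step
```
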
Gradient Clipping : Prevent Explosion

(image)

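A one-function sketch of norm-based gradient clipping; the threshold name is illustrative:

```python
import numpy as np

def clip_gradient(grad, max_norm=5.0):
    """Rescale the gradient so its L2 norm never exceeds max_norm."""
    norm = np.linalg.norm(grad)
    if norm > max_norm:
        grad = grad * (max_norm / norm)
    return grad
```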

Lab 5 : Unsupervised Learning : K-means & PCA

notebook links

  1. K-means A : Clustering some synthetic data
  2. K-means B : Clustering some real data
  3. PCA A : Reducing the dimension of some synthetic data
  4. PCA B : Reducing the dimension of some real data

Tasks

  • K-means
  • PCA

K-means

Difference between Classification and Clustering

(image)

Algorithm for K-means

(image)

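A compact numpy sketch of Lloyd's algorithm for K-means, with random initial centers; the names are illustrative:

```python
import numpy as np

def kmeans(X, k, n_iters=100, seed=0):
    """X: (N, d) data, k: number of clusters.
    Returns (centers, labels) after alternating assignment/update steps."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(n_iters):
        # Assignment step: nearest center for every point
        dists = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # Update step: each center moves to the mean of its points
        new_centers = np.array([X[labels == j].mean(axis=0) if np.any(labels == j)
                                else centers[j] for j in range(k)])
        if np.allclose(new_centers, centers):
            break
        centers = new_centers
    return centers, labels
```
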
PCA

What Is a 'Principal Component'?

(image)

Algorithm for PCA

(image)

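A minimal numpy sketch of PCA via eigendecomposition of the sample covariance matrix; n_components is an assumed parameter name:

```python
import numpy as np

def pca(X, n_components=2):
    """X: (N, d) data. Returns (projected data, principal components).
    Components are the top eigenvectors of the sample covariance matrix."""
    X_centered = X - X.mean(axis=0)
    cov = X_centered.T @ X_centered / (len(X) - 1)
    eigvals, eigvecs = np.linalg.eigh(cov)           # ascending eigenvalues
    components = eigvecs[:, ::-1][:, :n_components]  # top-variance directions
    return X_centered @ components, components
```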

Lab 7 : Generative Model : Naive Bayes

notebook link

Tasks

  • Spam Mail Detector with Naive Bayes Classifier

Discriminative Model and Generative Model

  • Discriminative model
    • learns the conditional probability distribution p(y|x)
    • learns p(y|x) directly from the data and then tries to classify the data
    • generally gives better performance on classification tasks
  • Generative model
    • learns the joint probability distribution p(x, y)
    • learns p(x, y), which can later be transformed into p(y|x) to classify the data
    • p(x, y) can also be used to generate new data similar to the existing data

Naive Bayes Classifier

Prediction Criterion

(image)

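The criterion image isn't recoverable; as a sketch, a Naive Bayes classifier predicts the class maximizing the posterior under the conditional-independence assumption on the features x₁, …, x_d:

```latex
\hat{y} = \arg\max_{y} \; p(y) \prod_{j=1}^{d} p(x_j \mid y)
```
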
Model Parameters

(image)

Principle of the Maximum Likelihood Estimation (MLE)

(image)

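Under MLE these parameters are simply empirical frequencies; for a binary word feature x_j in the spam task (notation assumed, not the notebook's):

```latex
\hat{p}(y) = \frac{\#\{\text{mails with label } y\}}{\#\{\text{mails}\}},
\qquad
\hat{p}(x_j = 1 \mid y) = \frac{\#\{\text{mails with label } y \text{ containing word } j\}}{\#\{\text{mails with label } y\}}
```
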
Issue : Divide by Zero

(image)

Laplace Smoothing : a Kind of Regularization

(image)

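A sketch of the smoothed estimate: adding a pseudo-count per outcome keeps every probability strictly positive, so an unseen word can no longer zero out the product (or produce a division by zero). Names are illustrative:

```python
import numpy as np

def smoothed_word_probs(X, y, cls, alpha=1.0):
    """Laplace-smoothed estimate of p(x_j = 1 | y = cls).
    X: (N, d) binary word-occurrence matrix, y: (N,) labels,
    alpha: pseudo-count (alpha = 1 is classic Laplace smoothing)."""
    X_cls = X[y == cls]
    # (word count + alpha) / (mail count + 2 * alpha) for binary features
    return (X_cls.sum(axis=0) + alpha) / (len(X_cls) + 2 * alpha)
```

In practice the classifier would also sum log-probabilities instead of multiplying raw probabilities, to avoid numerical underflow on long mails.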