This work attempts to generalize a stock-forecasting neural network using Bayesian regularization, so that predictions can be made without overfitting the model — a particular concern in today's highly volatile markets.
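As an aside on why Bayesian regularization combats overfitting: placing a Gaussian prior on the weights makes the MAP estimate of a linear model coincide with ridge regression. The sketch below (pure NumPy on synthetic data; the variances `sigma2` and `tau2` are illustrative assumptions, not values from this project) checks that equivalence numerically.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 2))
y = X @ np.array([1.0, -1.0]) + rng.normal(scale=0.1, size=50)

sigma2 = 0.1 ** 2   # assumed observation-noise variance (illustrative)
tau2 = 1.0          # assumed Gaussian prior variance on the weights

# Posterior mean of the weights under the prior N(0, tau2 * I)
w_post = np.linalg.solve(X.T @ X / sigma2 + np.eye(2) / tau2,
                         X.T @ y / sigma2)

# Ridge regression with the equivalent penalty alpha = sigma2 / tau2
alpha = sigma2 / tau2
w_ridge = np.linalg.solve(X.T @ X + alpha * np.eye(2), X.T @ y)
```

Algebraically the two linear systems differ only by a factor of `sigma2`, so `w_post` and `w_ridge` agree to machine precision: the Bayesian prior and the ridge penalty are the same regularizer in different clothing.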
Classification using logistic regression, implemented as a neural network model. This project also compares model performance under different regularization techniques.
The objective is to build and tune various classification models and find the best one for identifying failures, so that a generator can be repaired before it breaks down and the overall maintenance cost of the generators can be brought down.
In linear regression, regularization is the process of making the model more regular, i.e. simpler, by shrinking the model coefficients toward zero (or, with an L1 penalty, to exactly zero), ultimately to address overfitting.
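The shrinkage effect can be seen directly from the closed-form ridge solution w = (XᵀX + αI)⁻¹Xᵀy: as the penalty strength α grows, the coefficient norm decreases. A minimal sketch (pure NumPy on synthetic data, not code from any of the repositories above):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
true_w = np.array([3.0, -2.0, 1.0])
y = X @ true_w + rng.normal(scale=0.1, size=100)

def ridge_fit(X, y, alpha):
    """Closed-form ridge estimate (X^T X + alpha*I)^-1 X^T y."""
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + alpha * np.eye(d), X.T @ y)

# Coefficient norm shrinks monotonically as the penalty alpha increases;
# alpha = 0 recovers ordinary least squares.
norms = [np.linalg.norm(ridge_fit(X, y, a)) for a in (0.0, 1.0, 100.0)]
```

With α = 0 the fit is ordinary least squares and the norm is close to ‖true_w‖ ≈ 3.74; at α = 100 the coefficients are pulled noticeably toward zero.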
Repository with implementations of algorithms used in numerical analysis, from the solution of determined and overdetermined systems to regularization and non-linear least-squares problems.
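To illustrate the overdetermined case mentioned above: a system Ax ≈ b with more equations than unknowns generally has no exact solution, so one takes the least-squares solution; a small Tikhonov (ridge) term gives a regularized variant that also handles ill-conditioned A. A sketch with made-up data (not this repository's code):

```python
import numpy as np

# 4 equations, 2 unknowns: fit b ~ x0 + x1 * t for t = 1..4
A = np.array([[1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0],
              [1.0, 4.0]])
b = np.array([2.1, 3.9, 6.0, 8.1])   # noisy observations of b ≈ 2t

# Plain least squares: minimizes ||A x - b||_2
x_ls, *_ = np.linalg.lstsq(A, b, rcond=None)

# Tikhonov regularization: minimizes ||A x - b||^2 + lam * ||x||^2
lam = 1e-6   # small illustrative parameter
x_tik = np.linalg.solve(A.T @ A + lam * np.eye(2), A.T @ b)
```

For this well-conditioned toy problem the two solutions agree closely (slope ≈ 2.01, intercept ≈ 0); the Tikhonov term matters when AᵀA is near-singular.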
This repository contains a Python implementation of linear regression, logistic regression, and ridge regression algorithms. These algorithms are commonly used in machine learning and statistical modeling for various tasks such as predicting numerical values, classifying data into categories, and handling multicollinearity in regression models.
Preprocessed the USPS dataset, implemented and compared different network architectures and optimization techniques, applied regularization techniques such as ensembling and dropout, performed adversarial training to evaluate network robustness, and evaluated network performance using metrics such as accuracy, precision, and recall.
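Of the regularization techniques listed above, dropout is easy to sketch in a few lines: during training each activation is zeroed with probability p, and the survivors are scaled by 1/(1 − p) ("inverted dropout") so the expected activation is unchanged and no rescaling is needed at test time. A minimal NumPy sketch (illustrative, not the project's implementation):

```python
import numpy as np

def dropout(a, p, rng, training=True):
    """Inverted dropout: zero each activation with probability p
    during training, scaling survivors by 1/(1-p)."""
    if not training or p == 0.0:
        return a                         # identity at test time
    mask = rng.random(a.shape) >= p      # keep with probability 1-p
    return a * mask / (1.0 - p)

rng = np.random.default_rng(0)
a = np.ones((4, 1000))
out = dropout(a, p=0.5, rng=rng)
# Each entry of `out` is either 0 or 2; the mean stays close to 1.
```

At test time (`training=False`) the layer is the identity, which is what makes the inverted-scaling variant convenient.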