Parametric Machine Learning

A parametric machine learning algorithm has a fixed, finite number of parameters. For example, in linear regression we only need to find the slope (m) and the intercept (c); whenever the model is fully described by such a fixed set of parameters, we call it a parametric machine learning algorithm.

We can also say an algorithm is parametric when it relies on assumptions about the data, such as the data being normally distributed, the absence of multicollinearity, and so on.
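
To make this concrete, here is a minimal sketch (assuming NumPy and scikit-learn are available, with made-up data) showing that a trained linear regression is summarized entirely by its two parameters, the slope and the intercept:

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Made-up data where y is roughly 3*x + 2 plus noise
rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=(50, 1))
y = 3 * X[:, 0] + 2 + rng.normal(0, 0.5, size=50)

model = LinearRegression().fit(X, y)

# Everything the model has learned is captured by these two numbers
print("slope (m):", model.coef_[0])        # close to 3
print("intercept (c):", model.intercept_)  # close to 2
```

However much training data we add, the model never needs more than these two parameters.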

Some more examples of parametric machine learning algorithms include:

  • Logistic Regression
  • Linear Discriminant Analysis
  • Linear Regression

Benefits of Parametric Machine Learning Algorithms:

  • Simpler: These algorithms are easy to understand and interpret; we only need to estimate the parameters, so there is no black box.
  • Speed: Parametric models are very fast to learn from data.

Limitations of Parametric Machine Learning Algorithms:

  • Constrained: By choosing a functional form, these methods are highly constrained to that specified form.
  • Huge data: These methods are better suited to simpler problems and smaller datasets than to huge, complex ones.
  • Poor fit: Parametric algorithms are prone to underfitting, because their fixed form gives them more bias (see the sketch after this list).
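
As a small illustration of that bias, the sketch below (again assuming NumPy and scikit-learn, with synthetic data) asks a straight line to fit a sine-shaped relationship; because the chosen form cannot bend, the fit is poor even on the training data:

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Synthetic non-linear data: y follows a sine wave, not a straight line
rng = np.random.default_rng(1)
X = np.linspace(0, 4 * np.pi, 200).reshape(-1, 1)
y = np.sin(X[:, 0]) + rng.normal(0, 0.1, size=200)

line = LinearRegression().fit(X, y)

# The straight line cannot follow the oscillation, so even the
# training R^2 is low; this is classic underfitting.
print("training R^2:", round(line.score(X, y), 3))
```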

Nonparametric Machine Learning Algorithms:

Algorithms that do not make strong assumptions about the form of the mapping function are called nonparametric machine learning algorithms. By not making assumptions (such as normally distributed data, linearity, and so on), they are free to learn any functional form from the training data.
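
One way to see the difference is that a nonparametric model does not commit to a fixed number of parameters: its internal structure can keep growing as more data arrives. The sketch below (assuming NumPy and scikit-learn, with synthetic data) fits a decision tree to samples of increasing size and prints how many nodes each tree ends up with:

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(2)

for n in (50, 500, 5000):
    # Synthetic non-linear data: y = sin(x) + noise
    X = rng.uniform(0, 2 * np.pi, size=(n, 1))
    y = np.sin(X[:, 0]) + rng.normal(0, 0.1, size=n)

    tree = DecisionTreeRegressor(random_state=0).fit(X, y)

    # The tree keeps growing as the data grows, unlike linear
    # regression, which always has exactly two parameters.
    print(f"n = {n:5d} -> tree nodes: {tree.tree_.node_count}")
```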

Some examples of popular nonparametric machine learning algorithms are:

  • Decision Trees, like CART and C4.5
  • Naive Bayes
  • Support Vector Machines
  • Neural Networks

Benefits of Nonparametric Machine Learning Algorithms:

  • Flexibility: Capable of fitting a large number of functional forms.
  • Power: No assumptions (or weak assumptions) about the underlying function.
  • Performance: Can result in higher-performance models for prediction.

Limitations of Nonparametric Machine Learning Algorithms:

  • More data: Require a lot more training data to estimate the mapping function.
  • Slower: A lot slower to train, as they often have far more parameters to estimate.
  • Overfitting: More at risk of overfitting the training data, and it is harder to explain why specific predictions are made (see the sketch below).
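
To illustrate the overfitting risk, here is a minimal sketch (assuming scikit-learn, with a synthetic noisy classification dataset) comparing an unconstrained decision tree against one whose depth is limited:

```python
from sklearn.datasets import make_moons
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Synthetic, noisy two-class data
X, y = make_moons(n_samples=1000, noise=0.35, random_state=3)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=3
)

# An unconstrained tree memorizes the training set almost perfectly...
deep = DecisionTreeClassifier(random_state=3).fit(X_train, y_train)
print("deep tree    train:", round(deep.score(X_train, y_train), 3),
      "test:", round(deep.score(X_test, y_test), 3))

# ...while a depth-limited tree usually shows a much smaller gap
# between training and test accuracy.
shallow = DecisionTreeClassifier(max_depth=3, random_state=3).fit(X_train, y_train)
print("shallow tree train:", round(shallow.score(X_train, y_train), 3),
      "test:", round(shallow.score(X_test, y_test), 3))
```

In a typical run the unconstrained tree scores close to 100% on the training data but noticeably lower on unseen data, which is exactly the overfitting described above.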
