User:Makaylapark/Hyperparameter (machine learning)

From Wikipedia, the free encyclopedia
This is an old revision of this page, as edited by Makaylapark (talk | contribs) at 04:58, 3 May 2022 (clarifying batch size and mini-batch size). The present address (URL) is a permanent link to this revision, which may differ significantly from the current revision.

Hyperparameters can be classified as model hyperparameters, which cannot be inferred while fitting the machine to the training set because they refer to the model selection task, or as algorithm hyperparameters, which in principle have no influence on the performance of the model but affect the speed and quality of the learning process. An example of a model hyperparameter is the topology and size of a neural network. Examples of algorithm hyperparameters are the learning rate and the batch size, as well as the mini-batch size. A batch size can refer to the full training sample, whereas a mini-batch size is a smaller subset of the sample.
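The distinction between a full batch and a mini-batch can be illustrated with a minimal Python sketch (the function name `minibatches` is illustrative only, not from any particular library): the mini-batch size hyperparameter controls how many samples each gradient update sees.

```python
def minibatches(data, batch_size):
    """Split data into consecutive mini-batches of at most batch_size samples."""
    return [data[i:i + batch_size] for i in range(0, len(data), batch_size)]

data = list(range(10))  # a toy training set of 10 samples

# batch_size equal to the dataset size: one full-sample batch per epoch
full_batch = minibatches(data, batch_size=len(data))

# a smaller mini-batch size: several updates per epoch,
# with a final partial batch if the sizes do not divide evenly
mini = minibatches(data, batch_size=4)

print(len(full_batch))          # 1
print([len(b) for b in mini])   # [4, 4, 2]
```

With a full batch, each training epoch yields a single parameter update computed over all samples; with mini-batches, the same epoch yields several noisier but cheaper updates, which is why the mini-batch size affects the speed and quality of learning rather than the model itself.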

Article Draft

Lead

Article body

References