Talk:Hyperparameter optimization

From Wikipedia, the free encyclopedia
This is an old revision of this page, as edited by Matumio (talk | contribs) at 20:45, 3 September 2019 (Random search: new section). The present address (URL) is a permanent link to this revision, which may differ significantly from the current revision.

About DeepSwarm

At least for deep neural networks, I realize that this article is now partially conceptually obsolete, in that some modern tools can optimize both the architecture and the hyperparameters simultaneously, with the caveat that this combined optimization doesn't apply to transfer learning. That said, to the extent that we're maintaining a separation between the hyperparameter optimization and neural architecture search articles, the preferred location for DeepSwarm would definitely be the latter. I will try to add some prominent software of this kind to that article, including DeepSwarm. Meanwhile, I need an academic reference for DeepSwarm, preferably one listed in its readme. --Acyclic (talk) 23:09, 8 May 2019 (UTC)[reply]

Random search

The section about random search says "Main article: Random search". Is this link actually correct? The linked article discusses the Rastrigin function, as if that were the established meaning of the term "random search". (Maybe it is; I don't know.) But the statement on the current page that "[Random Search] replaces the exhaustive enumeration of all combinations by selecting them randomly" seems to contradict the algorithm described in the linked article. Which one is it? --Matumio (talk) 20:45, 3 September 2019 (UTC)[reply]
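For context, the definition quoted from the article ("replaces the exhaustive enumeration of all combinations by selecting them randomly") can be sketched as follows. This is a minimal illustration, not code from either article; the objective function and search space are hypothetical placeholders standing in for a real train-and-validate loop.

```python
import random

# Hypothetical objective: in practice this would train a model with the
# given hyperparameters and return a validation loss to be minimized.
def objective(hparams):
    return (hparams["lr"] - 0.01) ** 2 + ((hparams["batch_size"] - 64) ** 2) * 1e-6

# Hypothetical discrete search space; grid search would enumerate all
# 5 * 4 = 20 combinations, random search samples only n_trials of them.
search_space = {
    "lr": [0.001, 0.003, 0.01, 0.03, 0.1],
    "batch_size": [16, 32, 64, 128],
}

def random_search(objective, space, n_trials, seed=0):
    rng = random.Random(seed)
    best_hparams, best_score = None, float("inf")
    for _ in range(n_trials):
        # Sample one combination at random instead of enumerating them all.
        trial = {name: rng.choice(values) for name, values in space.items()}
        score = objective(trial)
        if score < best_score:
            best_hparams, best_score = trial, score
    return best_hparams, best_score

best, score = random_search(objective, search_space, n_trials=10)
```

This matches the hyperparameter-optimization sense of the term: independent draws from a fixed search space, with no iterative move-and-accept step of the kind the linked Random search article describes.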