Talk:Hyperparameter optimization
This article has not yet been rated on Wikipedia's content assessment scale. It is of interest to the following WikiProjects. Please add the quality rating to the {{WikiProject banner shell}} template instead of this project banner. See WP:PIQA for details.
About DeepSwarm
At least for deep neural networks, I realize that this article is now partially conceptually obsolete, in that some modern tools can optimize both the architecture and the hyperparameters simultaneously, with the caveat that this combined optimization doesn't apply to transfer learning. Having said that, to the extent that we're maintaining a separation between the hyperparameter optimization and neural architecture search articles, the preferred location for DeepSwarm would definitely be the latter. I will try to add some prominent software of this kind to that article, including DeepSwarm. Meanwhile, I need an academic reference for DeepSwarm, preferably one listed in its readme. --Acyclic (talk) 23:09, 8 May 2019 (UTC)
Random search
The section about random search says: "Main article: Random search". Is this link actually correct? The linked article talks about Rastrigin, as if that were the established meaning of the term "random search". (Maybe it is. I don't know.) But the statement on the current page says that "[Random Search] replaces the exhaustive enumeration of all combinations by selecting them randomly", which contradicts the algorithm in the linked article, I think. Which one is it? --Matumio (talk) 20:45, 3 September 2019 (UTC)
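For what it's worth, the sense used on the current page (random sampling of hyperparameter combinations in place of exhaustive grid enumeration) can be sketched as follows. This is a minimal illustration only, with a made-up `score` function standing in for actual model training and validation; the parameter names and search space are hypothetical:

```python
import random

# Hypothetical objective: in practice this would train a model with the
# given hyperparameters and return a validation score.
def score(learning_rate, num_layers):
    return -(learning_rate - 0.01) ** 2 - (num_layers - 3) ** 2

search_space = {
    "learning_rate": [0.001, 0.01, 0.1, 1.0],
    "num_layers": [1, 2, 3, 4, 5],
}

random.seed(0)
best, best_score = None, float("-inf")
# Instead of enumerating all 4 * 5 = 20 combinations (grid search),
# evaluate only a fixed budget of randomly sampled combinations.
for _ in range(8):
    params = {name: random.choice(values)
              for name, values in search_space.items()}
    s = score(**params)
    if s > best_score:
        best, best_score = params, s

print(best)
```

This is distinct from the iterative local-search procedure (move to a random nearby point if it improves the objective) described in the linked Random search article, which may be the source of the confusion.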