Bayesian program synthesis
In machine learning, Bayesian Program Synthesis (BPS) is a technique in which Bayesian programs write (synthesize) new Bayesian programs. This is in contrast to the field of probabilistic programming, where humans write new probabilistic (Bayesian) programs.
Bayesian Program Synthesis is a strategy for learning distributions over Bayesian programs. Gamalon, a machine learning company, coined the term to describe its framework for using Bayesian probabilistic programs to learn specialized probabilistic programs from input data.[1]
Bayesian Program Synthesis can be compared to the Bayesian program learning efforts of Lake, Salakhutdinov, and Tenenbaum,[2] in which a probabilistic model learned to recognize handwritten characters in a human-like fashion with high accuracy by simultaneously learning a program for how characters are drawn.[3]
The framework
Bayesian Program Synthesis has been described as a framework related to and utilizing probabilistic programming: the generated probabilistic programs are themselves priors over a space of probabilistic programs.[2] This strategy allows more automatic synthesis of new programs via inference, achieved by composing modular component programs. This modularity allows many individuals to work on and test smaller probabilistic programs before they are integrated into a larger model.[4]
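The idea of a prior over a space of programs can be sketched in miniature. The following toy example (not Gamalon's actual system; all names and probabilities are illustrative assumptions) treats each "program" as a small generative model of coin flips, places a prior over the discrete program space, and computes a posterior over programs from observed data by Bayes' rule:

```python
import math

# Hypothetical component programs: each maps a data point in {0, 1}
# to a log-likelihood under that program's generative model.
def coin_fair(x):
    return math.log(0.5)          # models x as a fair coin

def coin_biased(x):
    return math.log(0.9 if x == 1 else 0.1)  # biased toward heads

programs = {"fair": coin_fair, "biased": coin_biased}
prior = {"fair": 0.7, "biased": 0.3}  # assumed prior over programs

def posterior(data):
    """Bayes' rule over the discrete program space."""
    log_joint = {name: math.log(prior[name]) + sum(p(x) for x in data)
                 for name, p in programs.items()}
    m = max(log_joint.values())                       # for numerical stability
    unnorm = {k: math.exp(v - m) for k, v in log_joint.items()}
    z = sum(unnorm.values())
    return {k: v / z for k, v in unnorm.items()}

# Five heads in a row: the posterior shifts toward the "biased" program.
print(posterior([1, 1, 1, 1, 1]))
```

In a full BPS system the program space is far richer and the output distribution over programs can itself be refined further, but the same inference-over-programs structure applies.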
Bayesian methods and models are frequently used to incorporate prior knowledge. When good prior knowledge can be incorporated into a Bayesian model, effective inference can often be performed with much less data than discriminative approaches that do not make use of structured information about correlations.[5]
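A minimal conjugate-prior sketch illustrates the point above: with an informative Beta prior, a Bernoulli success rate can be estimated sensibly from only a handful of observations (the specific prior parameters below are illustrative assumptions, not taken from the cited work):

```python
def beta_bernoulli_posterior(alpha, beta, data):
    """Update a Beta(alpha, beta) prior with 0/1 observations.

    Returns the posterior (alpha, beta) parameters and the posterior mean.
    """
    heads = sum(data)
    tails = len(data) - heads
    a, b = alpha + heads, beta + tails
    return (a, b), a / (a + b)

# Informative prior centred near 0.8 versus a flat Beta(1, 1) prior,
# each updated with only three observations.
_, informed_mean = beta_bernoulli_posterior(8, 2, [1, 1, 0])
_, flat_mean = beta_bernoulli_posterior(1, 1, [1, 1, 0])
print(informed_mean, flat_mean)
```

With so little data, the informative prior dominates the estimate; a discriminative method with no prior structure would have almost nothing to go on from three samples.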
This framework can also be contrasted with the family of automated program synthesis fields, including program synthesis, programming by example, and programming by demonstration. The goal in such fields is to find the best program that satisfies some constraint. In program synthesis, for instance, verification of logical constraints reduces the state space of possible programs, allowing a more efficient search for an optimal program. Bayesian Program Synthesis differs in that the constraints are probabilistic and the output is itself a distribution over programs that can be further refined.[5]
BPS in practice
While BPS had been studied before, Gamalon released its first mainstream application in February 2017: "Gamalon Match" and "Gamalon Structure," which scour common databases and resolve ambiguities such as different spellings of customer names and addresses.[6]
References
- ^ Knight, Will. "AI software writes, and rewrites, its own code, getting smarter as it does". MIT Technology Review. Retrieved 2017-03-04.
- ^ a b Wood, Charlie (2017-02-16). "Startup pairs man with machine to crack the 'black box' of neural networks". Christian Science Monitor. ISSN 0882-7729. Retrieved 2017-03-04.
- ^ Lake, Brenden M.; Salakhutdinov, Ruslan; Tenenbaum, Joshua B. (2015-12-11). "Human-level concept learning through probabilistic program induction". Science. 350 (6266): 1332–1338. doi:10.1126/science.aab3050. ISSN 0036-8075. PMID 26659050.
- ^ "Talking Machines: Probabilistic programming, with Ben Vigoda | Robohub". robohub.org. Retrieved 2017-03-04.
- ^ a b Metz, Cade. "AI's Factions Get Feisty. But Really, They're All on the Same Team". WIRED. Retrieved 2017-03-04.
- ^ "AI software learns to write its own code". 2017-02-14. Retrieved 2017-03-04.