
Bayesian program synthesis


In programming languages and machine learning, Bayesian program synthesis (BPS) is a program synthesis technique in which Bayesian probabilistic programs automatically construct new Bayesian probabilistic programs.[1] This approach stands in contrast to routine practice in probabilistic programming, where human developers manually write new probabilistic programs.

Bayesian program synthesis can be compared with work on Bayesian program learning, in which probabilistic program components were hand-written, pre-trained on data, and then hand-assembled in order to recognize handwritten characters.[2]

The framework

Bayesian program synthesis has been described as a framework related to, and built on, probabilistic programming. In BPS, probabilistic programs are generated that are themselves priors over a space of probabilistic programs.[3] This allows new programs to be synthesized more automatically via inference, through the composition of modular component programs.
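
A minimal sketch of this idea, written in Python purely for illustration (the toy two-rule grammar, the function names, and the importance-sampling scheme are assumptions of the example, not the method of the cited system), treats a "prior over programs" as a routine that samples small component model programs, and treats synthesis as Bayesian inference over which sampled program best explains the observed data:

    import math
    import random

    random.seed(0)

    def gaussian_loglik(data, mean_fn, sigma):
        # Log-likelihood of the data under Gaussian noise around mean_fn(i).
        return sum(
            -0.5 * math.log(2 * math.pi * sigma ** 2)
            - (y - mean_fn(i)) ** 2 / (2 * sigma ** 2)
            for i, y in enumerate(data)
        )

    def sample_program():
        # Prior over a toy space of component model programs: each "program"
        # is itself a probabilistic model (a mean function plus Gaussian noise).
        if random.random() < 0.5:
            c = random.gauss(0.0, 5.0)                   # constant-mean component
            return "const", (lambda i, c=c: c)
        a, b = random.gauss(0.0, 1.0), random.gauss(0.0, 5.0)
        return "linear", (lambda i, a=a, b=b: a * i + b) # linear-trend component

    def synthesize(data, n_samples=5000, sigma=1.0):
        # Approximate posterior over program structures by importance sampling:
        # sample programs from the prior and weight each by its data likelihood.
        scored = [(kind, gaussian_loglik(data, fn, sigma))
                  for kind, fn in (sample_program() for _ in range(n_samples))]
        shift = max(ll for _, ll in scored)              # numerical stability
        weights = {"const": 0.0, "linear": 0.0}
        for kind, ll in scored:
            weights[kind] += math.exp(ll - shift)
        total = sum(weights.values())
        return {kind: w / total for kind, w in weights.items()}

    # Data with a clear upward trend: the posterior should favour "linear".
    print(synthesize([0.2, 1.1, 1.9, 3.2, 3.8, 5.1]))

On data with a clear trend, the approximate posterior concentrates on the linear component program, illustrating how inference over a space of programs can select and refine model structure.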

The modularity of BPS allows inference to operate on, and test, smaller probabilistic programs before they are integrated into a larger model.[4]

Bayesian methods and models are frequently used to incorporate prior knowledge. When good prior knowledge can be incorporated into a Bayesian model, effective inference can often be performed with much less data.[5]
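
This point can be illustrated with a standard Beta-Bernoulli conjugate example (a textbook model chosen here for brevity, not one taken from the cited sources): after only five observations, an informative prior already gives a markedly sharper posterior than a flat prior.

    def beta_posterior(alpha, beta, flips):
        # Conjugate Beta-Bernoulli update: add successes and failures to the prior.
        heads = sum(flips)
        return alpha + heads, beta + len(flips) - heads

    flips = [1, 1, 0, 1, 1]                  # only five observations

    flat = beta_posterior(1, 1, flips)       # uninformative Beta(1, 1) prior
    informed = beta_posterior(16, 4, flips)  # prior belief: success rate near 0.8

    for name, (a, b) in (("flat prior", flat), ("informative prior", informed)):
        mean = a / (a + b)
        sd = (a * b / ((a + b) ** 2 * (a + b + 1))) ** 0.5
        print(f"{name}: posterior mean = {mean:.2f}, posterior sd = {sd:.2f}")

With the same five observations, the flat prior yields a posterior mean of about 0.71 with standard deviation about 0.16, while the informative prior yields about 0.80 with standard deviation about 0.08.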

This framework can also be contrasted with the family of automated program synthesis fields, including program synthesis, programming by example, and programming by demonstration. The goal in such fields is to find the best program that satisfies some constraint. In program synthesis, for instance, verification of logical constraints reduces the state space of possible programs, allowing more efficient search for an optimal program. Bayesian program synthesis differs in that the constraints are probabilistic and in that the output is itself a distribution over programs that can be further refined.[5]

See also

References

  1. ^ Saad, Feras A.; Cusumano-Towner, Marco F.; Schaechtle, Ulrich; Rinard, Martin C.; Mansinghka, Vikash K. (January 2019). "Bayesian Synthesis of Probabilistic Programs for Automatic Data Modeling". Proc. ACM Program. Lang. 3 (POPL): 37:1–37:32. doi:10.1145/3290350. ISSN 2475-1421.
  2. ^ Lake, Brenden M.; Salakhutdinov, Ruslan; Tenenbaum, Joshua B. (2015-12-11). "Human-level concept learning through probabilistic program induction". Science. 350 (6266): 1332–1338. doi:10.1126/science.aab3050. ISSN 0036-8075. PMID 26659050.
  3. ^ Wood, Charlie (2017-02-16). "Startup pairs man with machine to crack the 'black box' of neural networks". Christian Science Monitor. ISSN 0882-7729. Retrieved 2017-03-04.
  4. ^ "Talking Machines: Probabilistic programming, with Ben Vigoda | Robohub". robohub.org. Retrieved 2017-03-04.
  5. ^ a b Metz, Cade. "AI's Factions Get Feisty. But Really, They're All on the Same Team". WIRED. Retrieved 2017-03-04.