
Refocusing (semantics)

From Wikipedia, the free encyclopedia


The semantics of a programming language defines the meaning of programs written in that language. Plotkin's Structural Operational Semantics is a small-step semantics where the meaning of a program is defined step by step, each step being an elementary operation that is carried out with contraction rules.

For example, consider the following minimalistic language of arithmetic expressions over integers with additions and quotients,[1] in the manner of Hutton's razor.

In OCaml:

type operator = Add | Quo;;

type expression = Lit of int | Opr of expression * operator * expression;;

type value = Int of int;;

let expression_of_value (v : value) : expression = match v with Int n -> Lit n;;
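For example, with these declarations the expression (1 + 10) + 100 is represented as a nested Opr tree, and expression_of_value injects a value back into the syntax of expressions (a self-contained sketch repeating the declarations above; the names e and e11 are illustrative):

```ocaml
type operator = Add | Quo;;
type expression = Lit of int | Opr of expression * operator * expression;;
type value = Int of int;;
let expression_of_value (v : value) : expression = match v with Int n -> Lit n;;

(* (1 + 10) + 100 as an abstract-syntax tree *)
let e : expression = Opr (Opr (Lit 1, Add, Lit 10), Add, Lit 100);;

(* injecting the value 11 back into the syntax of expressions *)
let e11 : expression = expression_of_value (Int 11);;
```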

The smallest potentially reducible expressions (potential redexes) are operations over values, and they are carried out with a contraction function that maps an actual redex to an expression and otherwise yields an error message:

type potential_redex = PR of value * operator * value;;

type contractum_or_error = Contractum of expression | Error of string;;

let contract (pr : potential_redex) : contractum_or_error =
  match pr with
    PR (Int n1, Add, Int n2) ->
     Contractum (Lit (n1 + n2))
  | PR (Int n1, Quo, Int n2) ->
     if n2 = 0
     then Error (string_of_int n1 ^ " / 0")
     else Contractum (Lit (n1 / n2));;
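For instance, contracting the actual redex 1 + 10 yields the contractum 11, whereas the potential redex 1 / 0 yields an error message (a self-contained sketch repeating the declarations above; the names c1 and c2 are illustrative):

```ocaml
type operator = Add | Quo;;
type expression = Lit of int | Opr of expression * operator * expression;;
type value = Int of int;;

type potential_redex = PR of value * operator * value;;
type contractum_or_error = Contractum of expression | Error of string;;

let contract (pr : potential_redex) : contractum_or_error =
  match pr with
    PR (Int n1, Add, Int n2) ->
     Contractum (Lit (n1 + n2))
  | PR (Int n1, Quo, Int n2) ->
     if n2 = 0
     then Error (string_of_int n1 ^ " / 0")
     else Contractum (Lit (n1 / n2));;

(* an actual redex contracts to an expression *)
let c1 = contract (PR (Int 1, Add, Int 10));;   (* Contractum (Lit 11) *)

(* dividing by zero is not an actual redex *)
let c2 = contract (PR (Int 1, Quo, Int 0));;    (* Error "1 / 0" *)
```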

The addition of two integers is an actual redex, and so is the quotient of an integer and a nonzero integer. So for example, the expression Opr (Opr (Lit 1, Add, Lit 10), Add, Lit 100), i.e., (1 + 10) + 100, reduces to Opr (Lit 11, Add, Lit 100), i.e., 11 + 100, whereas the expression Opr (Opr (Lit 1, Quo, Lit 0), Add, Lit 100), i.e., (1 / 0) + 100, contains no actual redex: contracting the potential redex PR (Int 1, Quo, Int 0) yields Error "1 / 0", so evaluating this expression goes wrong with the message "1 / 0".

Say that the reduction strategy is leftmost-innermost (i.e., depth first and left to right). The following one-step reduction function implements this strategy:

type value_or_expression_or_stuck =
  Value of value
| Expression of expression
| Stuck of string;;

let rec reduce_d (e : expression) : value_or_expression_or_stuck =
  match e with
    Lit n ->
     Value (Int n)
  | Opr (e1, opr, e2) ->
     match reduce_d e1 with
       Value v1 ->
        (match reduce_d e2 with
           Value v2 ->
            (match contract (PR (v1, opr, v2)) with
               Contractum e ->
                Expression e
             | Error s ->
                Stuck s)
         | Expression e2' ->
            Expression (Opr (expression_of_value v1, opr, e2'))
         | Stuck s ->
            Stuck s)
     | Expression e1' ->
        Expression (Opr (e1', opr, e2))
     | Stuck s ->
        Stuck s;;

In words:

  • a literal reduces to a value;
  • if the expression e1 is stuck, then so is the expression Opr e1 opr e2, for any expression e2;
  • if the expression e1 reduces to an expression e1', then for any expression e2, Opr e1 opr e2 reduces to Opr e1' opr e2;
  • if the expression e1 reduces to a value v1, then
    • if the expression e2 is stuck, then so is the expression Opr e1 opr e2;
    • if the expression e2 reduces to an expression e2', then Opr e1 opr e2 reduces to Opr e1 opr e2';
    • if the expression e2 reduces to a value v2, then Opr e1 opr e2 is a potential redex:
      • if this potential redex is an actual one, then contracting it yields an expression; Opr e1 opr e2 reduces to this expression;
      • if this potential redex is not an actual one, then Opr e1 opr e2 is stuck.

Evaluation is achieved by iterated reduction. It yields either a value or an error message:

type result = Normal_form of value | Wrong of string;;

let rec normalize_d (e : expression) : result =
  match reduce_d e with
    Value v ->
     Normal_form v
  | Expression e' ->
     normalize_d e'
  | Stuck s ->
     Wrong s;;
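For example, one reduction step turns (1 + 10) + 100 into 11 + 100, and iterated reduction evaluates it to 111, whereas (1 / 0) + 100 goes wrong (a self-contained sketch collecting the definitions above; the names s1, r1, and r2 are illustrative):

```ocaml
type operator = Add | Quo;;
type expression = Lit of int | Opr of expression * operator * expression;;
type value = Int of int;;
let expression_of_value (v : value) : expression = match v with Int n -> Lit n;;

type potential_redex = PR of value * operator * value;;
type contractum_or_error = Contractum of expression | Error of string;;

let contract (pr : potential_redex) : contractum_or_error =
  match pr with
    PR (Int n1, Add, Int n2) -> Contractum (Lit (n1 + n2))
  | PR (Int n1, Quo, Int n2) ->
     if n2 = 0 then Error (string_of_int n1 ^ " / 0") else Contractum (Lit (n1 / n2));;

type value_or_expression_or_stuck =
  Value of value | Expression of expression | Stuck of string;;

(* leftmost-innermost one-step reduction *)
let rec reduce_d (e : expression) : value_or_expression_or_stuck =
  match e with
    Lit n -> Value (Int n)
  | Opr (e1, opr, e2) ->
     (match reduce_d e1 with
        Value v1 ->
         (match reduce_d e2 with
            Value v2 ->
             (match contract (PR (v1, opr, v2)) with
                Contractum e -> Expression e
              | Error s -> Stuck s)
          | Expression e2' -> Expression (Opr (expression_of_value v1, opr, e2'))
          | Stuck s -> Stuck s)
      | Expression e1' -> Expression (Opr (e1', opr, e2))
      | Stuck s -> Stuck s);;

type result = Normal_form of value | Wrong of string;;

(* evaluation as iterated reduction *)
let rec normalize_d (e : expression) : result =
  match reduce_d e with
    Value v -> Normal_form v
  | Expression e' -> normalize_d e'
  | Stuck s -> Wrong s;;

(* one step: (1 + 10) + 100 reduces to 11 + 100 *)
let s1 = reduce_d (Opr (Opr (Lit 1, Add, Lit 10), Add, Lit 100));;

(* iterated reduction: (1 + 10) + 100 evaluates to 111 *)
let r1 = normalize_d (Opr (Opr (Lit 1, Add, Lit 10), Add, Lit 100));;

(* (1 / 0) + 100 goes wrong *)
let r2 = normalize_d (Opr (Opr (Lit 1, Quo, Lit 0), Add, Lit 100));;
```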

This one-step reduction function implements a Structural Operational Semantics for this language of arithmetic expressions. For example, to carry out the reduction step (1 - (5 + 5)) - (2 - 20) -> (1 - 10) - (2 - 20), the reduction function implicitly constructs the following proof tree:

                    -----------
                    5 + 5 -> 10
              ---------------------
              1 - (5 + 5) -> 1 - 10
 -----------------------------------------------
 (1 - (5 + 5)) - (2 - 20) -> (1 - 10) - (2 - 20)

Reformatting this proof tree to emphasize the implicit decomposition yields:

                  -------------------------------------------------->
                ^                 contraction                         |
                |        -----------------------------                |
                |        5 + 5               ->      10               |
    implicit    |    ----------------------------------               | implicit
  decomposition |    1 - (5 + 5)             ->  1 - 10               | recomposition
                |   -----------------------------------------------   |
                |   (1 - (5 + 5)) - (2 - 20) -> (1 - 10) - (2 - 20)   v

A reduction semantics is a small-step operational semantics where the implicit context of a potential redex is made explicit. So one reduction step gives rise to

  1. constructing the context of the redex,
  2. contracting this redex, and
  3. recomposing the context around the contractum to yield the reduct:
  (1 - (5 + 5)) - (2 - 20)  \
 [(1 - (5 + 5)) - (2 - 20)] | explicit
 [[1 - (5 + 5)] - (2 - 20)] | decomposition
 [[1 - [5 + 5]] - (2 - 20)] /
                              contraction
 [[1 - [ 10  ]] - (2 - 20)] \
 [[1 -   10   ] - (2 - 20)] | explicit
 [(1 -   10   ) - (2 - 20)] | recomposition
  (1 -   10   ) - (2 - 20)  /

And pictorially, an arithmetic expression is evaluated in successive steps:

         contract               contract               contract        
       o--------->o           o--------->o           o--------->o      
      /            \         /            \         /            \     
     /    recompose \       /    recompose \       /    recompose \    
    /                \     /                \     /                \   
   / decompose        \   / decompose        \   / decompose        \  
  /                    \ /                    \ /                    \ 
 o--------------------->o--------------------->o--------------------->o
          reduce                 reduce                 reduce        

Transforming the one-step reduction function into Continuation-Passing Style, delimiting the continuation from type value_or_expression_or_stuck -> 'a to type value_or_expression_or_stuck -> value_or_expression_or_stuck, and splitting this delimited continuation into two (one to continue the decomposition and one to recompose) makes it simple to implement the corresponding normalization function:

type value_or_decomposition_cc =
  Val_cc of value
| Dec_cc of potential_redex * (value -> value_or_decomposition_cc) * (expression -> expression);;

let rec decompose_expression_cc (e : expression) (kd : value -> value_or_decomposition_cc) (kr : expression -> expression) : value_or_decomposition_cc =
  match e with
    Lit n ->
     kd (Int n)
  | Opr (e1, opr, e2) ->
     decompose_expression_cc
       e1
       (fun v1 ->
         decompose_expression_cc
           e2
           (fun v2 ->
             Dec_cc (PR (v1, opr, v2), kd, kr))
           (fun e2' ->
             kr (Opr (expression_of_value v1, opr, e2'))))
       (fun e1' ->
         kr (Opr (e1', opr, e2)));;

let decompose_cc (e : expression) : value_or_decomposition_cc =
  decompose_expression_cc e (fun v -> Val_cc v) (fun e' -> e');;

let rec iterate_cc_rb (vod : value_or_decomposition_cc) : result =
  match vod with
    Val_cc v ->
     Normal_form v
  | Dec_cc (pr, kd, kr) ->
     (match contract pr with
        Contractum e ->
         iterate_cc_rb (decompose_cc (kr e))
      | Error s ->     (*^^^^^^^^^^^^^^^^^*)
         Wrong s);;

let normalize_cc_rb (e : expression) : result =
  iterate_cc_rb (decompose_cc e);;

In the underlined code, the contractum is recomposed and the result is decomposed. This normalization function is said to be reduction-based because it enumerates all the reducts in the reduction sequence.

Refocusing

Extensionally, the refocusing thesis is that there is no need to reconstruct the next reduct in order to decompose it in the next reduction step. In other words, these intermediate reducts can be deforested.

Pictorially:

         contract    refocus    contract    refocus    contract        
       o--------->o---------->o--------->o---------->o--------->o------
      /            \         /            \         /            \     
     /    recompose \       /    recompose \       /    recompose \    
    /                \     /                \     /                \   
   / decompose        \   / decompose        \   / decompose        \  
  /                    \ /                    \ /                    \ 
 o                      o                      o                      o

Intensionally, the refocusing thesis is that this deforestation is achieved by continuing the decomposition over the contractum in the current context. [2] [3]

let rec iterate_cc_rf (vod : value_or_decomposition_cc) : result =
  match vod with
    Val_cc v ->
     Normal_form v
  | Dec_cc (pr, kd, kr) ->
     (match contract pr with
        Contractum e ->
         iterate_cc_rf (decompose_expression_cc e kd kr)
      | Error s ->     (*^^^^^^^^^^^^^^^^^^^^^^^^^^^^^*)
         Wrong s);;

let normalize_cc_rf (e : expression) : result =
  iterate_cc_rf (decompose_cc e);;

In the underlined code, the decomposition is continued. This normalization function is said to be reduction-free because it enumerates none of the reducts in the reduction sequence.
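As a sanity check, the reduction-based and reduction-free normalization functions compute the same results on the example expressions (a self-contained sketch collecting the definitions above; the test expressions e_ok and e_ko are illustrative):

```ocaml
type operator = Add | Quo;;
type expression = Lit of int | Opr of expression * operator * expression;;
type value = Int of int;;
let expression_of_value (v : value) : expression = match v with Int n -> Lit n;;

type potential_redex = PR of value * operator * value;;
type contractum_or_error = Contractum of expression | Error of string;;

let contract (pr : potential_redex) : contractum_or_error =
  match pr with
    PR (Int n1, Add, Int n2) -> Contractum (Lit (n1 + n2))
  | PR (Int n1, Quo, Int n2) ->
     if n2 = 0 then Error (string_of_int n1 ^ " / 0") else Contractum (Lit (n1 / n2));;

type result = Normal_form of value | Wrong of string;;

type value_or_decomposition_cc =
  Val_cc of value
| Dec_cc of potential_redex * (value -> value_or_decomposition_cc) * (expression -> expression);;

let rec decompose_expression_cc (e : expression) (kd : value -> value_or_decomposition_cc) (kr : expression -> expression) : value_or_decomposition_cc =
  match e with
    Lit n -> kd (Int n)
  | Opr (e1, opr, e2) ->
     decompose_expression_cc
       e1
       (fun v1 ->
         decompose_expression_cc
           e2
           (fun v2 -> Dec_cc (PR (v1, opr, v2), kd, kr))
           (fun e2' -> kr (Opr (expression_of_value v1, opr, e2'))))
       (fun e1' -> kr (Opr (e1', opr, e2)));;

let decompose_cc (e : expression) : value_or_decomposition_cc =
  decompose_expression_cc e (fun v -> Val_cc v) (fun e' -> e');;

(* reduction-based: recompose the contractum, then decompose the next reduct *)
let rec iterate_cc_rb (vod : value_or_decomposition_cc) : result =
  match vod with
    Val_cc v -> Normal_form v
  | Dec_cc (pr, kd, kr) ->
     (match contract pr with
        Contractum e -> iterate_cc_rb (decompose_cc (kr e))
      | Error s -> Wrong s);;

(* reduction-free: refocus by continuing the decomposition in the current context *)
let rec iterate_cc_rf (vod : value_or_decomposition_cc) : result =
  match vod with
    Val_cc v -> Normal_form v
  | Dec_cc (pr, kd, kr) ->
     (match contract pr with
        Contractum e -> iterate_cc_rf (decompose_expression_cc e kd kr)
      | Error s -> Wrong s);;

let e_ok = Opr (Opr (Lit 1, Add, Lit 10), Add, Lit 100);;   (* (1 + 10) + 100 *)
let e_ko = Opr (Opr (Lit 1, Quo, Lit 0), Add, Lit 100);;    (* (1 / 0) + 100 *)

(* both normalization functions agree on both expressions *)
let r_rb = iterate_cc_rb (decompose_cc e_ok);;
let r_rf = iterate_cc_rf (decompose_cc e_ok);;
let w_rb = iterate_cc_rb (decompose_cc e_ko);;
let w_rf = iterate_cc_rf (decompose_cc e_ko);;
```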

In practice, the two continuations are defunctionalized into traditional first-order, inside-out contexts, which yields an implementation of Felleisen and Hieb's reduction semantics, [4] a small-step semantics that was designed independently of continuations and defunctionalization.
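Such a defunctionalized implementation could look as follows for the example language. The sketch below is illustrative and not taken from the cited sources: the names (context, CTX_mt, CTX_left, CTX_right, decompose_expression, decompose_context, recompose, normalize) are assumptions. It represents contexts inside-out (innermost frame first) and refocuses by continuing the decomposition over the contractum in the current context; recompose is included only to show what the reduction-based variant would use:

```ocaml
type operator = Add | Quo;;
type expression = Lit of int | Opr of expression * operator * expression;;
type value = Int of int;;
let expression_of_value (v : value) : expression = match v with Int n -> Lit n;;

type potential_redex = PR of value * operator * value;;
type contractum_or_error = Contractum of expression | Error of string;;

let contract (pr : potential_redex) : contractum_or_error =
  match pr with
    PR (Int n1, Add, Int n2) -> Contractum (Lit (n1 + n2))
  | PR (Int n1, Quo, Int n2) ->
     if n2 = 0 then Error (string_of_int n1 ^ " / 0") else Contractum (Lit (n1 / n2));;

type result = Normal_form of value | Wrong of string;;

(* first-order, inside-out contexts obtained by defunctionalizing
   the two continuations *)
type context =
  CTX_mt
| CTX_left of operator * expression * context    (* [.] opr e2 *)
| CTX_right of value * operator * context;;      (* v1 opr [.] *)

type value_or_decomposition =
  Val of value
| Dec of potential_redex * context;;

(* search for the leftmost-innermost potential redex *)
let rec decompose_expression (e : expression) (c : context) : value_or_decomposition =
  match e with
    Lit n -> decompose_context c (Int n)
  | Opr (e1, opr, e2) -> decompose_expression e1 (CTX_left (opr, e2, c))
and decompose_context (c : context) (v : value) : value_or_decomposition =
  match c with
    CTX_mt -> Val v
  | CTX_left (opr, e2, c') -> decompose_expression e2 (CTX_right (v, opr, c'))
  | CTX_right (v1, opr, c') -> Dec (PR (v1, opr, v), c');;

(* recompose a context around an expression; only the reduction-based
   variant needs this *)
let rec recompose (c : context) (e : expression) : expression =
  match c with
    CTX_mt -> e
  | CTX_left (opr, e2, c') -> recompose c' (Opr (e, opr, e2))
  | CTX_right (v1, opr, c') -> recompose c' (Opr (expression_of_value v1, opr, e));;

(* reduction-free normalization: decompose the contractum in the
   current context *)
let rec iterate (vod : value_or_decomposition) : result =
  match vod with
    Val v -> Normal_form v
  | Dec (pr, c) ->
     (match contract pr with
        Contractum e -> iterate (decompose_expression e c)
      | Error s -> Wrong s);;

let normalize (e : expression) : result =
  iterate (decompose_expression e CTX_mt);;
```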


Applications

The construction sketched above is completely formalized using the Coq Proof Assistant. [1]

Over the years, this construction has been used to inter-derive calculi and abstract machines. [5] [6] Besides the CEK Machine, the Krivine machine, and the SECD machine, examples also include the chemical abstract machine and abstract machines for JavaScript. [7] [8] [9] Bach Poulsen and Mosses have also used refocusing to implement Structural Operational Semantics and Modular Structural Operational Semantics. [10]

More broadly, refocusing has been used to derive type systems and implementations for coroutines, [11] to go from type checking via reduction to type checking via evaluation, [12] to derive a classical call-by-need sequent calculus, [13] to derive interpretations of the gradually-typed lambda calculus, [14] and for full reduction. [15] [16]


Correctness

Danvy and Nielsen stated conditions for refocusing, and proved them informally. [3] Sieczkowski, Biernacka, and Biernacki formalized refocusing using the Coq Proof Assistant. [17] Bach Poulsen proved the correctness of refocusing for XSOS using rule induction. [18] Biernacka, Charatonik, and Zielinska generalized refocusing using the Coq Proof Assistant. [19] Using Agda, Swierstra proved the refocusing step as part of his formalization of the syntactic correspondence between the λ-calculus with a normal-order reduction strategy and the Krivine machine. [20] Also using Agda, Rozowski proved the refocusing step as part of his formalization of the syntactic correspondence between the λρ̂-calculus with an applicative-order reduction strategy and the CEK machine. [21]


References

  1. ^ a b Danvy, Olivier (2023). A Deforestation of Reducts: Refocusing (Technical report).
  2. ^ Danvy, Olivier; Nielsen, Lasse R. (2001). Syntactic theories in practice. Second International Workshop on Rule-Based Programming (RULE 2001). Vol. 59.4. Electronic Notes in Theoretical Computer Science. doi:10.7146/brics.v9i4.21721.
  3. ^ a b Danvy, Olivier; Nielsen, Lasse R. (2004). Refocusing in reduction semantics (Technical report). BRICS. doi:10.7146/brics.v11i26.21851. RS-04-26.
  4. ^ Felleisen, Matthias; Hieb, Robert (1992). "The Revised Report on the Syntactic Theories of Sequential Control and State". Theoretical Computer Science. 103 (2): 235–271.
  5. ^ Biernacka, Małgorzata; Danvy, Olivier (2007). "A Concrete Framework for Environment Machines". ACM Transactions on Computational Logic. 9 (1). Article #6: 1–30. doi:10.7146/brics.v13i3.21909.
  6. ^ Biernacka, Małgorzata; Danvy, Olivier (2007). "A Syntactic Correspondence between Context-Sensitive Calculi and Abstract Machines". Theoretical Computer Science. 375 (1–3): 76–108. doi:10.7146/brics.v12i22.21888.
  7. ^ Danvy, Olivier; Millikin, Kevin (2008). "A rational deconstruction of Landin's SECD machine with the J operator". Logical Methods in Computer Science. 4 (4): 1–67.
  8. ^ Şerbǎnuţǎ, Traian Florin; Roşu, Grigore; Meseguer, José (2009). "A rewriting logic approach to operational semantics". Information and Computation. 207 (2): 305–340.
  9. ^ Van Horn, David; Might, Matthew (2018). An Analytic Framework for JavaScript (Technical report).
  10. ^ Bach Poulsen, Casper; Mosses, Peter D. (2013). Generating specialized interpreters for modular structural operational semantics. Vol. 8901. Logic-Based Program Synthesis and Transformation, 23rd International Symposium, LOPSTR 2013. pp. 220–236.
  11. ^ Anton, Konrad; Thiemann, Peter (2010). Typing Coroutines. Vol. 6546. Trends in Functional Programming - 11th International Symposium (TFP). pp. 16–30.
  12. ^ Sergey, Ilya (2012). Operational Aspects of Type Systems: Inter-Derivable Semantics of Type Checking and Gradual Types for Object Ownership (Thesis). KU Leuven.
  13. ^ Ariola, Zena M.; Downen, Paul; Nakata, Kieko; Saurin, Alexis (2012). Classical call-by-need sequent calculi: The unity of semantic artifacts. Functional and Logic Programming, 11th International Symposium, FLOPS 2012. Springer. pp. 32–46.
  14. ^ García-Pérez, Álvaro; Nogueira, Pablo; Sergey, Ilya (2014). Deriving interpretations of the gradually-typed lambda calculus. Partial Evaluation and Semantics-Based Program Manipulation (PEPM 2014). pp. 157–168.
  15. ^ Munk, Johan (2007). A study of syntactic and semantic artifacts and its application to lambda definability, strong normalization, and weak normalization in the presence of state (Thesis). Aarhus University.
  16. ^ García-Pérez, Álvaro; Nogueira, Pablo (2014). "On the syntactic and functional correspondence between hybrid (or layered) normalisers and abstract machines". Science of Computer Programming. 95 (2): 176–199.
  17. ^ Sieczkowski, Filip; Biernacka, Małgorzata; Biernacki, Dariusz (2011). Automating derivations of abstract machines from reduction semantics. Implementation and Application of Functional Languages. pp. 72–88.
  18. ^ Bach Poulsen, Casper (2015). Extensible Transition System Semantics (Thesis). Swansea University.
  19. ^ Biernacka, Małgorzata; Charatonik, Witold; Zielinska, Klara (2017). Generalized Refocusing: From Hybrid Strategies to Abstract Machines. Vol. 84. 2nd International Conference on Formal Structures for Computation and Deduction (FSCD 2017). pp. 1–17.
  20. ^ Swierstra, Wouter (2012). From mathematics to abstract machine: A formal derivation of an executable Krivine machine. Proceedings of the Fourth Workshop on Mathematically Structured Functional Programming (MSFP 2012). pp. 163–177.
  21. ^ Rozowski, Wojciech (2013). Formally verified derivation of an executable and terminating CEK machine from call-by-value lambda-p-hat-calculus (Thesis). University of Southampton.