File using poly variants compiles in Reason but does not compile in ReScript

This compiles fine in Reason/OCaml:

let f: unit => [ | `A] = () => `A;
let g: unit => [ | `B] = () => `B;

let h = b =>
  if (b) {
    (f(): [ | `A] :> [> | `A]);
  } else {
    (g(): [ | `B] :> [> | `B]);
  };

but does not compile in ReScript:

let f: unit => [#A] = () => #A
let g: unit => [#B] = () => #B

let h = b =>
  if b {
    (f(): [#A] :> [> #A])
  } else {
    (g(): [#B] :> [> #B])
  }
Type Errors
[E] Line 8, column 5:
This has type: [#B]
  Somewhere wanted: [#A]
  These two variant types have no intersection

Interesting example! I think the Reason translation of the code is subtly different though. If you check the literal Reason translation of this ReScript sample, it has some extra parentheses:

let h = b =>
  if (b) ((f(): [`A]) :> [> `A]) else ((g(): [`B]) :> [> `B]);

Voilà: this version also gives the same error, even in Reason syntax. The ReScript syntax has subtly different precedence and associativity for parsing the type annotation and upcast symbols; by default it interprets them as (exp: TYPE1) :> TYPE2, whereas Reason and OCaml interpret them as exp: (TYPE1 :> TYPE2). The problem is there's no way to tell ReScript to do the latter as well; parentheses around the (TYPE1 :> TYPE2) are a parse error.

This could plausibly be filed as a syntax issue. In the meantime it should be fairly simple to work around, by not using type annotations. The compiler’s type inference is good enough to just figure out what the correct types should be:

let f = () => #A
let g = () => #B
let h = b => if b { f() } else { g() }

If you need annotations for documentation purposes, it’s better to learn the rules of polymorphic variant type inference. E.g., in this case, the correct way to write the types would be:

module M: {
  let f: unit => [> #A]
  let g: unit => [> #B]
  let h: bool => [> #A | #B]
} = {
  let f = () => #A
  let g = () => #B
  let h = b => if b { f() } else { g() }
}
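
For instance (just a small usage sketch to go with the module above), a caller can still match on the composed result as usual:

// Matching both tags closes the open bound at this use site.
let describe = b =>
  switch M.h(b) {
  | #A => "got A"
  | #B => "got B"
  }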

Well, I should have specified that the ReScript code is the result of automatic conversion using bsc -format ...

Ah, in that case definitely something that should be reported as an issue.

My original (not-reduced) use case is more complicated and includes module interfaces/functors/composable error handling etc with functions f and g being callbacks for certain stages of the pipeline.

Currently (I might be wrong) it seems that going the way of constraints would be an invitation to a “world of hurt”. I am conscientiously trying not to use bounds for those functions (sorry for the poor explanation, I'm still working on understanding the ergonomics).

I understand, but I would still recommend learning the rules of how polymorphic variant types are inferred from definitions. Composable error handling with polyvariants only works when they have open bounds, so that the compiler can compose them. If they don't, you can end up with ugly (and brittle) code that is difficult to understand and change. With open-bounded polyvariants it's still somewhat difficult to understand, but at least the code isn't densely packed with symbols and is a bit easier to read.
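
For example (a minimal sketch with made-up names), two functions whose error variants are inferred with open bounds compose automatically; the compiler infers the union of their error tags for the caller:

let parse = s =>
  switch Belt.Int.fromString(s) {
  | Some(n) => Ok(n)
  | None => Error(#ParseError)
  }

let validate = n =>
  if n >= 0 {
    Ok(n)
  } else {
    Error(#Negative(n))
  }

// Inferred (roughly): string => result<int, [> #Negative(int) | #ParseError]>
let run = s =>
  switch parse(s) {
  | Ok(n) => validate(n)
  | Error(_) as e => e
  }

Neither parse nor validate needs to know about the other's error tags; the open bounds do the composing.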

I think my problem with your suggestion, and with inference in general, is that it propagates into function parameter types and then conflicts with fixed module signatures:

module A: {
  let f: (unit => [> #A | #B]) => unit
} = {
  let f1 = () => #B  // Try changing to #C

  let f = (action: unit => [> #A]) => {
    let b = f1()
    let _c = if true {
      action()
    } else {
      b
    }
  }
}


 Signature mismatch:
  ...
  Values do not match:
    let f: (unit => [> #A | #C]) => unit
  is not included in
    let f: (unit => [> #A | #B]) => unit

This is working as intended. Try removing all the type annotations and then checking the actual inferred types of the functions. According to the ReScript Playground,

let f1: unit => [> #B]
let f: (unit => [> #B]) => unit

After changing f1 to return #C,

let f1: unit => [> #C]
let f: (unit => [> #C]) => unit

The inferred types depend on the actual polymorphic variant values used in the functions. If we give type annotations that do not agree with the actual inferred types, the compiler will of course give a type error. This is what I meant by:

I would still recommend learning the rules of how polymorphic variant types are inferred from definitions.

In ReScript you can’t really change the types the compiler infers from the actual function bodies by giving a type annotation. The inferred type is always treated as the correct type, and if your annotated type ‘agrees’ with the inferred type then the compiler allows it, otherwise it raises a type error.
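
To illustrate (a small sketch, not from your code): an annotation that only widens the open bound still ‘agrees’ with the inferred type, while a conflicting closed annotation does not:

// Inferred on its own: unit => [> #B]. The wider open bound below still
// unifies with it, so this annotation is accepted.
let tag: unit => [> #B | #C] = () => #B

// A closed [#C] has no intersection with the inferred [> #B],
// so this would be rejected with a type error:
// let bad: unit => [#C] = () => #B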

I understand that this is intended, but that is exactly the problem: the module signatures keep changing. Changing the implementation of f1 forces us to change a lot of things we should not have to touch. Hence the need for a stable signature and a cast in my original example.
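
(Just to illustrate what I mean by a stable signature, here is a hypothetical variant of my example above where the callback's error type is pinned in the interface, so changing the implementation of f1 no longer ripples into the signature:)

module A: {
  let f: (unit => [#A | #B]) => unit
} = {
  // The annotation fixes the type here: changing the body to #A is fine,
  // while changing it to #C becomes an error at this definition, not in the signature.
  let f1: unit => [#A | #B] = () => #B

  let f = (action: unit => [#A | #B]) => {
    let b = f1()
    let _c = if true {
      action()
    } else {
      b
    }
  }
}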

I am not sure I am adequately expressing my point; maybe a simplified example will help:

handler.resi


// Signature of this should not be changed by changes in `handle`
module Authorizer: {
  module Error: {type t = [ | `Exn(exn) | `NotAuthorized(Js.Json.t)];};
  type t = Express.Request.t => result(Request.authorization, Error.t);
};

let handle:
  (
    t,
    ~authorizer: Authorizer.t=?,
    ...
    unit
  ) =>
  Js.Promise.t(Express.complete);
};