Hey folks, I’m working on a library that is written in ReScript but intended to be consumed by TS/JS users. My flow is to write ReScript with @genType annotations, build ReScript → TS/JS, then build TS → JS/type definitions.
So far this seems to work 99% of the way through, but I’m hitting an issue with the way inline module
declarations are named in the resulting files:
// MyFile.res
module Node = {
  @genType
  type t = {
    // ...etc
  }

  @genType
  let create = (a, b, c): t => { /* ...etc */ }

  @genType
  let isNode = (a: t): bool => { /* ...etc */ }
}

@genType
let someOtherFunc = (a: Node.t, b: Node.t): unit => { /* ...etc */ }
With a file like this (abbreviated for clarity), running the ReScript build produces .gen.ts files as well as .bs.js files. The .gen.ts files basically provide type-annotated function wrappers that just delegate their calls to functions imported from the corresponding .bs.js file.
An example of this copied verbatim from my generated code output is:
import * as ReconcilerBS__Es6Import from './Reconciler.bs';
const ReconcilerBS: any = ReconcilerBS__Es6Import;
export const Node_isNode: <T1>(a:{ readonly symbol: T1 }) => boolean = ReconcilerBS.Node.isNode;
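For context, a TS consumer hits this through the generated wrapper roughly like so (the import path and argument here are just illustrative):

// consumer.ts (illustrative)
import { Node_isNode } from './Reconciler.gen';

// At runtime this delegates to ReconcilerBS.Node.isNode, which is where it falls over
const ok = Node_isNode({ symbol: Symbol("node") });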
This is where the problem comes in. The generated TS wrapper expects the .bs.js module to export an object named Node on which the function isNode lives, but the .bs.js file actually exports an object called $$Node, not Node:
// Reconciler.bs.js
var $$Node = {
  create: create,
  isNode: isNode
};

export {
  $$Node,
  // ...other stuff
}
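To spell out the mismatch, this is effectively what the generated wrapper does when it’s loaded (failure paraphrased, not verbatim output):

// What Reconciler.gen.ts effectively does at load time (illustrative)
import * as ReconcilerBS__Es6Import from './Reconciler.bs';
const ReconcilerBS: any = ReconcilerBS__Es6Import;

ReconcilerBS.$$Node;       // exists: { create, isNode }
ReconcilerBS.Node;         // undefined: there is no export named Node
ReconcilerBS.Node.isNode;  // TypeError: Cannot read properties of undefined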
So after I’ve compiled ReScript → TS/JS and then TS → JS/type defs, I get runtime errors here because Node.isNode is not a thing. If I manually edit the generated .bs.js to export { $$Node as Node }, then everything works as expected.
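Concretely, the hand edit that makes it work is just aliasing the export, e.g.:

// Reconciler.bs.js, edited by hand after every build (not sustainable)
export {
  $$Node as Node,
  // ...other stuff
}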
Is this a bug? Is there a workaround I can use in the meantime?
Thanks in advance