Dynamic objects for data grid rows

I want to switch my application from JS to ReScript, but there is one obstacle I don’t know how to deal with.
I’m getting a dataset from the backend that contains an array of rows (objects). In fact, I’m getting dozens of different datasets with a variety of columns.
I’m parsing column definitions by calling Object.keys on the first row and applying some options to columns based on some criteria. One parser for all datasets.
So now I want to switch to ReScript, and I have two kinds of problems:

  1. In practice I have too many datasets to create records for every column layout, but I’m not sure what data structure I should use.
    A hash map, Js.Dict, or a list of tuples is not a good fit, as columns can have different data types. I found out about Object, but with Object I’m not sure how to:
  • parse BE data with bs-json as an Object
  • do Object.keys on a ReScript Object? There is Js.Obj.keys, but how do I access the values?
  • check an object value’s type? I want to check, for example, whether the value is a number, and if it is, add a property to the column definition
  • dynamically extend an object?
    With JS I can do it like this:
    const makeObject = (condition1, condition2) => {
      const myObject = {}
      if (condition1) {
        myObject[property1] = x
      }
      if (condition2) {
        myObject[property2] = y
      }
      return myObject
    }
    Is Js.Obj.assign a good idea?
  2. I want to pass the produced column definition to a data grid library (ag-grid in my case). The library accepts an object; I assume a ReScript Object should fit, but if I want to pass it in props, how do I define an Object prop?
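For reference, the JS snippet above could be sketched in ReScript with a Js.Dict, which sidesteps the question of extending an object in place. This is only an illustrative sketch; the option names (`filter`, `minWidth`) and their values are placeholders, not verified ag-grid bindings:

```rescript
// Sketch: building a column definition dynamically with Js.Dict.
// Values are stored as Js.Json.t so mixed value types can coexist.
// The option names here are hypothetical placeholders.
let makeColDef = (~isNumeric, ~isWide) => {
  let colDef = Js.Dict.empty()
  if isNumeric {
    colDef->Js.Dict.set("filter", Js.Json.string("agNumberColumnFilter"))
  }
  if isWide {
    colDef->Js.Dict.set("minWidth", Js.Json.number(200.))
  }
  colDef
}
```

A Js.Dict compiles to a plain JS object, so the result can be handed to JS code as-is.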

Is your backend API giving you some kind of a schema for the data? If not, get it to give you a schema, because that’s really the best way to handle this. With a schema, say, JSON Schema, you can map it to AG Grid column definitions and accurately construct a grid for any given data.

Without a schema, things are more difficult. Yes, you can check the runtime type of the data you are getting, using the built-in Js.Types.classify (you don’t need bs-json). But it’s not a great solution because it only gives you the encoding of the data in terms of JSON types, not the intended type of the data. E.g., it’s normal practice to encode BigDecimals as strings over the wire. By not knowing the actual schema and just checking the JSON runtime type, you would think that the value is a string, when it’s really meant to be a BigDecimal.
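To illustrate, here is a minimal sketch of that runtime classification (for already-parsed JSON there is the analogous Js.Json.classify):

```rescript
// Sketch: inspecting the runtime type of an arbitrary value
// with the built-in Js.Types.classify.
let describe = value =>
  switch Js.Types.classify(value) {
  | JSNumber(n) => "number: " ++ Js.Float.toString(n)
  | JSString(s) => "string: " ++ s
  | JSTrue | JSFalse => "boolean"
  | JSNull | JSUndefined => "nothing"
  | JSObject(_) => "object"
  | JSFunction(_) => "function"
  | JSSymbol(_) => "symbol"
  }
```

As noted above, this only tells you the JSON-level encoding, not the intended type.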


Could you show us an example of rows and columns?

So basically it’s a list of objects with the exact same interface. For example:
{name: "mike", age: 100}
but it also might be:
{company: "EvilCorp", tag: "XXX", phone: "11-111-111-111", value: 300}
or it might be anything else.

That would be perfect; unfortunately, I’m not in charge of the BE, and I think it won’t happen anytime soon. It would be a bit of a pity not to switch to ReScript only because of this.
Theoretically, I can write a chunk of JS code to parse the schema myself and write a binding with an interface somewhat like `'a => List<GridSchema>`. I don’t know if it will solve my problem, but at least I have a place to start :grinning_face_with_smiling_eyes:
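As a starting point, such a mapper could be sketched like this. The `gridSchema` shape and the ag-grid option names are assumptions for illustration, not an actual contract:

```rescript
// Hypothetical sketch: a schema entry as it might come from the BE,
// and a mapper to ag-grid-style column definition objects.
type gridSchema = {name: string, kind: string}

let toColumnDefs = (schema: list<gridSchema>) =>
  schema
  ->Belt.List.map(({name, kind}) =>
    {
      "field": name,
      "sortable": true,
      "filter": kind == "number" ? "agNumberColumnFilter" : "agTextColumnFilter",
    }
  )
  ->Belt.List.toArray
```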

To me, the data looks like a Map type where the index (key) is a string and the values have their own type:

module Value = {
  type t =
    | VString(string)
    | VInt(int)
    | VPhone(string)
    // ...more cases as needed
}

If there are only ever a few variants, use a variant type anyway. You can also consider the case where some of the columns are common to all rows, while other data is satellite and only present in some cases. There, using a record for the common columns and a Map (or Js.Json.t) for the auxiliary satellite data might come in handy. Again, if the satellite data is limited in variance, use a variant type for it.
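A sketch of decoding one JSON row into such a Map, assuming the `Value` module above (mapping JSON numbers to `VInt` by truncation, and falling back to a string, purely for illustration):

```rescript
// Sketch: decode one JSON object into a string-keyed Map of Value.t.
let decodeRow = (json: Js.Json.t): Belt.Map.String.t<Value.t> =>
  switch Js.Json.classify(json) {
  | JSONObject(dict) =>
    dict
    ->Js.Dict.entries
    ->Js.Array2.map(((key, v)) => {
      let value = switch Js.Json.classify(v) {
      | JSONString(s) => Value.VString(s)
      | JSONNumber(n) => Value.VInt(n->Belt.Int.fromFloat)
      | _ => Value.VString(Js.Json.stringify(v)) // fallback for the sketch
      }
      (key, value)
    })
    ->Belt.Map.String.fromArray
  | _ => Belt.Map.String.empty
  }
```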

The dataset is similar to a typical Wide Column Store. Hence, I would store it as such, for the most part.

Getting it to the JS side is then either a conversion on the ReScript side, or a small shim on the JS side doing the conversion into something ag-grid would like to see.
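For instance, the conversion on the ReScript side could look roughly like this, assuming the `Value` module from above:

```rescript
// Sketch: flatten a decoded row back into a plain JS object
// (a Js.Dict compiles to a regular object) for the grid's row data.
let toJsRow = (row: Belt.Map.String.t<Value.t>): Js.Dict.t<Js.Json.t> => {
  let dict = Js.Dict.empty()
  row->Belt.Map.String.forEach((key, value) =>
    switch value {
    | Value.VString(s) | Value.VPhone(s) => dict->Js.Dict.set(key, Js.Json.string(s))
    | Value.VInt(i) => dict->Js.Dict.set(key, Js.Json.number(i->Belt.Int.toFloat))
    }
  )
  dict
}
```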