
Firestore mapper - expected (plain | mongo) in argument?

See original GitHub issue

I’m trying to build a classToFirestore and firestoreToClass mapping. However, the call to deleteExcludedPropertiesFor only takes a “mongo” or “plain” target, and this constraint extends to exclude. So I’m not sure how best to handle this, other than dropping deleteExcludedPropertiesFor, which would mean all the excluded fields would be retained.

Or should I fork the whole project?

Advice would be really appreciated!

Issue Analytics

  • State: closed
  • Created: 4 years ago
  • Comments: 9 (5 by maintainers)

Top GitHub Comments

1 reaction
marcj commented on Aug 6, 2020

@leandro-manifesto sorry, forgot to answer you.

Is this a proper way to do it?

Yes, you can do it like that.

Also, is it necessary to register converters for the basic types (boolean, number, string, array and object) as well?

Only if the firestore driver itself returns different types that need to be translated to JS types. I guess the driver already returns boolean as Boolean, number as Number, etc., so there’s nothing to do here.

1 reaction
marcj commented on Aug 6, 2020

It’s done, we got massive performance improvements and I decoupled a lot of stuff. I’m actually very happy about that 😃

The way you add new serialization targets is now easier as well. Basically, for every type that needs to be cast you write a little compiler template. Here is all that is needed for MongoDB support:

https://github.com/marcj/marshal.ts/blob/master/packages/mongo/src/compiler-templates.ts

See the documentation of the functions used here: https://github.com/marcj/marshal.ts/blob/master/packages/core/src/compiler-registry.ts
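To make the idea concrete, here is a minimal, self-contained sketch of the compiler-template pattern. This is not marshal’s real API: the `registerTemplate` and `compileSerializer` names and the string-keyed registry below are invented for illustration. Each template emits one line of JS for one property, and the compiler joins them into a single generated function:

```typescript
// Illustrative model of the compiler-template idea (not marshal's API).
// A template produces one line of generated code for one property.
type Template = (setter: string, accessor: string) => string;

const registry = new Map<string, Template>();

// Register a code template for converting one type from `from` to `to`.
function registerTemplate(from: string, to: string, type: string, tpl: Template) {
    registry.set(`${from}:${to}:${type}`, tpl);
}

// Build one serializer function for a schema of {propertyName: typeName}.
function compileSerializer(from: string, to: string, schema: Record<string, string>) {
    const lines = Object.keys(schema).map(name => {
        const tpl = registry.get(`${from}:${to}:${schema[name]}`);
        if (!tpl) throw new Error(`no template for type ${schema[name]}`);
        return tpl(`result.${name}`, `input.${name}`);
    });
    // All per-property templates are joined into one generated function,
    // so serialization runs without per-call type dispatch.
    return new Function('input', `const result = {}; ${lines.join('\n')} return result;`) as
        (input: any) => any;
}

registerTemplate('class', 'firebase', 'date', (s, a) => `${s} = ${a}.toISOString();`);
registerTemplate('class', 'firebase', 'string', (s, a) => `${s} = ${a};`);

const serialize = compileSerializer('class', 'firebase', {createdAt: 'date', title: 'string'});
const out = serialize({createdAt: new Date(0), title: 'hello'});
// out.title === 'hello', out.createdAt === '1970-01-01T00:00:00.000Z'
```

The real compiler registry works on `PropertyCompilerSchema` objects and supports extra context (as the `moment` example below shows), but the compile-to-one-function principle is the same.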

There you’ll see code like:

//my-compiler-templates.ts
import {moment, PropertyCompilerSchema, registerConverterCompiler} from '@marcj/marshal';

// class -> firebase: unwrap a Moment instance into a plain Date
registerConverterCompiler('class', 'firebase', 'moment', (setter: string, accessor: string, property: PropertyCompilerSchema) => {
    return `${setter} = ${accessor}.toDate();`;
});

// firebase -> class: wrap the stored Date back into a Moment instance;
// `moment` is passed as context so the generated code can reference it
registerConverterCompiler('firebase', 'class', 'moment', (setter: string, accessor: string, property: PropertyCompilerSchema) => {
    return {
        template: `${setter} = moment(${accessor});`,
        context: {moment}
    };
});

This code registers a new compiler template for serializing the type moment from class to firebase. Since firebase does not have Moment.js support, for example, you need this compiler. You need a couple of those, especially for arrayBuffer and all typed arrays. Mongo stores those as Binary, and firebase probably has a custom binary abstraction. See how mongo did it for examples.
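As a sketch of what such a binary compiler template would ultimately generate, here is the kind of conversion logic involved, assuming the target has no native binary type and you fall back to a textual encoding (hex here for brevity; base64 would be more compact). The helper names are invented for illustration and are not part of marshal or firebase:

```typescript
// What a typed-array compiler template might emit on each side of the
// conversion: encode bytes to a string for storage, decode on the way back.
function typedArrayToHex(v: Uint8Array): string {
    return Array.from(v).map(b => ('0' + b.toString(16)).slice(-2)).join('');
}

function hexToTypedArray(s: string): Uint8Array {
    const out = new Uint8Array(s.length / 2);
    for (let i = 0; i < out.length; i++) {
        out[i] = parseInt(s.slice(i * 2, i * 2 + 2), 16);
    }
    return out;
}

const original = new Uint8Array([1, 2, 255]);
const stored = typedArrayToHex(original);     // what would be written to the record
const restored = hexToTypedArray(stored);     // what the deserializer would rebuild
```

If the firebase driver does expose a native binary abstraction, the templates would call into that instead of string-encoding, just as the mongo templates wrap bytes in Binary.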

For class you can take the ready-made default compilers:

//my-compiler-templates.ts
import {compilerConvertClassToX, compilerXToClass, registerConverterCompiler} from '@marcj/marshal';
registerConverterCompiler('class', 'firebase', 'class', compilerConvertClassToX('firebase'));
registerConverterCompiler('firebase', 'class', 'class', compilerXToClass('firebase'));

Here’s a list of types you probably want to cover, depending on which types the firebase driver supports:

  • moment
  • arrayBuffer (maybe firebase driver supports ArrayBuffer already, dunno)
  • all typed arrays (maybe firebase driver supports ArrayBuffer already, dunno)
  • class (use ready to use compiler)
  • uuid (only if firebase supports binary uuids; otherwise just ignore it, it will be stored as a string)

In the best case you only have 4 compilers registered (2x moment and 2x class).

Please only define registerConverterCompiler('firebase', 'class', type, ...) and registerConverterCompiler('class', 'firebase', type, ...). plainToFirebase and firebaseToPlain should then work through classToFirebase and firebaseToClass. For example, here is how mongo does it:

export function plainToMongo<T>(classType: ClassType<T>, target: { [k: string]: any }): any {
    return classToMongo(classType, plainToClass(classType, target));
}

It’s faster to do it this way than to manually specify every possible compiler template for plain -> firebase.
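The composition trick can be sketched generically: instead of writing a second full set of templates for the plain -> firebase direction, you chain the two conversions you already have through the class stage. The standalone functions and shapes below are hypothetical stand-ins, not the marshal API:

```typescript
// Illustrative only: three representations of the same record.
interface Plain { createdAt: string }           // e.g. ISO string from JSON
interface Entity { createdAt: Date }            // the class representation
interface FirebaseRecord { createdAt: number }  // e.g. epoch millis in the store

// The two conversions you actually maintain templates for:
const plainToClass = (p: Plain): Entity => ({createdAt: new Date(p.createdAt)});
const classToFirebase = (e: Entity): FirebaseRecord => ({createdAt: e.createdAt.getTime()});

// plain -> firebase falls out for free by composing through the class stage,
// exactly as plainToMongo above composes classToMongo with plainToClass.
const plainToFirebase = (p: Plain): FirebaseRecord => classToFirebase(plainToClass(p));

const rec = plainToFirebase({createdAt: '1970-01-01T00:00:01.000Z'});
// rec.createdAt === 1000
```

The cost is one intermediate class instance per conversion, which the comment above argues is cheaper than maintaining the quadratic number of direct template pairs.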


You can then provide small wrapper functions around the core to make autoloading of your compiler templates possible and give users a clear API. Here’s how the mongo implementation does it:

https://github.com/marcj/marshal.ts/blob/master/packages/marshal-mongo/src/mapping.ts

Your wrapper functions would look like this:

//mapper.ts
import './my-compiler-templates';
import {ClassType, createClassToXFunction, createXToClassFunction, getClassName} from '@marcj/marshal';

export function firebaseToClass<T>(classType: ClassType<T>, record: any, parents?: any[]): T {
    return createXToClassFunction(classType, 'firebase')(record, parents);
}

export function classToFirebase<T>(classType: ClassType<T>, instance: T): any {
    if (!(instance instanceof classType)) {
        throw new Error(`Could not classToFirebase since target is not a class instance of ${getClassName(classType)}`);
    }
    return createClassToXFunction(classType, 'firebase')(instance);
}

