Customizing both decoding and encoding of a type
I'm considering adopting pydantic in a project that serializes numpy arrays as base64-encoded gzipped strings. The current serialization solution allows registering type-specific hooks for encoding and decoding, so that after the hooks are registered, any field annotated as `numpy.ndarray` is automatically handled correctly:
class Example(object):
    big_array: numpy.ndarray

    def __init__(self, big_array):
        self.big_array = big_array

ex = Example(big_array=numpy.arange(50))
serialized = serialize_to_dict(ex)  # hook registered for numpy.ndarray
print(serialized)
# { 'big_array': 'H4sIAAEAAAAC/xXOyRHCUAwE0VQUgA7MArZjocg/Dfqf+5VG39eOdryTne68dz47186985BOpgsg\nhCCCCCSUYMIZ53MHZ5xxxhlnnHHGBRdcziAuuOCCCy644Iorrriez3DFFVdccX1+f1+8uIe+AAAA\n' }
deserialized = deserialize_dict(serialized, Example)
print(deserialized.big_array)
# array([0, ..., 49])
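For context, the hooks behind a `serialize_to_dict`-style helper might look roughly like the sketch below. The function names are hypothetical; to stay dependency-free it packs a plain list of doubles with the stdlib `struct` module in place of a real numpy array, then gzips and base64-encodes the bytes as the issue describes:

```python
import base64
import gzip
import struct


def encode_array(values):
    """Pack a list of doubles, gzip the bytes, base64-encode the result."""
    raw = struct.pack(f"<{len(values)}d", *values)
    return base64.b64encode(gzip.compress(raw)).decode("ascii")


def decode_array(text):
    """Inverse of encode_array: base64-decode, gunzip, unpack doubles."""
    raw = gzip.decompress(base64.b64decode(text))
    return list(struct.unpack(f"<{len(raw) // 8}d", raw))


values = [float(i) for i in range(50)]
assert decode_array(encode_array(values)) == values
```

With numpy one would use `ndarray.tobytes()` and `numpy.frombuffer` in place of `struct`, but the gzip/base64 layers stay the same.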
I noticed that `Config` has a `json_encoders` field that seems to allow this for encoding, but I haven't seen a way to customize decoding (maybe I'm just too sleep-deprived). Is there a way to achieve the above behavior using pydantic?
Issue Analytics
- State:
- Created: 4 years ago
- Reactions: 23
- Comments: 48 (17 by maintainers)
The main implementation would be more or less that, but it would need to work recursively somehow, so that a field

foobar: List[MyComplexThing]

called `__serialise__` on every member of the list. I'm also concerned about what to do with standard types that might need simplifying for output.

Perhaps we should do something like #317, e.g.:
- a `simplify` kwarg to `dict()`; `simplify=True` causes `pydantic_encoder` to be called recursively on the dict
- `pydantic_encoder` would need modifying to look more like `jsonable_encoder`
- `pydantic_encoder` looks for `__serialise__` and calls it if it exists, thus `model.json()` would work with `__serialise__` without the slowdown of `simplify=True`.

What do you think?
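The recursive behaviour being proposed could be sketched as follows. This is a hypothetical illustration of the idea, not pydantic's actual `pydantic_encoder`: a walker that calls `__serialise__` wherever a value defines it, and recurses into containers so nested lists and dicts are handled:

```python
def simplify(obj):
    """Recursively reduce an object to JSON-safe builtins, calling
    __serialise__ on any value that defines it (hypothetical sketch)."""
    if hasattr(obj, "__serialise__"):
        return simplify(obj.__serialise__())
    if isinstance(obj, dict):
        return {key: simplify(value) for key, value in obj.items()}
    if isinstance(obj, (list, tuple, set)):
        return [simplify(item) for item in obj]
    return obj


class Point:
    def __init__(self, x, y):
        self.x, self.y = x, y

    def __serialise__(self):
        return {"x": self.x, "y": self.y}


print(simplify({"points": [Point(1, 2), Point(3, 4)]}))
# {'points': [{'x': 1, 'y': 2}, {'x': 3, 'y': 4}]}
```

Note the field example above (`foobar: List[MyComplexThing]`) is covered by the list branch, which is exactly the recursion the comment says is needed.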
Looking at that, I'm not sure implementing this will be as simple as initially thought, especially given that performance is important, so we'll probably need a micro-benchmark. Feel free to start a PR; otherwise I'll work on it in a couple of weeks.
Has this gone ahead? I'm interested as well; it seems like a very useful feature that lets custom types come into their own.