Forcing `None` on load and skipping `None` on dump
I’m using 2.0.0rc2 to validate input data on HTTP requests and to dump SQLAlchemy models to JSON on HTTP responses, and I’ve stumbled upon two problems.
First, while loading data from JSON on an HTTP PUT request, I want to populate all missing fields with None so that the corresponding columns are correctly overwritten in SQLAlchemy. Right now I’m using the following code:
for name, field in schema.fields.items():  # .iteritems() exists only in Python 2
    if field.missing is ma.missing:  # ma.missing is marshmallow's sentinel
        field.missing = None
It works, but I suspect it’s fragile, since I’m mutating the marshmallow.Field instances attached to the Schema class. After the Schema instance is disposed of, all the fields we patched will stay stuck with the new missing value instead of the default one.
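For illustration, the desired load-time behaviour can be sketched without marshmallow at all: every declared field absent from the input comes back as None. (`load_with_none` is a hypothetical helper for this example, not part of any library.)

```python
def load_with_none(field_names, data):
    # Hypothetical helper: any declared field absent from the input
    # is populated with None, so a subsequent SQLAlchemy update
    # overwrites stale column values instead of skipping them.
    return {name: data.get(name) for name in field_names}

payload = {"id": 1, "name": "Ann"}
row = load_with_none(["id", "name", "email"], payload)
# row == {"id": 1, "name": "Ann", "email": None}
```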
Second, while dumping data from SQLAlchemy to JSON, all missing fields are resolved as None, and the JSON is populated with {"key": null} entries. This is unwanted behaviour, and I’m cleaning them up in a post_dump hook:
@post_dump
def clean_missing(self, data):
    # Materialize the key list first: popping from the dict while
    # iterating over it raises RuntimeError on Python 3.
    for key in [key for key in data if data[key] is None]:
        data.pop(key)
    return data
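Outside of the marshmallow hook, the same cleanup reduces to a single dict comprehension. (`strip_none` is a hypothetical helper name for this sketch.)

```python
def strip_none(data):
    # Build a new dict, keeping only keys whose value is not None,
    # so the serialized JSON omits null entries entirely.
    return {key: value for key, value in data.items() if value is not None}

strip_none({"id": 1, "email": None, "name": "Ann"})
# → {"id": 1, "name": "Ann"}
```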
As before, this works, but it requires creating a BaseSchema class which passes this logic down to all inheriting schemas.
I’ve searched the documentation for a while and didn’t find a proper way to get these behaviours, i.e. skip None fields on dumping and populate missing fields with None on loading. Am I missing something, or does marshmallow not provide such options?
Issue Analytics
- State:
- Created 8 years ago
- Reactions: 1
- Comments: 7 (4 by maintainers)
Top GitHub Comments
What is wrong with creating a BaseSchema? This is a common usage pattern with marshmallow; you’ll often want shared behavior across all your schemas. You can use the newly-introduced on_bind_field hook to override the missing attribute, so your BaseSchema would look something like:

Is there a way to pass additional arguments to nested schemas when they’re ‘self’?