'coerce' still ignores 'nullable': True #427

Open
andreymal opened this issue Jul 26, 2018 · 9 comments

@andreymal

#269 was fixed, but only for the integer type:

>>> v = cerberus.Validator({'foo': {'coerce': int, 'nullable': True, 'type': 'integer'}})
>>> v.normalized({'foo': None})
{'foo': None}

Just replace integer with string:

>>> v = cerberus.Validator({'foo': {'coerce': str, 'nullable': True, 'type': 'string'}})
>>> v.normalized({'foo': None})

Expected behavior: {'foo': None}

Actual behavior: {'foo': 'None'}

So I still have to use a workaround 'coerce': lambda x: str(x) if x is not None else None
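
For illustration, here is that workaround spelled out in a full schema. A minimal sketch, assuming Cerberus 1.x semantics as shown above:

import cerberus

v = cerberus.Validator({'foo': {
    'type': 'string',
    'nullable': True,
    # only coerce non-None values so that None survives normalization
    'coerce': lambda x: str(x) if x is not None else None,
}})

assert v.normalized({'foo': None}) == {'foo': None}   # None is preserved
assert v.normalized({'foo': 1}) == {'foo': '1'}       # other values are stringified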

But now I'm not sure if it's a bug or a feature (a duplicate of #142?). If this is a feature, please describe this behavior in the documentation.

@funkyfuture
Member

this behaviour is to be expected as str(None) is not failing. i'd say a custom coercer is the way to go.

would it be the better tactic to not coerce values that are None on fields constrained with nullable: True?

@funkyfuture
Member

sorry for not being very verbose on this. i gave it some thought:

the behaviour you describe is to be expected because the current implementation works like this:
when a coercion fails on a field that has nullable set to True and the failure is due to a TypeError (like int(None) would raise), that failure is ignored - no error is logged and the input value is kept in the document. there's actually no check whether the input value of the failed coercion is None.
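
in plain Python, the difference the current rule hinges on looks like this (illustration only):

try:
    int(None)    # raises TypeError, which is swallowed for nullable fields
except TypeError:
    pass

str(None)        # returns 'None', no failure, so the stringified value is kept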

to comply with what i understand as your requirement, a simpler and clearer rule would be: when a field has the value None and the schema allows this with a positively defined nullable, the value would not get coerced at all.
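
a rough sketch of that rule, illustrative only and not the actual normalization code:

def maybe_coerce(value, field_rules):
    # proposed: skip coercion entirely for a nullable field holding None
    # (assumes a 'coerce' rule is present in field_rules)
    if value is None and field_rules.get('nullable', False):
        return value
    return field_rules['coerce'](value)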

my concern with that second approach is that there would be no way to coerce a None value of a field that is eventually allowed to be None. that may be the case when the coercion routine depends on the state of other data (document contents or configuration), which may or may not result in a value other than None.
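
for illustration, a hypothetical coercer along the lines of that concern (the names and the fallback source are made up):

import cerberus

FALLBACK_TIMEZONE = 'UTC'   # imagine this comes from application configuration

def resolve_timezone(value):
    # even a None input may be resolved to a concrete value here
    return FALLBACK_TIMEZONE if value is None else str(value)

v = cerberus.Validator({'tz': {'type': 'string', 'nullable': True,
                               'coerce': resolve_timezone}})
v.normalized({'tz': None})   # current behaviour: {'tz': 'UTC'}
                             # with the proposed rule: {'tz': None}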

a good counter-argument against my concern would be that such resolutions should be handled with a default_setter. the problem is that these are never applied to fields with a positively defined nullable.

i haven't looked into the implications of changing that, but my gut tells me that this would be the clearest concept:

  • default_setters don't ignore nullable fields
  • but coerce does

does this make sense? would that break current uses?

@andreymal
Author

there would be no way to coerce a None value of a field that is eventually allowed to be None

Good argument

Perhaps the current behavior is good, but please describe it more explicitly in the documentation to avoid confusion in the future :)

does this make sense?

Dunno

@funkyfuture
Member

okay, i investigated the 'cleaner' solution and it is doable. (if anyone wants to take a peek, i can open a pr.)

would that break current uses?

yes. or maybe; i'm not sure.

it breaks the tests test_default_setter_none_nullable, test_default_none_nonnullable and test_default_setter_none_nonnullable around here. they were introduced with #200, and i can't really decipher the reasoning behind these. @dkellner, it'd be super helpful if you could have a look at this issue and give us your opinion.

@dkellner
Contributor

dkellner commented Aug 5, 2018

@funkyfuture The reasoning behind test_default_none_nullable and test_default_setter_none_nullable is that if a field is nullable and a document already has None set for this field, then normalization (with a default value set) should not change this.

So in the test the document is always {'foo': 'foo_value', 'bar': None} and this should not change after normalization, because 'bar' already contains a valid value.

As for the other two, test_default_none_nonnullable and test_default_setter_none_nonnullable: I'm sure the tests are wrong. The helper function _test_default_none_nonnullable never gets called, and the naming suggests it was supposed to be used by those two tests. It probably should be something along these lines:

def _test_default_none_nonnullable(default):
    bar_schema = {'type': 'string', 'nullable': False}
    bar_schema.update(default)
    schema = {'foo': {'type': 'string'}, 'bar': bar_schema}
    document = {'foo': 'foo_value', 'bar': None}
    expected = {'foo': 'foo_value', 'bar': 'bar_value'}
    assert_normalized(document, expected, schema)
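
For illustration only, the two tests would then presumably be wired up like this (the exact arguments are my guess, not taken from the suite):

def test_default_none_nonnullable():
    _test_default_none_nonnullable({'default': 'bar_value'})

def test_default_setter_none_nonnullable():
    _test_default_none_nonnullable({'default_setter': lambda doc: 'bar_value'})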

But it's surely a matter of definition whether, in this case of an initially invalid field value ('bar': None), the defaulting mechanism should kick in.

@funkyfuture
Member

cool, after fixing the tests, only test_default_none_nullable is failing.

i still would argue that the default setter should be called in any case and it is up to itself to consider a field's value and other constraints like nullable. however, it can't atm, because it doesn't even know which field it is called for (#279). so, that would require a breaking change that is planned already, hence we shouldn't break anything else.

the heat is really making everything a little harder, and delegating seems a good strategy to not mix things up ;-). i'll see to it that i work out an issue description for Cerberus 2.

@andreymal would you propose an update of the documentation to describe the situation as it is now?

@dkellner would you provide a bug fix for Cerberus 1.2 and 1.3? here is what i did:

# after the imports, also usable by test_default_setter_existent
def must_not_be_called(*args, **kwargs):
    raise RuntimeError('This shall not be called.')

…

@mark.parametrize('default',
                  ({'default': 'bar_value'}, 
                   {'default_setter': must_not_be_called})
                  )
def test_default_none_nullable(default):
    bar_schema = {'type': 'string', 'nullable': True}
    bar_schema.update(default)
    schema = {'foo': {'type': 'string'}, 'bar': bar_schema}
    document = {'foo': 'foo_value', 'bar': None}
    assert_normalized(document, document.copy(), schema)


@mark.parametrize('default',
                  ({'default': 'bar_value'},
                   {'default_setter': lambda doc: 'bar_value'})
                  )
def test_default_none_not_nullable(default):
    bar_schema = {'type': 'string', 'nullable': False}
    bar_schema.update(default)
    schema = {'foo': {'type': 'string'}, 'bar': bar_schema}
    document = {'foo': 'foo_value', 'bar': None}
    expected = {'foo': 'foo_value', 'bar': 'bar_value'}
    assert_normalized(document, expected, schema)

@dkellner
Contributor

@funkyfuture You're delegating to Vietnam here, so much for the heat ;-). I will take a look at it and get back to you by Wednesday.

@dkellner
Contributor

@funkyfuture Just to be sure, you want the behaviour you described before, right? So far I've refactored the tests using mark.parametrize and fixed test_default_none_not_nullable as we discussed before.

  • default_setters don't ignore nullable fields
  • but coerce does

@funkyfuture
Member

not sure which of the previously described behaviours you mean. ;-) anyway, no, for now we shouldn't change any of that. just fix your tests (and possibly the code to comply w/ them).
