5 Replies Latest reply: Dec 3, 2010 8:47 AM by Jesper Pedersen

Parsing and validation

Stefano Maestri Expert

While writing negative unit test cases for the metadata parsers, I realized we are not throwing an exception when a mandatory element is missing.

We are not validating the XML file either, so even if these elements are defined as mandatory in the XSD, the parser just tries to parse them and in a lot of cases succeeds. The result is (for example) a DataSource without a driver class defined, which is totally broken.

We should of course avoid these situations, and XSD validation is not a solution IMHO, since our metadata can be built not only from XML but also from annotations, and maybe in the future updated from a management console.

 

It's not a small amount of work (even if it's not huge), since all the metadata classes have to be revisited and updated to verify and validate input data in their constructors, throw exceptions, and so on. Maybe it would be better to switch from the current constructor strategy to a builder pattern for metadata that provides validation of inputs, but I have to think about that.
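To make the builder idea concrete, here is a minimal sketch of what such a builder could look like. All class and field names are hypothetical, chosen only to illustrate build() rejecting metadata with missing mandatory elements; this is not IronJacamar's actual metadata API.

```java
// Hypothetical sketch: an immutable metadata class whose builder
// validates mandatory fields before constructing the instance.
// Names are illustrative, not IronJacamar's real metadata API.
class DataSourceMetadata
{
   private final String jndiName;
   private final String driverClass;

   private DataSourceMetadata(Builder b)
   {
      this.jndiName = b.jndiName;
      this.driverClass = b.driverClass;
   }

   String getJndiName() { return jndiName; }
   String getDriverClass() { return driverClass; }

   static class Builder
   {
      private String jndiName;
      private String driverClass;

      Builder jndiName(String v) { this.jndiName = v; return this; }
      Builder driverClass(String v) { this.driverClass = v; return this; }

      /** Fails fast instead of producing a broken DataSource. */
      DataSourceMetadata build()
      {
         if (jndiName == null || jndiName.isEmpty())
            throw new IllegalStateException("jndi-name is mandatory");
         if (driverClass == null || driverClass.isEmpty())
            throw new IllegalStateException("driver-class is mandatory");
         return new DataSourceMetadata(this);
      }
   }
}
```

The advantage over validating in each constructor is that all the checking logic sits in one place (build()), and a half-populated object can never escape.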

 

IMHO it's a top priority task.

 

Opinions?

 

S.

  • 1. Re: Parsing and validation
    Jesper Pedersen Master

    Yes, this is a top priority.

     

We used to have a method that would validate the metadata model after the merge and throw an exception if the constraints defined by the XSD weren't met.

     

The datasource part is easier, since there is only one place where the information can come from (XML), so the parser can be a lot stricter there.

     

As to updates from the management layer: the management facade must verify that an update is correct before sending the value to the instance. We can only do basic checks there, but that should be enough.
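Those "basic checks" at the facade level could be as simple as range checks on the incoming value before forwarding it. A hypothetical sketch, with invented names, just to illustrate the shape of such a guard:

```java
// Hypothetical sketch: a management facade verifying an attribute
// update before passing it on to the live instance. Only a basic
// sanity check, as discussed above; all names are invented.
class PoolManagementFacade
{
   private int blockingTimeoutMillis = 30000;

   /** Rejects obviously invalid values before they reach the instance. */
   void updateBlockingTimeoutMillis(int newValue)
   {
      if (newValue < 0)
         throw new IllegalArgumentException(
            "blocking-timeout-millis must be >= 0");
      this.blockingTimeoutMillis = newValue; // forward to the instance
   }

   int getBlockingTimeoutMillis() { return blockingTimeoutMillis; }
}
```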

     

    Implementation: If a builder pattern can help, go for it. The current requirements for the metadata model can't be changed though.

  • 2. Re: Parsing and validation
    Stefano Maestri Expert

    Hi all,

     

I have just committed the implementation of validation for datasources metadata. You can see the patch here:

     

    https://github.com/maeste/IronJacamar/commit/173b9afbc33f2774e42908b221329dc767f306ca

     

     

There are two main points I'd like to discuss and get your feedback on:

     

1. Validation has been implemented as a validate method AND it is fired on every metadata creation (i.e., as part of the constructor). I think it's better to always have valid metadata, since we are working with immutable objects at the metadata level (at least in 99% of cases). Of course the validate method is public and can also be called by the metadata's clients in case of forced modification of the metadata (the forceXXX methods). Opinions?
2. At the moment I have implemented only the validation needed to respect the XSD specification (more or less just mandatory fields and cardinalities). But of course there are more things we could verify at the code level. Some examples: backgroundValidationMinutes makes no sense if backgroundValidation is false; min-pool-size cannot be greater than max-pool-size; and so on. Or even more complex checks, like verifying that a provided className is at least a valid class name. Since this validation happens very early in the user experience (typically during deployment), I think fine-grained checks would help users a lot by avoiding hard-to-understand runtime exceptions while using datasources/RAs. Of course, these fine-grained checks would need some extra analysis and some extra documentation, but I think we would get great benefits.
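As a concrete illustration of both points, a metadata class could fire a public validate() from its constructor and include the cross-field checks mentioned above. This is a hypothetical sketch whose field names mirror the examples in the post, not the committed implementation:

```java
// Hypothetical sketch: validate() fired at construction time, with
// cross-field checks that go beyond the plain XSD constraints.
// Names mirror the examples above, not the committed code.
class PoolMetadata
{
   private final boolean backgroundValidation;
   private final Long backgroundValidationMinutes; // null = not set
   private final int minPoolSize;
   private final int maxPoolSize;

   PoolMetadata(boolean backgroundValidation, Long backgroundValidationMinutes,
                int minPoolSize, int maxPoolSize)
   {
      this.backgroundValidation = backgroundValidation;
      this.backgroundValidationMinutes = backgroundValidationMinutes;
      this.minPoolSize = minPoolSize;
      this.maxPoolSize = maxPoolSize;
      validate(); // every instance is valid from the start
   }

   /** Public so clients can re-validate after a forceXXX-style change. */
   public void validate()
   {
      if (!backgroundValidation && backgroundValidationMinutes != null)
         throw new IllegalStateException(
            "background-validation-minutes requires background-validation");
      if (minPoolSize > maxPoolSize)
         throw new IllegalStateException("min-pool-size > max-pool-size");
   }
}
```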

     

Feedback is more than welcome; in the meantime I'll provide unit tests for the current validation.

    bye

    S.

  • 3. Re: Parsing and validation
    Jesper Pedersen Master

Validation has been implemented as a validate method AND it is fired on every metadata creation (i.e., as part of the constructor). I think it's better to always have valid metadata, since we are working with immutable objects at the metadata level (at least in 99% of cases). Of course the validate method is public and can also be called by the metadata's clients in case of forced modification of the metadata (the forceXXX methods). Opinions?

     

The validator is also available as a standalone tool, and has Ant and Maven integration. Having the validator as part of our deployment chain ensures that people actually use it before deploying to our container.

     

So all the triggered rules should be collected and summarized in the generated report.

Of course, these fine-grained checks would need some extra analysis and some extra documentation, but I think we would get great benefits.

     

Yes, it would be nice to have a very fine-grained analysis of the deployments. However, I think it is of lower priority than some of our other tasks atm. If we had some sort of framework for these validation rules, it would be a good task for a new contributor.

  • 4. Re: Parsing and validation
    Stefano Maestri Expert

    Jesper Pedersen wrote:

     

Validation has been implemented as a validate method AND it is fired on every metadata creation (i.e., as part of the constructor). I think it's better to always have valid metadata, since we are working with immutable objects at the metadata level (at least in 99% of cases). Of course the validate method is public and can also be called by the metadata's clients in case of forced modification of the metadata (the forceXXX methods). Opinions?

     

The validator is also available as a standalone tool, and has Ant and Maven integration. Having the validator as part of our deployment chain ensures that people actually use it before deploying to our container.

     

So all the triggered rules should be collected and summarized in the generated report.

Oki, but that isn't our validator. Here we are validating our metadata representing the XML or annotations. AFAIK our validator tool is about the javax.* implementations generated from this metadata. IOW, what we have implemented here are minimal checks derived from the XSD constraints, something users could achieve standalone just by validating the XML against our XSDs. Of course we are not enabling XSD validation for performance reasons; we do this validation on the generated metadata instead.
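The "standalone" check users could run themselves is just standard JAXP schema validation, along these lines. This is a generic javax.xml.validation sketch, not IronJacamar code:

```java
// Generic JAXP sketch of the standalone check mentioned above:
// validate an XML document against an XSD. Plain javax.xml.validation,
// not IronJacamar code.
import javax.xml.XMLConstants;
import javax.xml.transform.Source;
import javax.xml.validation.Schema;
import javax.xml.validation.SchemaFactory;
import javax.xml.validation.Validator;

class XsdCheck
{
   /** Throws a SAXException if the document violates the schema. */
   static void validate(Source xml, Source xsd) throws Exception
   {
      SchemaFactory factory =
         SchemaFactory.newInstance(XMLConstants.W3C_XML_SCHEMA_NS_URI);
      Schema schema = factory.newSchema(xsd);
      Validator validator = schema.newValidator();
      validator.validate(xml);
   }
}
```

Doing this on every deployment is exactly the parsing-time cost the metadata-level validation avoids, since the checks there run on the already-built objects.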

     

    Jesper Pedersen wrote:

Of course, these fine-grained checks would need some extra analysis and some extra documentation, but I think we would get great benefits.

     

Yes, it would be nice to have a very fine-grained analysis of the deployments. However, I think it is of lower priority than some of our other tasks atm. If we had some sort of framework for these validation rules, it would be a good task for a new contributor.

    Good point. I have created a JIRA for the community about that:  https://jira.jboss.org/browse/JBJCA-476

    Anyone interested?

     

    bye

    S.

  • 5. Re: Parsing and validation
    Jesper Pedersen Master

    Oki, but it isn't our validator.

     

Ah, ok. I guess we have to define more clearly which type of validation we are talking about where, since we are starting to have a lot of layers that focus on different things.