
Validation

Validation functions check whether information has been entered correctly. You will surely be familiar with the irritating prompts on websites such as “Please enter your date of birth!” Even if this tends to get on users’ nerves, sound data validation saves time, and ultimately money, later in the workflow.

Data validation. In computer science, data validation is the process of ensuring that data have undergone data cleansing to ensure they have data quality, that is, that they are both correct and useful. It uses routines, often called “validation rules”, “validation constraints” or “check routines”, that check for correctness, meaningfulness, and security of data that are input to the system. (Source: Wikipedia)

Validation takes place on several levels in software products (the first three levels are illustrated in a brief code sketch after the list):

  • Validation of field entries (“The entered value ‘13.13.2016’ is not a valid date.”): By checking field values, you ensure that a data value can be interpreted later on in the software.
  • Validation at dataset level (“The start of the holiday must be earlier than the end of the holiday.”): Dataset checks ensure that the values entered in a single record are consistent with one another. No other datasets are taken into account at this level.
  • Validation on tables/dataset groups (“The service car has already been reserved for 13.12.2016.”): By checking at table level, you can prevent datasets of the same type (here: reservations) from conflicting with one another or being filled in incorrectly.
  • Validation of the domain context (“The service car cannot be reserved for 13.12.2016 as the leasing agreement will have expired by this date.”): These more complex checks link parts of the business process with one another. They may therefore be easier for users to understand, but they are harder to implement in the software itself, because the relevant data is spread across different masks or individual datasets are hard to spot among large volumes.
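To make these levels more concrete, here is a minimal Python sketch of the first three. All function names, the date format and the in-memory reservation list are hypothetical placeholders, not part of any real product:

```python
from datetime import date, datetime

def validate_field_date(text: str) -> date:
    # Field level: can the entered value be interpreted as a date at all?
    try:
        return datetime.strptime(text, "%d.%m.%Y").date()
    except ValueError:
        raise ValueError(f"The entered value '{text}' is not a valid date.")

def validate_dataset_holiday(start: date, end: date) -> None:
    # Dataset level: are the values within one record consistent with each other?
    if start >= end:
        raise ValueError("The start of the holiday must be earlier than the end of the holiday.")

def validate_table_reservation(new_day: date, existing_reservations: list[date]) -> None:
    # Table level: does the new record conflict with existing records of the same type?
    if new_day in existing_reservations:
        raise ValueError(f"The service car has already been reserved for {new_day:%d.%m.%Y}.")
```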

As you can see, domain validation has a great deal of potential. Content-related connections within the business domain can be verified, and errors avoided as a result. But it is not always easy to identify the right areas for such checks. The obvious approach is to search for possible sources of input errors together with users and developers. Is a software function complex, does it require a lot of data, or is it difficult for users to understand? If so, it could be a good candidate for checks. Think about what kinds of errors can be made, and formalize them. The resulting check functions will save a number of employees a lot of time: the testers, the test users and later on, of course, all the users in the company. Use test and introductory phases to identify further requirements once the checks are in place. Whenever an error occurs that can be traced back to wrong data, ask yourself: can we guarantee that this error will be identified by automated checks?

Often it is useful to distinguish between errors and warnings here. An error means: “A calculation is not feasible until this has been corrected!” A warning means: “There could be an error, but this may also be an intended situation.” It should be possible to hide warnings so that they do not disrupt the further process if the situation is indeed intended.
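One way to formalize this distinction is to attach a severity to every check result, so that errors block the calculation while acknowledged warnings can be hidden. The following is a minimal sketch under these assumptions; the class and function names are illustrative, not an actual OPTANO API:

```python
from dataclasses import dataclass
from enum import Enum

class Severity(Enum):
    ERROR = "error"      # must be corrected before a calculation is feasible
    WARNING = "warning"  # may be a mistake, but could also be intended

@dataclass
class CheckResult:
    severity: Severity
    message: str
    acknowledged: bool = False  # the user has confirmed that a warning is intended

def can_calculate(results: list[CheckResult]) -> bool:
    # Errors always block the calculation; warnings never do.
    return not any(r.severity is Severity.ERROR for r in results)

def visible_results(results: list[CheckResult]) -> list[CheckResult]:
    # Acknowledged warnings can be hidden; errors always stay visible.
    return [r for r in results
            if r.severity is Severity.ERROR or not r.acknowledged]
```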

Examples of checks in the field of network planning (the first of these is sketched in code after the list):

  • Does the network structure enable each market to be supplied with the products that are in demand there?
  • Can every line (each production site) manufacture the products which its successors (e.g. markets or other production sites) need?
  • Has it been defined which components are to be used for each product?
  • Can each component required at a production site be supplied?
  • Is the production capacity determined for a production site at zero over a longer period of time? (Short intervals without production capacity can, for instance, be the result of rebuilding work, holiday closures or similar.)
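The first check in this list can be expressed as a simple supply-coverage test on the network structure. The data layout below (demand per market, production programme per site, permitted lanes) is a hypothetical simplification of a real planning model:

```python
def unsupplied_demand(demand: dict[str, set[str]],
                      production: dict[str, set[str]],
                      lanes: set[tuple[str, str]]) -> dict[str, set[str]]:
    # demand: market -> products demanded there
    # production: site -> products the site can manufacture
    # lanes: (site, market) pairs permitted by the network structure
    gaps: dict[str, set[str]] = {}
    for market, wanted in demand.items():
        reachable: set[str] = set()
        for site, target in lanes:
            if target == market:
                reachable |= production.get(site, set())
        missing = wanted - reachable
        if missing:
            gaps[market] = missing  # products no permitted source can deliver
    return gaps
```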

The following checks can be treated as warnings (the third one is sketched in code after the list):

  • A production site supplies only one single market/buyer.
  • A product can only be manufactured at one production site.
  • The weight of a component is greater than that of all other products by a factor of 1000 (errors “by a factor of 1000” typically occur when values with thousands separators are misinterpreted in the software environment).
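A warning of the third kind can, for example, be generated by comparing each weight against the typical magnitude of the others. The function below is an illustrative assumption; the threshold and the data layout would need to be adapted to the real model:

```python
from statistics import median

def weight_outliers(weights: dict[str, float], factor: float = 1000.0) -> list[str]:
    # Flag components whose weight deviates from the typical weight by roughly
    # a factor of 1000, a common symptom of misread thousands separators.
    # Assumes a non-empty dictionary of strictly positive weights.
    typical = median(weights.values())
    return [name for name, w in weights.items()
            if w >= typical * factor or w * factor <= typical]
```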

Once the error source has been identified and a check has been set up for it, correcting the errors becomes easy. Corrections can either run automatically via algorithms or open a view in which the correction has to be made.
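One common pattern, sketched here with a hypothetical result model, is to attach an optional correction action to each check result, so that a result can either fix itself automatically or take the user straight to the affected record:

```python
from dataclasses import dataclass
from typing import Callable, Optional

@dataclass
class Correction:
    description: str
    apply_fix: Optional[Callable[[], None]] = None   # automatic correction, if one exists
    open_view: Optional[Callable[[], None]] = None   # navigates to the affected record

def resolve(correction: Correction) -> None:
    if correction.apply_fix is not None:
        correction.apply_fix()    # correct automatically (fully or in part)
    elif correction.open_view is not None:
        correction.open_view()    # let the user make the correction in the opened view
```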

  Last but not least…

… here is a checklist you can use to assess whether a software product meets your expectations with regard to error checking and prevention:

  • Are error checks performed at field level, and are the check results visible without the work process being disrupted (e.g. the cell is marked red, but leaving the cell is not prevented)?
  • Are error checks possible at table level, and are they understandable, i.e. are individual explanatory texts displayed?
  • Can more complex, domain-specific connections be checked?
  • Can check results
    • be displayed so that they support the correction process (e.g. in a dedicated area of the window)?
    • be hidden, e.g. if warnings do not apply?
    • lead directly to the relevant data set?
    • contain function calls which correct automatically or in part?

Make use of the possibilities that data validation offers – and let OPTANO support you!

Do you have any suggestions, questions or comments? Is there anything you think we haven’t broached in this article or do you have an idea about how we can make OPTANO even better? Then we’d like to hear from you!
