My comment
The metadata repository is the right place to store data quality standards:
those that can be automatically transformed into database constraints
(referential integrity, data types, nullability, data domains, etc.)
as well as, more importantly, those business rules that
require human interaction.
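To illustrate the automatable part, here is a minimal Python sketch that turns such metadata-stored rules into database constraint statements. The ColumnRule format, the rule_to_ddl helper, and the table and column names are purely illustrative assumptions on my part (the generated statements use PostgreSQL-style syntax); they are not the API of any particular metadata repository or tool.

from dataclasses import dataclass
from typing import List, Optional

@dataclass
class ColumnRule:
    """One automatable data quality rule, as it might be kept in a metadata repository."""
    table: str
    column: str
    data_type: str                      # e.g. "CHAR(2)", "DATE"
    nullable: bool = True
    domain: Optional[List[str]] = None  # allowed values (data domain)
    references: Optional[str] = None    # "other_table(other_column)" for referential integrity

def rule_to_ddl(rule: ColumnRule) -> List[str]:
    """Translate one rule into ALTER TABLE statements (PostgreSQL-style, illustrative only)."""
    ddl = [f"ALTER TABLE {rule.table} ALTER COLUMN {rule.column} TYPE {rule.data_type};"]
    if not rule.nullable:
        ddl.append(f"ALTER TABLE {rule.table} ALTER COLUMN {rule.column} SET NOT NULL;")
    if rule.domain:
        allowed = ", ".join(f"'{v}'" for v in rule.domain)
        ddl.append(f"ALTER TABLE {rule.table} ADD CONSTRAINT ck_{rule.table}_{rule.column} "
                   f"CHECK ({rule.column} IN ({allowed}));")
    if rule.references:
        ddl.append(f"ALTER TABLE {rule.table} ADD CONSTRAINT fk_{rule.table}_{rule.column} "
                   f"FOREIGN KEY ({rule.column}) REFERENCES {rule.references};")
    return ddl

# Example: a country code column with a fixed data type, no NULLs,
# a restricted value domain, and referential integrity to a lookup table.
rule = ColumnRule(table="customer", column="country_code", data_type="CHAR(2)",
                  nullable=False, domain=["DE", "FR", "US"], references="country(code)")
for statement in rule_to_ddl(rule):
    print(statement)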
The hardest part is the perseverance and discipline needed to
maintain the data quality standards, and also to instruct and monitor
users so that the standards are consistently applied.
My additional comment
To answer the question .. related to
my comment "The hardest part is the perseverance and discipline
needed to maintain the data quality standards, and also to instruct
and monitor users so that the standards are consistently applied.":
The weakest element in the integrated system of people, processes, and
tools is undoubtedly the human factor. Users who enter data not only
need to be trained and monitored in their work; the organization
also has to create a cultural climate that rewards high data quality.
Example: If the people who enter data are paid by the number of correctly and
completely created/updated objects (persons, addresses, products, orders,
etc.), the resulting data quality will naturally be higher than if
those people are paid by time.
In general, there needs to be a system of incentives that makes it
attractive for users to contribute to data quality. A simple but
important factor in increasing their motivation is also to ask users on a
regular basis for feedback about difficulties and possible
improvements to the process.