My comment
I agree with your [Karen Lopez] "State of the Union" of Data Modeling as described in
your post related to this discussion. Leaving aside the possible and
probable reasons that led to that state, I will focus here on your
question.
According to my observations, organizations have lost (and some never
adopted) the technique of targeted denormalization, which bridges the gap
between the logical data model and the physical database design and
thus brings Data Modelers and DBAs together.
Based on a 3-level approach (I will skip the steps that lead to a
Logical Data Model), the responsibilities of Data Modelers, Developers and
DBAs can be assigned as follows:
- Logical Data Model: developed by Data Modelers / Data Administrators
- Transition Model: created by Developers and DBAs by denormalizing the Logical Data Model according to the requirements of the application (creating physical tables, views, indexes etc.); a sketch of this step follows the list
- Physical Database Model: based on the Transition Model, DBAs add physical parameters so that the DDL for the (production) database can be generated completely
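To make the Transition Model step concrete, here is a minimal sketch of targeted denormalization; the CUSTOMER/ADDRESS entities, the column types and the generated DDL are assumptions for illustration, not the output of any particular modeling tool:

```python
# Minimal sketch of targeted denormalization. Entity and column names
# are hypothetical; no specific modeling tool or DBMS is implied.

LOGICAL_MODEL = {
    "CUSTOMER": ["customer_id", "name"],
    "ADDRESS": ["address_id", "customer_id", "city", "postal_code"],
}

def denormalize_into(parent: str, child: str, columns: list[str],
                     model: dict[str, list[str]]) -> dict[str, list[str]]:
    """Transition Model step: fold frequently joined child columns
    into the parent table to avoid a join at run time."""
    physical = {t: cols[:] for t, cols in model.items()}
    physical[parent].extend(f"{child.lower()}_{c}" for c in columns)
    del physical[child]  # design decision: the child entity is absorbed
    return physical

def generate_ddl(physical: dict[str, list[str]]) -> str:
    """Physical Database Model step: emit (simplified) DDL.
    A real tool would add data types, tablespaces, storage clauses,
    indexes and so on from the physical parameters."""
    stmts = []
    for table, cols in physical.items():
        col_defs = ",\n  ".join(f"{c} VARCHAR(100)" for c in cols)
        stmts.append(f"CREATE TABLE {table} (\n  {col_defs}\n);")
    return "\n\n".join(stmts)

physical_model = denormalize_into("CUSTOMER", "ADDRESS",
                                  ["city", "postal_code"], LOGICAL_MODEL)
print(generate_ddl(physical_model))
```

In practice, a view reproducing the absorbed ADDRESS entity would typically be defined on top of CUSTOMER, so that the logical shape remains visible to applications that expect it.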
To support this approach, data modeling tools need to:
- Support denormalization of Logical Data Models
- Keep track of the lineage through the transformation process from logical tables/columns to database tables/columns (illustrated in the sketch after this list)
- Offer mechanisms that automatically propagate modifications from the Logical Data Model to subsequent levels (for modifications where the methodology is algorithmic and does not require human intervention)
- Offer mechanisms that allow modifications from the Logical Data Model to be integrated manually into subsequent levels (for modifications where design decisions need to be taken)
- Include an interface that generates the script to create / alter the database (DDL) from the Physical Database Model.
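As a rough illustration of the lineage and automatic-propagation requirements, here is a minimal sketch that reuses the hypothetical CUSTOMER/ADDRESS example from above; the data structures and the rename scenario are assumptions, not the internals of any existing tool:

```python
# Minimal sketch of lineage tracking and automatic propagation between
# model levels. All names and structures are hypothetical.

# Lineage: each logical column maps to the physical columns derived from it.
lineage = {
    ("CUSTOMER", "name"):       [("CUSTOMER", "name")],
    ("ADDRESS", "city"):        [("CUSTOMER", "address_city")],
    ("ADDRESS", "postal_code"): [("CUSTOMER", "address_postal_code")],
}

def propagate_rename(logical_col, new_name, lineage):
    """Algorithmic modification: a logical rename can be propagated
    automatically because no design decision is involved.
    (Simplified string substitution; a real tool would track columns
    by identity rather than by name.)"""
    ddl = []
    for table, col in lineage[logical_col]:
        new_physical = col.replace(logical_col[1], new_name)
        ddl.append(f"ALTER TABLE {table} RENAME COLUMN {col} TO {new_physical};")
    # keep the lineage entries in sync with the new names
    lineage[(logical_col[0], new_name)] = [
        (t, c.replace(logical_col[1], new_name))
        for t, c in lineage.pop(logical_col)
    ]
    return ddl

# Non-algorithmic modifications (e.g. splitting an entity) would instead
# be queued for a Developer/DBA decision rather than applied automatically.
for stmt in propagate_rename(("ADDRESS", "city"), "town", lineage):
    print(stmt)
```

The point of keeping the lineage explicit is that algorithmic changes such as a rename can be pushed through to the DDL without human intervention, while anything that requires a design decision is flagged for the Developers and DBAs instead.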