Creating the Infrastructure for Capturing a Data Model

At this point, all I’ve given you is a set of tables that can store metadata about database tables and their relationships – essentially a subset of what you can get from the system views.  Let’s take a look at the process for capturing this information as the database is modified.  The major steps are:

  1. Create Entity Tracker database
  2. Create the change log table, the supporting tables that capture the data model snapshot, and a view that will later help us dynamically construct a graphical data model diagram.
  3. Create the supporting stored procedure and user-defined function that the DDL trigger invokes to create the database snapshot.
  4. Create the DDL trigger that executes the stored procedure whenever a change occurs to any entity in the database.
  5. Install the DDL trigger and stored procedure in the database to be monitored.  Note that the captured changes are centralized in tables in the Entity Tracker database.
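Step 4 above can be sketched as a database-scoped DDL trigger.  This is only a hypothetical skeleton (the actual UpdateSchemaVersion trigger in the zip is more involved):

```sql
-- Hypothetical skeleton of the DDL trigger from step 4.
-- DDL_TABLE_EVENTS is the built-in event group covering
-- CREATE/ALTER/DROP TABLE.
CREATE TRIGGER UpdateSchemaVersion
ON DATABASE
FOR DDL_TABLE_EVENTS
AS
BEGIN
    -- Re-snapshot the schema into the Entity Tracker tables.
    EXEC dbo.UpdateSchemaRelations;
END;
```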

After these steps are done, any change to any table or relationship will result in a new version of the information being stored in the Entity Tracker database.  Over time, this will allow us to monitor how the data model for a particular database has changed.

Attached is a zip that contains the SQL needed to create the Entity Tracker database tables, UpdateSchemaRelations stored procedure, and UpdateSchemaVersion DDL trigger.  Let’s take a look at each step and some of the code.

1) Create the EntityTracker database – that one isn’t actually in the zip (CREATE DATABASE ENTITYTRACKER is all you need, or use SSMS).  You’re welcome to call it something else, but in that case you’ll need to modify the UpdateSchemaRelations stored procedure to match, because that stored procedure is installed in the target database alongside the DDL trigger.  The procedure references system tables in the database being monitored, and I haven’t figured out a way for it to do that other than putting it into the target DB (if somebody knows a way, please comment).
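For reference, step 1 amounts to a single statement (using the default file settings; adjust or use SSMS if you prefer):

```sql
CREATE DATABASE EntityTracker;
```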

2) Next, the change log and supporting tables (the ones from the diagram in the introductory post) need to be created, along with the view that facilitates querying relationships.  The scripts EntityTracker.Tables.sql and EntityTracker.Views.sql in the zip handle this.  Note that we only care about the view_SchemaTableRelations view at this point; you can disregard the other views for now.
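To give a feel for the shape of these tables, here is a minimal sketch.  The table names come from this series, but the columns are my assumptions – the actual definitions are in EntityTracker.Tables.sql:

```sql
-- Sketch only: column names/types are assumptions, not the shipped schema.
CREATE TABLE SchemaTable (
    SchemaTableId   int IDENTITY(1,1) PRIMARY KEY,
    ChangeLogId     int NOT NULL,          -- which snapshot version this row belongs to
    TableName       sysname NOT NULL
);

CREATE TABLE TableRelation (
    TableRelationId int IDENTITY(1,1) PRIMARY KEY,
    ChangeLogId     int NOT NULL,
    ConstraintName  sysname NOT NULL,
    ParentTable     sysname NOT NULL,      -- referenced (key) table
    ChildTable      sysname NOT NULL       -- referencing (foreign key) table
);

CREATE TABLE RelationColumnPair (
    RelationColumnPairId int IDENTITY(1,1) PRIMARY KEY,
    TableRelationId int NOT NULL
        REFERENCES TableRelation (TableRelationId),
    ParentColumn    sysname NOT NULL,
    ChildColumn     sysname NOT NULL
);
```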

3) Create the supporting stored procedure (UpdateSchemaRelations) for the DDL trigger to invoke, along with the supporting user-defined function.  The procedure is in UpdateSchemaRelations.Procedure.sql, and the function is found in EntityTracker.Functions.sql (we are only interested in the dbo.udf_GetTableMaxDepth function at this point).  This is the core logic that actually makes the snapshot of the data.  Where possible, the procedure uses the standard INFORMATION_SCHEMA views; the one exception is the relationships snapshot, where I had to use sys.indexes to retrieve key information, because a unique index is not included among the table constraints when the primary columns are already defined as a separate primary key.
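To illustrate the sys.indexes point: a unique index created with CREATE UNIQUE INDEX (rather than as a UNIQUE constraint) does not surface in INFORMATION_SCHEMA.TABLE_CONSTRAINTS, but it is visible in the catalog view.  A query along these lines (not the procedure’s actual code) finds such indexes:

```sql
-- Unique indexes that are neither primary keys nor unique constraints,
-- i.e., the ones INFORMATION_SCHEMA.TABLE_CONSTRAINTS misses.
SELECT OBJECT_NAME(i.object_id) AS TableName,
       i.name                   AS IndexName
FROM sys.indexes AS i
WHERE i.is_unique            = 1
  AND i.is_primary_key       = 0
  AND i.is_unique_constraint = 0;
```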

Before getting buried in the code, let’s look at the high-level logic:

  1. Insert the list of tables into the SchemaTable table
  2. Insert the list of referential constraints into the TableRelation table
  3. Insert the column pairs associated with each referential constraint into the RelationColumnPair table
  4. Calculate the depth of each table using a custom T-SQL recursive function that traverses the relationships to find the deepest level for each table.  One challenge here is that an RDBMS allows self-joined tables and circular relationships, so the function must check for these conditions to avoid infinite recursion.  This step relies on the dbo.udf_GetTableMaxDepth function, which in turn relies on view_SchemaTableRelations.
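A recursive depth function of that kind can be sketched as follows.  This is in the spirit of dbo.udf_GetTableMaxDepth, not its actual code, and the view’s column names (ParentTable, ChildTable) are assumptions.  The @Visited parameter carries a pipe-delimited list of tables already seen, which is what terminates self-joins and circular relationships (and note that SQL Server itself caps function recursion at 32 levels):

```sql
-- Sketch of a recursive max-depth function with cycle detection.
-- Initial call: SELECT dbo.udf_GetTableMaxDepth_Sketch('MyTable', '|');
CREATE FUNCTION dbo.udf_GetTableMaxDepth_Sketch
(
    @TableName sysname,
    @Visited   varchar(max)   -- pipe-delimited tables already visited, starts as '|'
)
RETURNS int
AS
BEGIN
    -- Cycle/self-join guard: stop if this table was already visited.
    IF CHARINDEX('|' + @TableName + '|', @Visited) > 0
        RETURN 0;

    DECLARE @Depth int;

    -- Deepest child chain below this table, one level at a time.
    SELECT @Depth = MAX(c.ChildDepth)
    FROM (
        SELECT 1 + dbo.udf_GetTableMaxDepth_Sketch(
                       r.ChildTable,
                       @Visited + @TableName + '|') AS ChildDepth
        FROM dbo.view_SchemaTableRelations AS r
        WHERE r.ParentTable = @TableName
    ) AS c;

    -- A table with no referencing children sits at depth 0.
    RETURN ISNULL(@Depth, 0);
END;
```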


For the next posts, we’ll drill down into the specific .SQL code and finally we will look at the user interface in more detail.  I’ll also make a live demo available that you can use to get an appreciation of the user experience.  After that, we’ll go back down into the code used to support the UI and discuss extending this technique for other scenarios.
