Entity Relationship Models: Evaluating Content and Quality

IRM Guideline 11, Version 1

Purpose for This Document
Information Resource (IR) models support an agency's business by representing business information: facts, rules, data and processes. This document focuses on one of the first and most important of the models: the Entity Relationship (E/R) model. The E/R model represents business information in the form of entities and relationships of interest to the business, which may eventually be converted into computerized data.
The purposes of this document are to: 1) advise agency IR management personnel about the need to establish a process for evaluating entity relationship models; and 2) provide a checklist with examples to assist agency business experts and technical personnel in reviewing and evaluating models.
This document is primarily intended for reviewers of models, but other uses may exist as well. For example, this document may be useful when creating models, to ensure quality is planned in from the beginning.

Audiences for This Document
The audiences for this document include managers, decision-makers, business experts and technical evaluators of entity relationship models. Others, such as modelers and data analysts, may also find this document useful as a model development aid. Organizations may use this document by picking and choosing portions that fit a particular project, or they may prefer to use it as a template for creating their own internal checklists for modeling.
Potential audiences are identified in the chart on the next page. The chart identifies why each audience might be concerned with model evaluation, what actions they should take, how they should proceed, and which sections of this document they should read.




Each audience entry below answers: Who? (audience), Why? (concern), What? (actions), How? (to proceed), and What to Read?

Who? Executives; managers; decision-makers

Why? Models support the business. Business decision-makers have a stake in the use and quality of models. Models can help an agency comply with standards and laws relating to data management, access and privacy.

What? (Actions)
Endorse / create a model evaluation process for the organization.
Assure prerequisites & standards for modeling exist within the organization.
Develop staff modeling expertise.

How? (To Proceed)
Circulate checklist to model developers and evaluators (business & technical).
Promote use of the checklist.
Initiate creation of modeling prerequisites & standards.
Plan for staff skill development.

What to Read?
Strongly suggested: In Chapter I: Executive Summary; Appendices A & B.
Optional (but ideal): In Chapter I: Information About the E/R Checklist; Prerequisites to Using the E/R Checklist.

Who? Model reviewers: Business experts

Why? To determine if business facts and business requirements, as represented within a model, are accurate and complete.

What? (Actions) Evaluate:
a) the model's focus;
b) business facts; and
c) business requirements, as diagrammed within a model.

How? (To Proceed)
Review prerequisites;
Use checklist and examples to evaluate models.

What to Read?
Strongly suggested: In Chapter I: Prerequisites to Using the E/R Checklist. In Chapter II: E/R Model Checklist (for business reviewer). In Chapter III: Checklist Examples (for business reviewer).
For use of Checklist: In Chapter I: Appendix C; Appendix D; Appendix E.

Who? Model reviewers: Technical experts

Why? To determine if technical details of a model are accurate and complete. To determine if prerequisites, standards, conventions and rules were followed in the model.

What? (Actions) Evaluate:
a) technical details;
b) standards compliance;
c) model logic;
d) whether diagram prerequisites, conventions and rules were followed;
e) whether the diagram conforms to the focus.

How? (To Proceed)
Facilitate team model review;
Review prerequisites;
Use checklist and examples to evaluate models.

What to Read?
Strongly suggested: In Chapter I: Prerequisites to Using the E/R Checklist. Chapter II: E/R Model Checklist; Chapter III: Checklist Examples.
For use of Checklist: In Chapter I: Appendix C; Appendix D; Appendix E.

Who? Creators of models

Why? To cross-check and validate work. As an aid while developing models.

What? (Actions)
Inspect model; evaluate accuracy, completeness, and compliance with focus (quality).

How? (To Proceed)
Use checklist and examples to check work.

What to Read?
Strongly suggested: In Chapter I: Prerequisites to Using the E/R Checklist. Chapter II: E/R Model Checklist; Chapter III: Checklist Examples.
For use of Checklist: In Chapter I: Appendix C; Appendix D; Appendix E.

Who? Creators of modeling prerequisites

Why? To establish prerequisites for modeling in an organization.

What? (Actions)
Establish prerequisites: standards, conventions and tools for models and modeling.

How? (To Proceed)
Use the "Prerequisites to Using the E/R Checklist" and "References" sections of this document to identify prerequisites needed.

What to Read?
Strongly suggested: In Chapter I: Prerequisites to Using the E/R Checklist; References section.
For use of Checklist: In Chapter I: Appendix B; Appendix E.

Who? Creators of vendor performance contracts

Why? To establish performance requirements, and non-performance clauses, for modeling contracts.

What? (Actions)
Include some or all of the following as contract performance requirements:
Checklist items (& examples);
Other (including local requirements).

How? (To Proceed)
Describe prerequisites for contract;
Select checklist items & examples for contract;
Create performance clause for contract.

What to Read?
Strongly suggested: In Chapter I: Prerequisites to Using the E/R Checklist; Appendix A; Appendix B. Chapter II: E/R Model Checklist; Chapter III: Checklist Examples.
For use of Checklist: In Chapter I: Appendix C; Appendix D; Appendix E.

AUDIENCE NOTE: The E/R Checklist (part of this document) is technical in nature, and thus uses vocabulary and model examples that may be unfamiliar to some readers. Those who want or need to understand either the vocabulary or model examples should refer to the Appendix section. Appendix C provides a "legend" for how to interpret the model examples. Appendix D provides sample text for entities, relationships and attributes.

Models and Information Resource Management (IRM)

Minnesota government has adopted an Information Resource Management (IRM) philosophy that includes the use of information resource models as a primary building block. In an IRM environment, information is viewed as a strategic business asset, thus organization-wide information resource models are created that reflect the business. Quality is built into the models at the beginning, before any information resources are built. Ideally an organization would have a full set of models in place before initiating any development effort. However, many organizations are unable to devote the resources to a large, organization-wide modeling effort, and instead are building models as they work on projects. Regardless of the path taken, the eventual goal is building sharable information resources based on a set of organization-wide information resource models.

Information resource models are typically developed and managed under a Data Administration function within an IRM organization. Appendix E shows how entity relationship modeling - and model evaluation - are positioned within Data Administration in an IRM environment.

Rationale for Creating a Model Evaluation Process

Information models support the business of Minnesota government agencies by acting as blueprints for how business is conducted. One type of model, the entity relationship (E/R) model, is currently being built by agencies to document their businesses, plan project information requirements, and eventually build data bases or data repositories. Entity relationship models support the business by documenting information requirements and important facts about the business. Business information is represented in an E/R model by entities (real objects of interest), relationships (between entities) and attributes (details about entities or relationships). Other facts, such as descriptions of entities, relationships, or attributes, business rules and business examples, are typically included in text that accompanies the model.
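As an illustration only, the building blocks described above (entities, relationships, attributes, and their accompanying text) can be sketched as simple data structures. This is a hypothetical Python sketch, not part of the guideline; the EMPLOYEE and DEPARTMENT names and attributes are invented for the example.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Entity:
    """A real object of interest to the business."""
    name: str
    definition: str = ""        # textual description accompanying the model
    attributes: List[str] = field(default_factory=list)  # details about the entity

@dataclass
class Relationship:
    """A named association between two entities."""
    name: str
    from_entity: Entity
    to_entity: Entity

# Hypothetical example: an employee works in a department.
employee = Entity("EMPLOYEE", "A person employed by the agency",
                  ["Employee Id", "Employee Name", "Hire Date"])
department = Entity("DEPARTMENT", "An organizational unit of the agency",
                    ["Department Id", "Department Name"])
works_in = Relationship("WORKS IN", employee, department)

print(f"{works_in.from_entity.name} {works_in.name} {works_in.to_entity.name}")
# prints "EMPLOYEE WORKS IN DEPARTMENT"
```

Note that the textual information (definitions, business rules, examples) travels with the model here, mirroring the guideline's point that such text is part of the model, not an afterthought.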

E/R models reflect the links between business entities, relationships and attributes, and an organization's data assets. It is therefore important for agencies to be able to evaluate the content and detail of E/R models. A thorough model evaluation determines if a model accurately reflects the business requirements and facts, and whether the model complies with standards. This type of verification contributes to quality data base structures.

The term "E/R model" can be used to mean a high-level business-oriented model of data, or a data structure view, often called a "logical data model". (Physical data models are not considered to be E/R models, and are outside the scope of this document.) Each type of model documents information requirements from a particular perspective, and plays a role in the ultimate use of the model's information. For example, business data models represent real objects in an environment. A business data model can be validated by business people unfamiliar with the technical side of information processing. Business data may eventually be converted into technical language and structures to create data bases. The conversion of a business data model produces one or more models that represent views of the data, as opposed to views of the business. Data view models address data organization and structure, and define additional details needed to create data bases.

All models can benefit from thorough evaluations, based on the intended use and focus. Problems with models are easier to fix than errors that surface later, such as errors in data base structures. For this reason, an evaluation goal for E/R Models is to discover factual or requirement errors as early as possible.

Model evaluation criteria can be used to establish baseline performance requirements for modelers. Model evaluations can also be used to establish whether performance requirements were met by modelers, including outside contractors hired to produce models.

Model evaluations can help assess the feasibility and legality of potential data sharing opportunities. A multi-organization collaborative interested in sharing data would ideally evaluate how their models intersect (connect with) or overlap (duplicate) each other. Finding the areas of model intersection or overlap helps identify business areas for potential data sharing and can show where data practices compliance conflicts may exist.

In order to use models in collaborative efforts, the models need consistency and a high degree of continuity between them. There are multiple acceptable ways to represent model information, but some common modeling rules and methods must be in place to ensure models can be used in collaborative efforts. The E/R Checklist and Prerequisites portions of this document provide a "benchmark" of reasonable modeling criteria for producing consistent models and for evaluating models.

Organizations external to Minnesota government have also created models that may affect the state. For example, the federal government recently provided a data model as part of proposed standards for identifying land parcels within geographic information systems. If implemented, these standards could impact state organizations in the future. In addition to federal government models, private sector organizations, in search of collaborative efforts, have approached the state with their models. Agencies will clearly need to be able to evaluate externally-created models for impacts on decision making.

Externally-created models pose two kinds of risks. First, models created by external organizations are not meant to represent the perspective or business requirements of state government. Therefore it is unlikely that external models will accurately or thoroughly reflect state requirements. Second, an absence of common modeling rules and methods makes it difficult to compare requirements across organizations. A tool, such as the E/R Checklist, can help agencies evaluate compatibility between an external model and an agency model. Even in the absence of an agency model, the checklist can still assist with evaluating the logic, consistency, and, to some degree, the content quality of an externally-created model.

When Are Model Evaluations Needed?

Model evaluations can add value to models that are meant for continued use within the organization. Models created solely as a communications tool, or for other non-permanent uses, may not need an evaluation. The uses and flow of models between Minnesota government organizations are shown in the diagram, "Purpose and Flow of Information Models Within Minnesota Government" (see Appendix A). Each flow represents a potential validation checkpoint where a model review can validate requirements, evaluate contract performance, or establish quality benchmarks.

What Should Management's Role Be In E/R Model Evaluation?

Management should endorse and create an agency strategy for conducting model evaluations. Key objectives for model evaluations should be: 1) to improve model quality in order to ensure business needs are met and benefits realized, and 2) to ensure agency information resources, based on models, are built correctly and accurately. Management should also ensure modeling prerequisites and standards are established within their agencies, and that staff have appropriate modeling expertise to conduct evaluations.

Potential business benefits and uses for models that are of interest to management include the following:

1) E/R models form the foundation for future information resources, such as data bases, and thus must meet business needs and represent business facts correctly and accurately. Data bases based on business requirements in E/R models will only be as good as the E/R models.

2) In the future, vendor contracts will require performance requirements, and non-performance clauses for the development of state information resources. Agencies that hire outside modelers will need to establish performance requirements and non-performance clauses for future contracts. Proficiency in evaluating models will be required to evaluate contract performance.

3) Agencies will need to determine feasibility and assess potential impacts of models created by external organizations and federal agencies seeking collaboration or developing standards.

4) Correct representation of business facts within an E/R model can help an agency comply with Minnesota Statutes Chapter 13, Minnesota Government Data Practices Act (MGDPA). High quality models help avoid unplanned data redundancy (thus data accuracy problems), and minimize the likelihood of collecting unnecessary data.

5) Agencies can use models to determine if vendor software will meet their business needs.


An E/R Checklist is included with this document. It is intended for use by agencies to establish a quality base for E/R models they create, use, or receive from other organizations. The checklist can help establish agency benchmarks for model quality, model requirements, or the performance evaluation of modelers. The diagram "Using the Entity Relationship (E/R) Model Checklist with Vendor Contracts" (Appendix B) illustrates one way to use the checklist: with vendor contracts, to establish initial RFP and contract requirements and to eventually evaluate contract performance.

Basics of Using the Checklist

The E/R checklist describes various conditions, and their resulting implications, that may be found through inspection of model details. Generally the checklist items require a technical evaluation; however, a few also require a business evaluation. In this context, a "technical evaluation" means a review by those having in-depth knowledge of business object and data analysis, and of the particular modeling techniques used. The checklist identifies when an evaluation condition should include business experts. Typically the degree of involvement of business experts should be adjusted to fit their knowledge, interest and available time. Conditions not identified for business review are assumed to need a technical review.

Both business and technical knowledge are needed to do a thorough model evaluation. Therefore, the most desirable approach is to use a review team. The team would ideally include business experts and experienced modelers who could facilitate the review process as well as provide technical expertise. Generally any model reviewer needs to be able to read and understand the models. Those evaluating technical aspects of models also need some modeling expertise, or must be able to obtain assistance from an experienced modeler.

The checklist identifies, for each condition, one or more potential implications or problems that may result. The purpose for the implications column is to help model evaluators by linking inspection conditions to reasons they might be important. The implications will not apply in all cases: some may not be relevant for a particular model, or in a particular environment.

The checklist also identifies the degree of risk associated with each condition - high, medium, or low - to assist reviewers in evaluating the impact of errors. The risk categories are:

High risk means a model is unlikely to meet needs, and any resulting effort based on the model (e.g., a data base, system, or collaborative effort) could fail.

Medium risk means the model has the potential to cause problems. The model can probably be used to develop a data base or system, but care should be taken to avoid the pitfalls (identified within the checklist).

Low risk means the model may have violated some rules or may be confusing. Generally problems in this category don't cause failures.

Some checklist conditions are accompanied by examples. Where applicable, the "Examples" column provides either a text explanation or a reference to diagrams in the Appendix.

Assumptions for Model Evaluations

The following assumptions apply to this checklist and should be understood by model reviewers:

1. Prerequisites are assumed to exist and be available to model reviewers. If prerequisites do not exist, some checklist items might need to be modified or bypassed. The prerequisites are:

a) Focus statement, or equivalent (described elsewhere in this section of the document)

b) Naming standards, for entities, relationships and attributes

c) Definition of what details are contained in the textual information (See Appendix E)

d) Identification of diagramming conventions used to create a particular model

e) Applicable laws, policies, standards, guidelines identified (or identifiable by the reviewers)

2. The judgment of model evaluators, in consultation with the modelers, must be applied when using the checklist. Sometimes a reviewer may detect a checklist condition that was created deliberately by the modelers and is acceptable in that situation. Some implications may be acceptable in only certain situations or environments, while others may be acceptable because of the model's focus. Modelers decide how to model business facts based on what makes sense for a particular environment, or on the model's purpose. Deliberate decisions made by the business (assuming facts are correctly represented) should be respected by model reviewers. When in doubt about particular details, reviewers should evaluate the risk associated with a condition, and the model's focus and purpose.

3. The implications or problems with E/R models tend to fall into one of the following categories:

a. Things that shouldn't be in the model (typically outside focus)

b. Things that are missing from the model

c. Things that should be there but are not laid out correctly (facts not accurately represented).

When a reviewer detects a checklist condition that turns out to be a modeling error, the impact of the error depends on the level and purpose for the model. At the conceptual level, the implication is often that business facts are incorrect or not clear. The implications at the data base design level may include building the wrong data structure, building structures that are out of compliance with laws, not making data accessible, or creating business solutions that don't meet the business needs.

4. The examples that follow the checklist were created using a modified 'Chen' diagramming convention. A legend is provided to help interpret the examples (Appendix C). The use of a particular convention, such as Chen, is not a requirement. The selection of diagramming conventions is left up to individual organizations, provided the convention can represent and communicate about business entities, relationships, attributes, cardinality, and textual information correctly.
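Whatever convention an organization selects, cardinality must be representable. As a hypothetical sketch (not tied to Chen or any other particular convention), cardinality can be recorded as a minimum and maximum on each end of a relationship, and sample instance data can be checked against it. All names below are invented for illustration.

```python
# Hypothetical sketch: cardinality as (min, max) per relationship end.
# "N" stands for "many" (no fixed upper bound).
CARDINALITY = {
    # Each EMPLOYEE works in exactly one DEPARTMENT;
    # each DEPARTMENT has one or more EMPLOYEEs.
    ("EMPLOYEE", "WORKS IN", "DEPARTMENT"): ((1, 1), (1, "N")),
}

def check_exactly_one(assignments):
    """Return employees whose department count violates the (1, 1) end."""
    return [emp for emp, depts in assignments.items() if len(depts) != 1]

# Sample instance data: Jones has two departments, Lee has none.
sample = {"Smith": ["Finance"], "Jones": ["Finance", "Audit"], "Lee": []}
print(check_exactly_one(sample))
# prints "['Jones', 'Lee']"
```

A reviewer performs the same check visually when comparing a diagram's cardinality symbols against business examples in the accompanying text.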

5. Many checklist inspection items with data practices implications are also of general concern, and are included in the main portion of the checklist. A "Special Considerations for MGDPA" section documents checklist items that are specific to the Minnesota Government Data Practices Act (MGDPA) only (i.e., not applicable to model validation in general). This section will be updated as relevant MGDPA examples surface.

6. This is a draft document. Comments are welcome!


Prerequisites to Using the E/R Checklist

In order to use this checklist, the following must be identified:

1. Type of Entity Relationship Model:

Business Data (or Conceptual Data) Model: A model that shows the structure of a set of business objects (entities) and their relationships. (The Checklist Examples are of this type.) A conceptual data model shows data requirements independent of project implementation and independent of technology. Organizations do not always agree on terminology of what is, or is not, a valid "conceptual data model". However, the terminology itself is not important: what is important is that a model reviewer knows the modeler's intent.

Logical Data Model: A model showing detailed data requirements for a project, independent of a specific computing environment, but detailed enough to be translated into one. Usually a logical model is fully attributed and attributes are defined in detail.


This checklist can be used regardless of how an organization defines the different types of models. This checklist does not apply to physical data models.
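To make the distinction concrete, here is a hypothetical sketch of the same business entity at the two levels; the PERMIT entity and its attributes are invented for illustration and do not come from this guideline.

```python
# Hypothetical sketch: one business entity at two model levels.

# Conceptual: entity names and relationships only, no attribute detail.
conceptual = {"entity": "PERMIT", "attributes": []}

# Logical: fully attributed and defined in detail, yet still
# independent of any specific computing environment.
logical = {
    "entity": "PERMIT",
    "attributes": [
        {"name": "Permit Number", "type": "character", "length": 10},
        {"name": "Issue Date",    "type": "date"},
        {"name": "Permit Status", "type": "character", "length": 1},
    ],
}

# Both levels describe the same business object; only the detail differs.
assert conceptual["entity"] == logical["entity"]
print(len(logical["attributes"]))
# prints "3"
```

A reviewer applying the checklist would hold the logical model to the stricter standard of completeness, since it must be translatable into a data base design.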

2. Business Purpose of Model

A model evaluator needs to know the intended business uses of an E/R model to determine the impact of problems found by the review. The evaluator also needs to consider whether the degree of completeness and "polish" of the model is adequate for its intended purpose. For example, models for scoping a development effort do not need to be as complete, or error free, as those that will be used to create data bases. Models created for eventual data base implementation need to be fairly detailed, and free of implications that might lead to incorrect data structures, redundantly stored data, or unnecessary data collection and storage.

Some possible business uses are:

Building a database

Scoping a development effort

Documenting the business' objects or data (creating a baseline)


Defining/understanding your business

Other business uses: reengineering, changing strategic directions, downsizing, etc.

3. Purpose for the Model Review

Some purposes for reviewing models are:

To give the go ahead for database design

"Are we in the right ballpark?" review

Check the (business) relevance of the model

Check accuracy of facts represented by the model

Check for compliance with standards and laws

Check models from outside sources for

compatibility with internal needs

Other purposes

4. Focus of the Model

"Focus" consists of a set of parameters useful for defining any project or effort, including a modeling effort. Focus provides clarity to business users, direction to (model) developers and data base designers, and benchmarks for model reviewers.

The following component parameters are included in Focus:

Scope is the portion of the world that is the subject of the project (or model). It defines the project's boundaries. Scope statements typically include statements about boundaries, i.e., "From ___ , To___" or "Including____, Excluding _____".

Perspective is the point of view used to determine which aspects of the world are relevant. Ideally you should include as many perspectives as you think will ever be needed. Perspectives may include:

Executive management

Supervisory management

All operational areas


Other organizations

Detail is the amount of detail and precision reflected in the model. For example, a model used to complete cost estimates may be built at a high level and with less detail than a model intended to be input to a data base design.

Universality is the level of generalization in defining an application or model. Because universality identifies the degree of reusability and number of potential uses or applications for a model, it is key to data sharing. Universality asks these questions:

1. During what period of time will the model be in use?

2. Is there a period of time during which changes to the model should be avoided?

3. During what period of time can the model be extended, before starting over?

Scope of Integration identifies other projects or models that may be impacted by this project or model.

5. Special Legal Requirements / Needs (Data Requirements)

Identify all special legal requirements and needs that must be considered within the model. An example is the data privacy classifications established by state and federal laws that govern access and privacy, such as the Minnesota Government Data Practices Act (MGDPA).

6. Applicable Statewide and Local Standards

The following standards are needed for modeling:

Diagram convention standards

Text (non-diagram) content standards

Naming standards

Abbreviation usage standards

Modeling Tool Selection

Statewide Technical Standards

Other standards


References

These documents relate to E/R modeling, modeling prerequisites, or potentially relevant standards. They can be found in the old IPO "Creating and Managing Information Resources for Minnesota State Government Organizations", or are available from OT at:

(612) 215-3878 (voice)

(612) 215-3877 (fax) (Internet address).

"Annotated Reproduction of Minnesota Statutes Chapter 13 - Unofficial" (and other materials related to MGDPA) can be obtained from the Department of Administration, Technology Management Bureau, Public Information Policy Analysis Division.

Statewide Open Standards for Information Systems

"Statewide Policy on Technical Standards"

IRM Guideline 2: "Information Resource Models: Creation and Use"

IRM Guideline 3: "CASE Tools: Selection and Use"

IRM Guideline 9: "Data Administration: Data Naming Primer"

IRM Guideline 10: "Data Administration: Data Naming Practitioner's Guide"


The materials contained in this publication are based, in part, on copyrighted and proprietary works from Advanced Strategies, Inc. of Atlanta, Georgia, that have been developed in collaboration with and licensed to the State of Minnesota government and are used herein with permission. The following individuals were responsible for the content and production of this publication:

Anne Bentley - Minnesota Office of Technology

Steve Farrell - Advanced Strategies, Inc.

Special thanks to the following individuals who provided content review for this document:

Nancy Anderson - Minnesota Office of Technology

Ann Bidwell - Pollution Control Agency

Roberta Casey - MnDOT

Mary Eide - AmeriData Consulting

Darryl Folkens - MNAssist

Steve Gustafson - Minnesota Office of Technology

Kathy Hofstead - MnDOT

Tim Leister - MnDOT

Eileen McCormack - Minnesota Office of Technology

Karl Olmstead - MnDOT

Sarah Thompson - Supreme Court

Nancy Weirens - MnDOT

Carol Worden - Minnesota Office of Technology


Appendix A:

Purpose and Flow of Information Models Within Minnesota Government

Appendix B:

Using the Entity Relationship (E/R) Model Checklist with Vendor Contracts

Appendix C:

Legend For Reading E/R Checklist Examples

Legend For Reading E/R Checklist Examples - Cardinality (continued)

Appendix D:

E/R Model - Sample Text

Appendix E:

Framework for Conducting Business Within an IRM Environment
Where Does 'Data Administration' Fit into the Organization? (no longer available)
What is Data Administration? (no longer available)
Data Modeling Activities Within Data Administration (no longer available)

Chapter II:

Entity Relationship Model Checklist

Chapter III:


Example 1
Example 2
Example 3
Example 4
Example 5
Example 6
Example 7
Example 8
Example 9
Example 10
Example 11
Example 12 1 of 4
Example 12 2 of 4
Example 12 3 of 4
Example 12 4 of 4
Example 13
Example 14
Example 15
Example 16
Example 17
Example 18 1 of 3
Example 18 2 of 3
Example 18 3 of 3
Example 19
Example 20
Example 21
Example 22
Example 23
Example 24
Example 25
Example 26
Example 27
Example 28
Example 29
Example 30
Example 31
Example 32
Example 33 & 34
Example 35