Case Tools Selection and Use

IRM Guideline 3, Version 1

PURPOSE AND SCOPE
Information Resource Management (IRM) is Minnesota government's strategic direction for developing and managing information resources. IRM requirements have been incorporated into the 1996/97 biennial funding requirements of the Minnesota Office of Technology (OT). Agencies requesting funding for information resources must address six "Critical Success Factors" for IRM, including the creation of information resource models. Several agencies have asked for advice on acquiring automated tools to help manage their information models. OT has prepared this guideline to assist Minnesota government decision makers in evaluating the use and selection of automated tools for their organizations and/or projects.

DEFINITIONS
Automated tools for developing and managing information resources are typically used by business people, information architects and developers of applications and technology resources. Some types of automated tools are designed to manage business information or to document an organization's information resources. Other types are designed to assist with the development of information resources by automating development processes; the creation or testing of computer code, for example, can be performed by this type of automated tool. Some tools integrate the features of both types. All of these automated tools belong to the family of products referred to as 'CASE'. The Office of Information Technology, State of California, has defined 'CASE' as follows:

"Computer Aided Systems Engineering (CASE) is used to refer to a set of automation tools and techniques that provide support to the process of planning, developing, installing and maintaining information systems. CASE productivity tools can give automated support to strategic planning and enterprise modeling, to describing and diagramming business processes, to designing systems, to generating computer programs from specifications of the business functions, and to maintaining application systems...."
It is difficult to define what is, and is not, a 'CASE' tool. First, providers of automated tools for managing information resources are not consistent in identifying which belong to the 'CASE' family and which do not. As a result, some products with CASE capabilities may be classified as CASE, while other similar products are not. Project management software and some 4GLs (4th Generation Languages) are examples of products that may be classified either way.

Second, different classifications have evolved to help clarify what is, or is not, a "CASE tool". The related definitions are not necessarily based on the same criteria. Some definitions focus on the kinds of products that are included: 'Upper CASE' tools may be defined as including entity-relationship modeling capabilities, while 'Lower CASE' tools may be defined as including program code generators. Other definitions focus on how products are used: 'E-CASE' describes products used on a large, "Enterprise-wide" scale, while 'W-CASE' describes "Workgroup" products used on a smaller scale, usually on a desktop or within a small workgroup.

The different perspectives in defining CASE tools have resulted in confusion about the categories and types of CASE tools. For purposes of this publication, three main categories of CASE tools are identified based primarily on how and when they are used within the information resource development process. The three categories are:

  • 'Upper' CASE tools: For initial planning, requirements analysis, or conceptual design phases. These tools include products that capture business requirements, or produce and manage business models (data, processes, rules, technology, etc.).
  • 'Lower' CASE tools: For automating the systems development phases of design, construction, or installation. These tools include any product that aids in the post-planning and analysis phases of development. Ideally, information models created within 'Upper CASE' tools provide the requirements and design criteria for 'Lower CASE' generation of code, database schemas, or other deliverables.
  • 'I-CASE' (Integrated CASE) tools: For all phases, from initial planning through installation. These tool sets integrate 'Upper' and 'Lower' CASE tools and span the entire development life cycle.
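The handoff from 'Upper' to 'Lower' CASE described above can be sketched in a few lines of code: an entity captured in an analysis model drives the generation of a database schema. This is a hypothetical illustration only; the entity, attribute names, and type mappings are invented, and real CASE tools use far richer model formats.

```python
# Hypothetical sketch of the 'Upper' -> 'Lower' CASE handoff: an entity
# model captured during analysis (Upper CASE) drives generation of a
# database schema (Lower CASE). All names here are invented for illustration.

TYPE_MAP = {"id": "INTEGER", "text": "VARCHAR(80)", "date": "DATE"}

def generate_ddl(entity):
    """Render one entity from the information model as a CREATE TABLE statement."""
    cols = []
    for name, kind in entity["attributes"]:
        null = " NOT NULL" if name == entity["key"] else ""
        cols.append(f"    {name} {TYPE_MAP[kind]}{null}")
    cols.append(f"    PRIMARY KEY ({entity['key']})")
    return f"CREATE TABLE {entity['name']} (\n" + ",\n".join(cols) + "\n);"

# A fragment of an information model, as an Upper CASE tool might export it.
license_entity = {
    "name": "LICENSE",
    "key": "license_id",
    "attributes": [("license_id", "id"), ("holder_name", "text"),
                   ("issue_date", "date")],
}

print(generate_ddl(license_entity))
```

The point of the sketch is the direction of dependency: the generated schema is a disposable by-product of the model, so changes are made to the model, not to the generated output.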

USING AND SELECTING CASE TOOLS
OT recognizes that most information models can't be documented or managed without automated tools. At a minimum, agencies will need CASE tools for their project and agency-wide information modeling efforts. Since 'Upper' CASE tools are used for creating and managing IRM models, OT's initial focus is on requirements for 'Upper' CASE tools. IRM models are an important step toward implementing IRM and reaching the ultimate goal of statewide data sharing in Minnesota government.

Before selecting CASE products, agencies should evaluate how and why they plan to use these tools to meet both short and long term requirements. Agencies should use caution in selecting and using CASE products. Many tools currently on the market have shortcomings that may interfere with reaching the goals or realizing the benefits desired by Minnesota government. For example:

  • CASE tools were not designed to support model management and shared data across organizations, which is a strategic goal of Minnesota government,
  • CASE tools do not support the statutory requirements of Minnesota government (e.g., allowing identification of data classifications under the Minnesota Government Data Practices Act),
  • CASE tools were not designed to be modified to support users' unique requirements,
  • CASE tools were not designed to support the integration complexities of multiple platforms or development environments,
  • CASE has been promoted as a set of tools to automate systems development activities, such as generating program code directly from enterprise models. Realization of this goal has been more theoretical than practical for many organizations, and
  • The CASE tools that provide the most sophisticated capabilities also tend to be the most expensive. They generally require the greatest commitment to learn and maintain, and often provide functionality that overlaps other tools.

For short term needs, 'Upper' CASE tools may be valuable for agencies to become familiar with modeling. They can also provide support for capturing and managing current project and agency information. However, CASE technology is not mature enough for most agencies to realize the integration and automation success that 'Lower' or Integrated CASE has promised. Also, CASE tools on the market today do not address Minnesota government requirements and complexities, and are not easily adaptable to meet these needs.

Problems with the use of CASE in Minnesota government may be resolved in the future. Additional automation tools will address the integration shortcomings of CASE and provide support for more complex data management. Some of these tools can be extended by users to fit their environments, and thus hold promise for managing statutory requirements unique to Minnesota government. CASE vendors will continue to improve and upgrade their products, and to form alliances with vendors of other types of automated tools.

OT is currently evaluating other automation tools in addition to CASE for suitability and applicability to Minnesota government. One example of a technology currently being evaluated by OT is repository technology. Repositories are often used as companion products to CASE tools because they provide extended and enhanced data management features. Repositories manage dictionary information and provide additional data collection and data storage features. Within a repository, an organization can define and manage a wider range of business information than with CASE tools. A repository can also integrate information from and about multiple existing information systems, broadening an organization's information management capabilities.

OT's present position on CASE tools is that most agencies should consider CASE tool investments to be part of an interim strategy toward automating information resource management and creating information resource models. Prior to making any investments, agencies should evaluate their needs and their readiness for CASE tools, especially those that go beyond the capabilities of 'Upper' CASE.

Agencies that are considering 'Upper' CASE tools at this time should recognize the potential need for 'Lower' CASE, or other automation tools in the future. OT recommends that agencies position themselves by considering future automation as part of their overall CASE strategy, and plan accordingly for future expansion or conversion. However, agencies should carefully evaluate costs, benefits and risks to ensure they are not investing in functionality they can't or won't use in the future.

Many agencies are selecting CASE for a first-time pilot or training project. Other agencies may already have experience with CASE tools. OT recognizes some agencies may be at a more advanced stage of CASE "readiness" than others. These agencies may need to go further in automating development of information resources, and may be able to realize additional benefits of ('Lower') CASE despite the shortcomings. OT recommends that all agencies develop criteria that will allow them to:

  • Assess their readiness for CASE,
  • Determine an appropriate CASE strategy,
  • Select CASE products that fit their environments and readiness, and
  • Evaluate their CASE success.

CASE SELECTION CHECK LIST
The following checklist includes seven categories for assessing organizational "CASE readiness" and CASE technology requirements. The checklist is intended to assist organizations in developing criteria that fit their needs and environments. Samples of CASE selection criteria and an actual CASE assessment (developed for a Minnesota government organization) are included at the end of this guideline.

1. ASSESS YOUR ORGANIZATION:
Your organization should assess its "readiness" for CASE. This assessment should include your organization's commitment to CASE, willingness to change and ability to implement.

  • Need: How urgent is your organization's current need for CASE? Is your organization motivated to realize the benefits of CASE? The greater the (perceived) need, the greater the likelihood of your organization following through on implementing CASE.
  • Commitment: How committed is your organization to CASE? Is CASE a strategic initiative or a small test / pilot effort? Your organization's amount of effort, willingness to accept risk and ability to plan and execute over the long term will be affected by its commitment levels.
  • Investment: How much is your organization willing or able to invest in CASE?
  • Culture: How much risk is your organization willing to take? Is your organization willing to standardize how it represents its knowledge? Is your organization willing to expand the use of information technology into its business units? Is your organization currently using a development methodology? If not, is your organization willing and able to introduce a disciplined approach to development? Can your organization support an implementation and training effort for new technology? Is CASE appropriate to your current work environment or projects?
  • Existing environment: What levels of CASE knowledge, expertise, actual experience, education and skills does your organization currently have? Is CASE consistent with your organization's strategic business and information technology plans? Does your organization have technical standards in place? Does your organization have the existing technology - hardware and software - necessary to support a CASE environment?

2. ASSESS THE INITIAL CASE PROJECT:
The first CASE project should be selected with care so that it will:

  • Deliver expected benefits that are of value to your organization
  • Succeed in its mission
The first project should generally not be mission-critical, even though mission-critical applications may reap the greatest benefits from CASE. The first project should be mistake-tolerant, able to sustain risk, and able to tolerate a learning curve.

3. ASSESS THE ORGANIZATIONAL IMPACT OF CASE:
Generally the introduction of CASE will affect your organization in ways it may not have anticipated. Therefore, your organization needs to consider, and plan for, changes introduced by CASE such as:

  • Bringing more structure to the development environment
  • Requiring more emphasis on work groups instead of individual efforts
  • Requiring training in analysis methods, development methodologies, automated tools, modeling techniques, etc.
  • Requiring reengineering: automation of what your organization already has or does won't fix what is wrong or sloppy
  • Requiring rethinking of your organization's rules, policies and practices to tighten discipline. For example, an organization that changes from present methods of compiling code to automatic code generation must learn to change CASE tool information instead of changing the generated code.

4. ASSESS CHARACTERISTICS OF THE DEVELOPMENT ENVIRONMENT:
Your organization should consider its use of, or plans for, development approaches, technologies, or philosophies, such as:

  • Object orientation: may need an OO (Object Oriented) DBMS
  • Prototyping: may suggest a shorter term need to include Lower CASE
  • Client-server
  • Code generation
  • Rapid Application Development (RAD):
    • Reusable models
    • Reusable objects
    • Reusable applications
  • Repository requirements:
    • An external repository (separate from CASE tool)
    • A repository as part of its CASE tool
    • Functional requirements for a repository
  • Need for integrated tool sets
  • Need to integrate different CASE tools / platforms, etc.
  • Need for software measurements, such as:
    • Function Points (FP)
    • Lines of Code (LOC)
  • Intent to do reengineering or reverse engineering using CASE tools
  • Need to support Joint Application Development/Design (JAD) or Joint Requirements Planning (JRP)
  • Techniques or other approaches your organization needs to support
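Of the software measurements named above, Lines of Code is the simplest to illustrate. The sketch below uses one common counting convention (blank lines and pure comment lines excluded); this convention is an assumption for illustration, not a standard, and organizations should define their own counting rules before comparing results.

```python
# A minimal sketch of one software measurement listed above: Lines of Code
# (LOC). The convention here (skip blank lines and pure comment lines) is
# one common choice, assumed for illustration only.

def count_loc(source: str, comment_prefix: str = "#") -> int:
    """Count non-blank, non-comment lines in a source string."""
    total = 0
    for line in source.splitlines():
        stripped = line.strip()
        if stripped and not stripped.startswith(comment_prefix):
            total += 1
    return total

sample = """\
# compute a total
x = 1

y = 2
print(x + y)  # lines with trailing comments still count as code
"""
print(count_loc(sample))  # -> 3
```

Function Points, by contrast, are counted from a system's external behavior (inputs, outputs, inquiries, files) rather than from its code, and so require a defined counting practice rather than a simple script.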

5. ASSESS OTHER REQUIREMENTS OF YOUR ENVIRONMENT:
Your organization should also consider other aspects of the current or planned environment that may affect the selection or use of CASE tools. Some examples are:

  • What is the purpose for automated tools in this environment?
  • Who will be the users of the tools? How easy must they be to learn and use?
  • What compatibilities are required? Are there existing methodologies, project management systems, or technical standards that will be affected by CASE?
  • What support is required for existing environments: DBMS, programming environment, or testing tools?
  • What platforms are currently supported? Which are planned for the future?
  • What are your organization's technical requirements?
    • Security - access, virus protection, firewalls
    • Documentation and training; vendor support
    • Maintenance: by the vendor; by your organization
    • Graphical User Interface
    • Interfaces to other tools, repositories, DBMS, spreadsheets, etc. (currently available, or planned for the future)
    • Architecture of product and underlying DBMS
    • Openness: non-proprietary, portability, scalability
    • Methodology independence or dependence

6. ASSESS OTHER (FUNCTIONAL) REQUIREMENTS:
Your organization should determine functional requirements for CASE such as:

  • Diagramming & methodology capabilities:
    • Support for different types of diagrams: Entity/Relationship, Data Flow, State Transition
    • Support for different diagramming conventions and/or methodologies: Martin, Chen, Bachman, Yourdon, Gane & Sarson
  • Dictionary / repository capabilities:
    • Support for interfaces to external repositories
    • Capabilities of the dictionary
    • Functionality duplication between repositories and dictionaries
  • Configuration management / version control
  • Analysis and reporting features
  • User-defined reports and views
  • Linking, or decomposing, from one diagram, diagram type, or entity, to another
  • Other functional requirements needed by your organization
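Two of the functional requirements above, dictionary capabilities and version control, can be sketched together: each model object gets a dictionary entry carrying a definition, aliases, and a simple change history. The class and field names below are invented for illustration; commercial dictionaries and repositories are far richer.

```python
# Hypothetical sketch of dictionary and version-control capabilities from the
# checklist above. Names and fields are illustrative, not any product's API.

from dataclasses import dataclass, field

@dataclass
class DictionaryEntry:
    name: str
    definition: str
    aliases: list = field(default_factory=list)
    versions: list = field(default_factory=list)  # (version number, change note)

    def revise(self, note: str) -> int:
        """Record a change against this entry; returns the new version number."""
        version = len(self.versions) + 1
        self.versions.append((version, note))
        return version

entry = DictionaryEntry("LICENSE_HOLDER",
                        "A person or business issued a license by the agency.")
entry.aliases.append("LICENSEE")                   # alias capability
entry.revise("Initial definition")                 # version control
entry.revise("Clarified to include businesses")

print(entry.name, len(entry.versions))  # -> LICENSE_HOLDER 2
```

Even this toy version shows why the checklist asks about duplication between dictionaries and repositories: both want to be the system of record for exactly this kind of entry.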

7. HOW WILL YOUR ORGANIZATION EVALUATE ITS CASE SUCCESS?
Before beginning a CASE effort, your organization should determine how it will evaluate the results of its efforts. Your organization should begin with realistic expectations for success and a plan for evaluating the results. A 1992 study comparing satisfaction among CASE users showed that satisfaction levels were higher among those with realistic expectations, organizational commitment and concern for the impact of CASE on their organizations. The study also showed those who could quantify an improvement had a much higher rate of satisfaction than those who could not.

The following criteria should help your organization plan with success in mind:

  • Establish measurements for evaluating CASE effectiveness before beginning the effort: identify project, organizational, operational, or architectural goals that can be measured and set expectations for reaching the goals.
  • Select initial projects for which measurement goals can be applied. Set project milestones for evaluation of success.
  • Take a long term view. Your organization should set strategic goals for CASE: success does not come overnight, and real benefits come over time when CASE technology is applied to strategic applications.
  • Do a risk assessment before beginning the effort: anticipate the potential effects; determine how to minimize the risk of negative consequences and maximize the likelihood of positive consequences.
  • Do implementation plans for CASE. These can span several projects, or can be projects themselves (such as initial training efforts).
  • Select an initial project that demonstrates benefits your organization wants to achieve.
  • Select a strategy for selecting, acquiring and using CASE based on evaluating criteria identified by your organization.

FUTURE DIRECTIONS FOR CASE IN MINNESOTA GOVERNMENT

Statewide requirements for developing information resource models and integrating models across organizations have not been developed. Several pilot projects are underway that focus on data sharing across organizations or integration of data across organizations or computing platforms. As a result of these projects, needs have surfaced to develop data naming standards, establish consistency of modeling and analysis methods and address other related subjects.

CASE and other automated tools will play an important role in managing statewide data and information models in the future. As data and information modeling topics surface, and are evaluated from a statewide perspective, the role of automation will also be considered. However, automation requirements must be secondary to:

  • Maintaining consistent data administration requirements for all Minnesota government organizations in areas that affect data sharing and data reconciliation,
  • Ensuring quality information modeling and analysis methods are used throughout Minnesota government organizations, and
  • Capturing and maintaining business requirements of Minnesota government organizations.

Examples of some of the State's business requirements are:

  • Meeting public policy requirements, as defined by the Minnesota Government Data Practices Act and other Minnesota laws, to ensure that access to government data meets statutory requirements,
  • Achieving data sharing across organizations,
  • Achieving integration of information models and data across computing platforms

The long term strategy for CASE in Minnesota government must first address the above requirements, and second, evaluate automation technologies that meet those requirements.

SAMPLE 'CASE' SELECTION CRITERIA
Organizations need to establish criteria for CASE selection in several categories to reach their goals. The following is a set of sample goals and criteria for choosing CASE tools:

Investment Costs: Goal: Try to minimize initial CASE tool costs; minimize, or avoid, requirements for additional hardware or extensive training to support the tool.

Business Needs: Goal: Assuming the business need has been established, identify any project-specific or agency requirements, and incorporate these into the selection criteria.

Ease of Use: Goal: Select tools that are easy to use and easy to learn; avoid tools with rigorous methodologies to learn and manage; consider organizational and skills requirements for using and supporting CASE tools and managing information models.

Flexibility: Goal: Select tools that support various modeling types (data and process), methodologies and drawing conventions; select tools that allow users to control modeling rules.

Information Integrity: Goal: Select tools that include data dictionary support, version and change control, security measures and support multiple users (if needed).

Protect Investment: Goal: Select tools that allow import and export of model data. Avoid tools with proprietary methodologies, DBMS, or other constraining features.
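One way to apply the sample goals above is a weighted scoring model: each goal becomes a criterion with a weight reflecting its importance, each candidate tool is rated per criterion, and the weighted totals are compared. The weights and ratings below are invented for illustration; each organization would supply its own.

```python
# A sketch of weighted scoring over the sample selection goals above.
# Weights, criteria, and product ratings are invented for illustration.

WEIGHTS = {"investment_cost": 3, "ease_of_use": 2, "flexibility": 2,
           "information_integrity": 2, "protect_investment": 1}

def score(ratings: dict) -> int:
    """Ratings are 0-5 per criterion; a higher weighted total is better."""
    return sum(WEIGHTS[c] * ratings.get(c, 0) for c in WEIGHTS)

product_a = {"investment_cost": 4, "ease_of_use": 5, "flexibility": 3,
             "information_integrity": 4, "protect_investment": 2}
product_b = {"investment_cost": 2, "ease_of_use": 3, "flexibility": 5,
             "information_integrity": 5, "protect_investment": 5}

print(score(product_a), score(product_b))  # -> 38 37
```

A close score, as in this made-up comparison, is itself useful information: it signals that the decision turns on the weights, which forces the organization to be explicit about its priorities.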

CASE Tool: Capability / Requirement / Criteria (answers shown as Product #1 | Product #2 | Product #3; "(no entry)" marks a blank cell in the original chart)

1. Support for (preferred) diagramming convention (i.e., Chen, Martin, Bachman) (organization selects its preferred method): Yes | Yes | No
2. Price: (no entry) | $1545 plus $1395 each additional client copy | (no entry)
3. Client requirements (workstation): RAM / disk (in megabytes): 386/25, 4/80 min; 486/33, 8/80 rec. | 386+ | 2/1.5
4. Server requirements: NetBIOS compatible, Windows, Novell | Novell; 3Com; NetBIOS; Banyan; DECnet; LAN Manager; StarLAN | (no entry)
5. GUI requirements: Windows 3.1 | Windows 3.1 | Windows 3.0+
6. Product's current version #: 4.1 | 3.0 | (no entry)
7. Multi-user capability (if yes, how supported?): Yes, not specified | Yes, not specified | Yes, not specified
8. Ease of use: (no entry) | (no entry) | (no entry)
9. Security device: Serial # check | Block / diskette | Block
10. Define Entity capability: Define child; explode child; name attribute; explode attribute | Yes | Yes
11. Define Attribute capability: Yes, see above | Yes | Yes
12. Define Attribute without Entity capability: Yes | Yes | No
13. Define Attribute characteristics: (no entry) | (no entry) | Mandatory (cannot define attribute without characteristics)
14. Define Relationship capability: Yes | Yes | Yes
15. Alias capability: (no entry) | Yes | Yes
16. Masking capability: (no entry) | Hidden identifier for selecting specific objects for presentation | No
17. Canned reports: Yes | Yes; includes affinity analysis | Yes (6)
18. Custom reports: Yes, has Report Writer | Up to 125 | (no entry)
19. Can create matrices within tool: No | Yes (11); includes CRUD; can change in WIN.INI | No
20. Underlying DBMS structure: dBase III+ | dBase III | (no entry)
21. Links to repository: (no entry) | (no entry) | (no entry)
22. Import / export capability: dBase III, Clipper, R:Base, FoxPro | Yes - either CSV or ASCII text format; only one object type per export file; spreadsheet interface; additional module links to PowerBuilder | (no entry)
23. Rollup / partitioning: (no entry) | (no entry) | (no entry)
24. Change control method: Lock at file and record levels; password, privileges, permission, lockout | Can check objects in and out | (no entry)
25. Capacity: 256 objects / diagrams | (no entry) | (no entry)
26. Process modeling capability: Yourdon/DeMarco; Gane & Sarson; SSADM | Yourdon/DeMarco; Gane & Sarson; Ward & Mellor | No
27. Methodologies supported: Yourdon/DeMarco; Gane & Sarson; SSADM; Ward-Mellor; Hatley; Martin; Chen; Bachman; IDEF1X; MERISE; others | Martin; IDEF0; IDEF1X; Shlaer/Mellor; OOA&D; Coad/Yourdon; Booch '91 | No
28. SQL generator: All SQL: ANSI SQL; Clipper; dBase III; dBase IV SQL; SQL Server; Progress; Sybase; ORACLE; others (18) | Yes | Yes
29. Normalizer: (no entry) | (no entry) | Yes
30. Documentation / technical support: (no entry) | (no entry) | Poor
31. Other / comments: (no entry) | Has long term direction; more power in add-ons (like direct link to PowerBuilder) | Weak product; forced to put physical characteristics on attributes for entities and relationships
NOTE: The above table was derived from a CASE selection chart created for use by a Minnesota government organization in 1994. The organization selected functional and technical requirements to meet its needs. From a list of CASE products that met those requirements, the organization conducted a more detailed comparison.
BIBLIOGRAPHY
A CASE Investigation, National Education Training Group (Video Series), 1992
Principles of I-CASE, National Education Training Group (Video Series), 1992
Research Notes (related to CASE), Applications Development & Management Strategies, GartnerGroup, 1994
Hannam, Mary, Repositories Built on Kindergarten Lesson: Sharing Eases Development, Software Magazine, September 1994
Office of Information Technology, Department of Finance, State of California, Computer Aided Systems Engineering, Preferred Practice Series, June 1992
Rinaldi, Damian and Gannon, William G., Characteristics for CASE Success, CASE Product Guide, 1992, Software Magazine
Semich, J. William, Client / Server CASE: Oxymoron or Essential?, Datamation, Sept. 1, 1994
Software Magazine, CASE Product Comparisons, CASE Product Guide, 1992
Sprague, Christopher, Managing CASE Tools: It's More Than Just Using Them, CASE Product Guide, 1992, Software Magazine