Previously, I wrote that I would post things I learned in my database training so that others can use them as they see fit. Considering that I have now spent two days unable to fix a loop in the PHP code of my performance management information support system, allow me to take a moment to stop coding and write about what I learned and how I applied it to the project I am working on. This way, I can make good on my intent to share what I am learning while reviewing why I designed the database for my project the way I did.
First and foremost, information system development is, and should be, a deliberate and systematic process. I will not go into that process in detail, as it is a combination of system development work and management and organizational requirements, with the two sides needing to coordinate (and I mean TALK) closely if the organization intends to make the system truly responsive to its requirements. If the two do not, don't expect the system to be helpful (i.e., easy to use, able to do what the organization needs, and a SOLUTION INSTEAD OF AN ADDITIONAL BURDEN).
Nonetheless, in the interest of sharing basic knowledge (which is my current level on the matter), here are the basic phases as proposed by Hoffer et al. in their book, “Modern Database Management” (11th edition):
Project Development Phase – Purpose
Project ID and Selection – Develop a preliminary understanding of the business situation that has caused the request for a new or enhanced IS
Project Initiation and Planning – State the business situation and how an IS might help solve a problem or make an opportunity possible
Analysis – Analyze the business situation thoroughly to determine requirements, to structure those requirements, and to select among competing system features
Logical Design – Elicit and structure all information requirements
Physical Design – Develop all technology and organizational specifications
Implementation – Write programs, build data files, test and install the new system, train users, and finalize documentation
Maintenance – Monitor the operation and usefulness of the system, and repair and enhance the system
My project was personal, so it did not go through the first four phases (project identification through logical design) before reaching physical design. Rather, based on my understanding of the process to be supported (which, in case you missed it in the first paragraph, is performance management), I dived into coding immediately. I knew that the project would be big and that my basic requirements were not complete when I started it. Those considerations affected my database design, as I wanted it to be robust and scalable.
And with that, let's now move on to data modeling.
A data model is, for our purpose here, a representation of how an organization views, uses, and processes data. As such, it is a biased view of how the organization sees things, since those things are represented by data. Data modeling, then, “establishes the range and general contents of the database” in order “to create an overall picture or explanation of organizational data,” as Hoffer et al. explained in their book.
For a clearer (and nuanced) example, let's take a look at performance management in the Philippine bureaucracy.
All government agencies report to multiple oversight agencies, and performance management is no exception: it is not reported to only one agency. For those who do not know, below are three oversight agencies that every other agency reports to, with their corresponding reporting arrangements:
Department of Budget and Management (DBM) – for organizational performance
Civil Service Commission (CSC) – for individual performance of rank-and-file
Career Executive Service Board (CESB) – for individual performance of managers
As you can see, performance of any one person will need to be aligned with all of these, somehow.
DBM, through the Organizational Performance Information Framework, or OPIF, looks at agency-level organizational performance, which is measured by how well an agency performs against its set Major Final Outputs (MFOs), which are, among others, an agency's mandated goods and services. These MFOs are clearly identified in the national budget enacted by Congress through the General Appropriations Act, under the Performance Informed Budget, or PIB (as such, each MFO has a budget, similar to marketing programs in the private sector).
CSC looks at the performance management of the rank-and-file, which is currently measured in terms of each individual's accomplishment of his or her performance contract, or PC. The PC, which sets the targets (similar to the agency's MFOs), is the basis for scoring accomplishments during the performance evaluation period. Currently, this performance management framework is called the Strategic Performance Management System (SPMS).
CESB, through the Career Executive Service Performance Evaluation System, or CESPES, looks at the performance management of the bureaucracy's managers (i.e., Assistant Director and higher). Similar to the CSC's SPMS, the bureaucracy's managers craft a performance contract with their supervisors setting performance targets, which serves as the basis for evaluating their performance at the end of the rating period, currently one calendar year. (Under the CSC's SPMS, by comparison, the rating period is usually six months, although an agency can adopt a shorter period of at least three months or a longer one of up to one year.)
Here's a summary of the basic aspects of the three performance reporting systems:
System: DBM - OPIF
Coverage: Organization
Performance report prioritized: Accomplishment of agency-level, agency-nominated MFOs
Rating period: Quarterly

System: CSC - SPMS
Coverage: Rank-and-file employees
Performance report prioritized: Accomplishment of agreed-upon performance targets, which should be aligned with agency MFOs
Rating period: Semestral, with provision for shorter (minimum of 3 months) or longer (maximum of 12 months) periods

System: CESB - CESPES
Coverage: Managers
Performance report prioritized: Accomplishment of agreed-upon performance targets, which should be aligned with agency MFOs, with additional consideration for leadership and innovative outcomes
Rating period: One calendar year
Based on this, you can see that if we are going to create a performance management information system, it should incorporate these considerations.
Additionally, the CESB created a hierarchy of performance objectives, as follows (based on Section 3 of CESB's Resolution 1136):
Organizational [Strategic] Objective – broad statement of goals of the organization achievable in the medium term (3-5 years) in order to achieve its mission and vision.
Office Performance Objective – enumerates the specific area/s of performance that is/are critical in accomplishing the strategic objective/s of the organization. Multiple Office Performance Objectives may contribute to an Organizational Objective.
Ratee Performance Commitment – refers to the Ratee’s strategy or plan of action that he/she personally commits in order to contribute to the performance objectives.
In the Presidential Management Staff (PMS), we harmonized the three, in terms of performance objectives, by assigning the agency's MFOs as the CESPES Organizational Objectives and the individual performance targets in the performance contract as the Ratee Performance Commitments. The Office Performance Objectives are derived from clustering related individual performance commitments, which are organized by groups or units working toward common and related outputs or outcomes.
This harmonization was accomplished through the Results-Oriented Performance Management System, or ROPMS (the system to which the PMIS will provide information support).
In terms of organizing the hierarchy of performance objectives, one can see that:
The agency must have at least one MFO (for OPIF) / Organizational Objective (for CESPES).
An MFO / Organizational Objective may have one or more Office Performance Objectives.
An Office Performance Objective should be aligned with only one MFO / Organizational Objective (this is for purposes of performance evaluation: we do not want one action to be rated more than once because it belongs to different MFOs).
Each of a Ratee's (whether rank-and-file or manager) performance commitments should be anchored on one (and only one) Office Performance Objective.
From a database management perspective, it would look like this:
Organizational Objective table:
Organizational Objective ID – VARCHAR* 15**
Long Description – VARCHAR 500

*VARCHAR is one of the data types used in programming and database design; it stands for "variable character," i.e., a variable-length string of characters. Among the other types are DATE, INTEGER (INT), and BOOLEAN.
**The number of characters is only a sample. For the Organizational Objective ID, it is only 15 because the ID is just an acronym of the long name indicated in the Long Description field.
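To make the table concrete, here is how it might be declared in MySQL-style SQL. This is only a sketch: the table and column names below are illustrative, not the exact ones used in my project.

```sql
-- Illustrative sketch of the Organizational Objective table
CREATE TABLE organizational_objective (
    org_objective_id VARCHAR(15) NOT NULL,  -- short acronym of the objective / MFO
    long_description VARCHAR(500),          -- full statement of the objective
    PRIMARY KEY (org_objective_id)
);
```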
Office Performance Objective table:
Office Performance Objective ID – VARCHAR 15
Long Description – VARCHAR 500
Organizational Objective ID (Foreign Key)* – VARCHAR 15

*A foreign key is a "constraint," that is, a control in the database which requires that, for an office performance objective to be added, it must be connected to an existing organizational objective.
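In MySQL-style SQL, the same table with its foreign key constraint might look like the sketch below (again with illustrative names). The constraint is what makes the database reject an office performance objective that points to a non-existent organizational objective.

```sql
-- Illustrative sketch of the Office Performance Objective table with its foreign key
CREATE TABLE office_performance_objective (
    office_objective_id VARCHAR(15) NOT NULL,
    long_description    VARCHAR(500),
    org_objective_id    VARCHAR(15) NOT NULL,  -- must match an existing organizational objective
    PRIMARY KEY (office_objective_id),
    FOREIGN KEY (org_objective_id)
        REFERENCES organizational_objective (org_objective_id)
);
```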
Ratee Performance Commitment table:
Ratee Performance Commitment ID – INT
Long Description – VARCHAR 500
Office Performance Objective ID (Foreign Key) – VARCHAR 15
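The corresponding sketch for this table, under the same illustrative naming, would be:

```sql
-- Illustrative sketch of the Ratee Performance Commitment table (first version)
CREATE TABLE ratee_performance_commitment (
    commitment_id       INT NOT NULL,
    long_description    VARCHAR(500),
    office_objective_id VARCHAR(15) NOT NULL,  -- anchored on exactly one office performance objective
    PRIMARY KEY (commitment_id),
    FOREIGN KEY (office_objective_id)
        REFERENCES office_performance_objective (office_objective_id)
);
```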
As I mentioned before, when I dived into coding, I was aware that the system would have to scale, meaning it should be able to accommodate additional requirements or conditions beyond a reasonably medium-term set of assumptions. For example, while the PMS' current MFO on decision inputs focuses on final briefing kits (e.g., documents for the President's events or meetings) and action on requests from the public, these may change in the future, and the system should be able to accommodate that change (just by changing the type of output to be counted).
As such, I had to modify the tables as follows:
Organizational Objective table:
Organizational Objective ID – VARCHAR 15
Long Description – VARCHAR 500
Date Start – DATE*
Date End – DATE*

*These columns indicate that an organizational objective may cease to be in effect. For record-keeping purposes, however, we keep it in the system instead of deleting it (deleting it would corrupt the relationship structure, since office performance objectives would then exist without organizational objectives).
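Continuing the illustrative MySQL-style sketch, the revised table could be declared as below, and a ceased objective would be "closed" with an UPDATE rather than removed with a DELETE. The ID and date used here are hypothetical.

```sql
-- Revised Organizational Objective table with an effectivity period (illustrative)
CREATE TABLE organizational_objective (
    org_objective_id VARCHAR(15) NOT NULL,
    long_description VARCHAR(500),
    date_start       DATE,   -- when the objective takes effect
    date_end         DATE,   -- left NULL while the objective is still active
    PRIMARY KEY (org_objective_id)
);

-- When an objective ceases, it is closed rather than deleted, so office
-- performance objectives that reference it remain valid:
UPDATE organizational_objective
   SET date_end = '2015-12-31'     -- hypothetical end date
 WHERE org_objective_id = 'MFO1';  -- hypothetical objective ID
```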
Office Performance Objective table:
Office Performance Objective ID – INT, AUTO-INCREMENT*
Long Description – VARCHAR 500
Organizational Objective ID (Foreign Key) – VARCHAR 15

*Changed to INTEGER, meaning the system assigns the number; this ensures that, upon nomination of an office performance objective, its ID is unique. AUTO-INCREMENT indicates that the system increases the numerical value automatically for each new entry.
It may also be noted that the office performance objective does not have date columns of its own. This is because its effective period is the same as that of its parent organizational objective.
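In the illustrative sketch, the revised table would look like this; note that no ID is supplied on insert, since the database generates it (the values in the INSERT are hypothetical):

```sql
-- Revised Office Performance Objective table with a system-assigned ID (illustrative)
CREATE TABLE office_performance_objective (
    office_objective_id INT NOT NULL AUTO_INCREMENT,  -- the database assigns the next number
    long_description    VARCHAR(500),
    org_objective_id    VARCHAR(15) NOT NULL,
    PRIMARY KEY (office_objective_id),
    FOREIGN KEY (org_objective_id)
        REFERENCES organizational_objective (org_objective_id)
);

-- The ID column is omitted from the INSERT; MySQL fills it in automatically:
INSERT INTO office_performance_objective (long_description, org_objective_id)
VALUES ('Timely submission of briefing kits', 'MFO1');  -- hypothetical values
```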
Ratee Performance Commitment table:
Ratee Performance Commitment ID – INT
Long Description – VARCHAR 500
Office Performance Objective ID (Foreign Key) – INT*

*The data type is changed to integer because a foreign key must have the same data type as the column it references.
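The revised sketch for this table simply mirrors that change in the foreign key column:

```sql
-- Revised Ratee Performance Commitment table (illustrative); the foreign key
-- is now INT so that it matches the data type of the column it references
CREATE TABLE ratee_performance_commitment (
    commitment_id       INT NOT NULL,
    long_description    VARCHAR(500),
    office_objective_id INT NOT NULL,
    PRIMARY KEY (commitment_id),
    FOREIGN KEY (office_objective_id)
        REFERENCES office_performance_objective (office_objective_id)
);
```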
The above tables, which make up the physical database design, should be able to cope with changes in an agency's MFOs. Also, because the office objectives are separate from the offices themselves, an organizational restructuring would not require changing the code (assuming the user interface provides a facility for reassigning office objectives, which I built for my project). The image below shows how the three tables relate:
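In addition to the diagram, the relationships can be read off a join query. Here is a hedged sketch using the illustrative names from the earlier snippets, listing each commitment together with its office and organizational objectives:

```sql
-- Walk down the hierarchy: commitment -> office objective -> organizational objective
SELECT oo.org_objective_id,
       opo.long_description AS office_objective,
       rpc.long_description AS commitment
  FROM organizational_objective oo
  JOIN office_performance_objective opo
    ON opo.org_objective_id = oo.org_objective_id
  JOIN ratee_performance_commitment rpc
    ON rpc.office_objective_id = opo.office_objective_id
 WHERE oo.date_end IS NULL;  -- only objectives that are still in effect
```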
However, if the DBM or CESB changes the way it looks at performance and budgeting (i.e., changes the OPIF altogether), it would require a major review and updating of the system and the database structure.
The above narrative, hopefully, demonstrates sufficiently how database design works. (Note that this covers only one aspect of the process; your own process would most probably have more than one, and each would have to be considered in relation to other tables, databases, processes, and user requirements.) It requires an understanding of the business' requirements and a look at its operations, including the information and data required for and gathered from those operations, not just a purely technical appreciation of database design or programming. As such, managers should also take the time to understand this (not at the technical level, but as an appreciation of the opportunities and limitations) so they can make informed decisions and not be limited by the technical people's say-so.
Final Note: The above narrative is a simplified version of the tables. For security reasons, some additional columns, reflecting additional organizational requirements, have been removed.