Standardized Procedure to conduct a Return on Investment Analysis on Programs within the Department of Education
Charles C. Worrilow, Jr.
Dr. Hubert K. Huang, MS, MEd
Department of Education, Lehigh Valley Health Network
Research Scholar Program Mentor
This study aims to illustrate the importance and process of the return on investment (ROI) methodology as applied to programs within the Department of Education (DOE). The ROI methodology is a crucial tool to implement, not only in the DOE, but across the entire Lehigh Valley Health Network. Because of the limited data and knowledge available in the DOE, the ROI methodology has been nearly impossible to implement. This study provides a clear-cut method for conducting such evaluations and, eventually, performing an ROI analysis. The ROI analysis (Level 5 evaluation) is one of the five evaluation measures in the ROI methodology; it is considered the most important and the most complex to evaluate. The devised template provides the necessary tools to complete the chain of impact and determine the return on investment of virtually any program within the Department of Education.
The study began with the completion of the Villanova University ROI certificate program, taken to gain a better grasp of the material and the project at hand. The course consisted of six books and six discs that took the learner from the fundamentals of ROI through the communication of results. Throughout the six-part course, detailed notes were taken and organized chronologically; this process took about four weeks. After the fourth week, key personnel in leadership positions within the Department of Education were consulted. These consultations provided a knowledgeable foundation of past, present, and future educational programs in the DOE, which allowed the study to be built around the department's needs and made the resulting template more relevant and credible. Following the meetings with key personnel, a step-by-step, detailed procedure was constructed in Microsoft Word, and from that template a more fluid, clear-cut flow chart was developed. Proper documentation for each step of the ROI methodology was hyperlinked into the document; this not only provided appropriate documentation but also streamlined the process for future programs in the DOE.
In past years, return on investment has been a term used mainly in the business world; today it is becoming more prevalent in the health network. Proving a program's value to the network at large, to senior management, or to key stakeholders is crucial in today's public health environment. When a program or project is unable to prove its value or worth, it is seen as expendable and not necessary for the continued success of the business. The ROI methodology is a performance measure used to evaluate the efficiency of an investment, or to compare the efficiency of a number of different investments (here, educational programs within the DOE). [1, 2] The National Association of Chronic Disease Directors defines ROI in the following manner:
ROI is the economic indicator—meaning, you are dealing with money and costs. Basically, return on investment shows the financial benefits derived from having spent money on developing or revising a system or program. The intent of ROI is to measure how effectively the organization or program is using its money. 
Calculating the ROI of programs within the Department of Education is crucial not only for demonstrating the value of these educational programs, but also the significance of the DOE as a whole. The ROI Institute's book ROI Fundamentals: Why and When to Measure Return on Investment describes six benefits of the ROI methodology: "the measurement of the program's contribution, priorities to be established for high-impact programs, improves the effectiveness of all programs, earns respect from senior management team and program sponsor, and finally will demonstrate that the program or project is an investment, rather than an expense." With regard to the DOE, little has been done to obtain the evaluation data needed to accurately determine many programs' value or worth to the network.
Currently, the Department of Education determines the effectiveness of its programs from level one reaction data, collected through surveys and questionnaires. Although this is a good starting point for gauging a training program's effectiveness, reaction data are not sufficiently credible on their own. They are very useful for determining the relevance of the material, the facilitators' competence, and the usefulness of the program. However, these data alone will become a problem when senior management wants to extend data collection and conduct ROI analyses: level one evaluation data are not sufficient to support a credible ROI analysis.
Other, more detailed and complex data collection methods are illustrated in the study. These levels of evaluation are: level two, learning and confidence; level three, application and implementation; level four, impact and consequences; and level five, return on investment. These methods extend the data collection process, allowing for a credible ROI analysis. Level two evaluation data focus on what participants learned during the program. A variety of follow-up methods are used to collect level three data, which determine whether participants have applied what they learned in their work environment; level three evaluation data are important for gauging the success of that application. Level four evaluation data represent the actual results achieved by the participants, such as outputs, sales, costs, quantity, time, or customer satisfaction, and are good measures of the program's business impact. Finally, level five evaluation data represent the ultimate level of evaluation; they answer the question, "Do program benefits exceed program costs?"
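The level five question above reduces to two standard calculations in the ROI methodology: the benefit-cost ratio and the ROI percentage. The sketch below illustrates both; the dollar figures are hypothetical examples, not data from any DOE program.

```python
# Illustrative sketch of the Level 5 (ROI) calculation; all figures
# below are hypothetical.

def benefit_cost_ratio(benefits: float, costs: float) -> float:
    """BCR = program benefits / fully loaded program costs."""
    return benefits / costs

def roi_percent(benefits: float, costs: float) -> float:
    """ROI (%) = (net program benefits / fully loaded program costs) x 100."""
    return (benefits - costs) / costs * 100

# Hypothetical program: $90,000 in monetary benefits against
# $60,000 in fully loaded costs.
bcr = benefit_cost_ratio(90_000, 60_000)   # 1.5
roi = roi_percent(90_000, 60_000)          # 50.0

print(f"BCR = {bcr:.2f}, ROI = {roi:.1f}%")
```

An ROI above 0% (equivalently, a BCR above 1) means program benefits exceed program costs; in this hypothetical case the program returns $0.50 for every dollar invested, after costs are recovered.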
This study demonstrates the steps necessary to prepare, plan, conduct, and implement an ROI analysis. The study provides the tools necessary to complete each of these essential steps to perform a credible analysis.
Understanding that not all programs are ideal candidates for such an analysis is crucial. The program or project must meet specific criteria for an ROI analysis to be appropriate. In addition, it is critical to follow the twelve guiding principles of the ROI methodology to acquire the most credible results:
1. When conducting a higher-level evaluation, collect data at lower levels.
2. When planning a higher-level evaluation, the previous level of evaluation is not required to be comprehensive.
3. When collecting and analyzing data, use only the most credible sources.
4. When analyzing data, select the most conservative alternative for calculations.
5. Use at least one method to isolate the effects of a project.
6. If no improvement data are available for a population or from a specific source, assume that little or no improvement has occurred.
7. Adjust estimates of improvement for potential errors of estimation.
8. Avoid use of extreme data items and unsupported claims when calculating ROI.
9. Use only the first year of annual benefits in ROI analysis of short-term solutions.
10. Fully load all costs of a solution, project, or program when analyzing ROI.
11. Intangible measures are defined as measures that are purposely not converted to monetary values.
12. Communicate the results of the ROI methodology to all key stakeholders. [1, 3]
To implement the standardized ROI procedure, the Department of Education must establish a credible and accurate record-keeping system: to precisely calculate the ROI of future DOE programs, evaluation data must be accurately collected and documented.
Throughout the study, multiple challenges and limitations were present. First, the limited information available on the education programs in the DOE presented several challenges. When developing a standardized ROI procedure tied directly to these programs, it is important to know the data collection methods and procedures used in the past in order to identify which methods are feasible for conducting an ROI analysis; the standardized procedure should include only data collection methods the DOE can realistically carry out. With limited knowledge of past methods, it was a challenge to create a procedure that directly corresponded to the needs of the DOE.
In addition, time proved to be a challenge in completing the procedure. After the conclusion of the ROI certificate program, only four weeks remained to develop the standardized ROI procedure, which prevented a more detailed procedure from being constructed.
Lastly, no measurement methods were in place to capture level 2, 3, 4, and 5 evaluation data. As a result, these methods had to be devised and incorporated into the standardized ROI procedure.
[1] Phillips, P., & Phillips, J. (2008). "Essentials of ROI Methodology." Phillips ROI Methodology, The ROI Institute, Inc. Retrieved June 6, 2015.
[2] Phillips, J., & Phillips, P. (2008, September 9). The Basics of ROI. Retrieved July 21, 2015.
[3] Sterling, E. (2009). A Practical Guide to ROI Analysis. Retrieved July 8, 2015.
Published In/Presented At
Worrilow, C., Jr., (2015, July 31) Standardized Procedure to conduct a Return on Investment Analysis on Programs within the Department of Education. Poster presented at LVHN Research Scholar Program Poster Session, Lehigh Valley Health Network, Allentown, PA.
Best Quality Improvement Project - Second Place