Once the Educational Goals have been defined, the next phase of the process is to gather data. However, it is not necessary to collect data on all courses, and depending on the approach a unit selects, data collection may span anywhere from two terms to two years.
Approaches to data collection
The graphic shows the Educational Goals data collection and assessment process as a part of the external review process.
External review timeline
Unit self-study: Between October and December of the year prior to a unit’s external review, units are to produce a unit self-study which is circulated to external reviewers before their site visits. Assessment of Educational Goals should be included in the unit self-study (see Sections 3.2.a. & b. in Senate Guidelines for External Reviews of Academic Units).
Unit action plan: After reviewing the external review report, units are expected to produce a unit action plan to address the recommendations from reviewers and to plan for the next cycle.
Progress report: In year 4, units are expected to produce a progress report for internal use and reporting to the SFU administration. As part of this document, units are expected to report on their assessment of all Educational Goals.
Data collection for Educational Goals assessment
In the above graphic, the approach presented for data collection and analysis is sequential, proceeding through the phases of definition, collection, analysis, and action. It allots roughly equal time to each phase and enables units to focus on each phase independently.
However, depending on units’ assessment plans, resources, time allocation, and preference, variations of this approach are equally appropriate.
For instance, units may decide to assess each Educational Goal individually, defining, collecting, and analyzing data for one goal before moving on to the next, and repeating this process until all Educational Goals are assessed.
Another variation is to collect data on all Educational Goals simultaneously, analyzing the data as it is gathered.
There are many other variations possible. Educational developers at the Centre for Educational Excellence can assist units in their plan for assessment.
An assessment plan outlines the process for collecting and analyzing data about the effectiveness of student learning within a program. It includes a statement of Educational Goals, selected direct and indirect methods of assessment to best evaluate program effectiveness, a plan for collecting and analyzing data, and for implementing change.
Example of an assessment plan
Units can adapt and customize this template to meet their needs.
Definitions of elements in the assessment plan
Program level Educational Goal: Identifies the knowledge, skills, abilities, etc., that students should be able to demonstrate upon completion of the program. The goals need to be specific and measurable.
Breakdown of Educational Goals: Sometimes it can be helpful to break down program level Educational Goals into smaller units that can be operationalized. This may make it easier to find data in the curriculum that is relevant to the assessment of the program level Educational Goals.
Data source: Programs should identify where in their curriculum (e.g., course number) data is being gathered to assess the specific Educational Goals. Note that not all courses need to be assessed. Courses towards the end of a program can provide a measure of student achievement, and first-year courses can provide insight on student skills at program onset. Curriculum mapping is useful for identifying which courses Introduce, Emphasize, Reinforce, and Apply the skills and knowledge identified in an Educational Goal.
Direct assessment: Direct assessment (also called curriculum-embedded assessment) requires students to demonstrate their knowledge, skills, etc., and faculty members to then assess whether/how well students are achieving/have achieved a program goal. Examples of direct assessment include artistic work products, case studies, exams, juried performances, oral presentations, papers, and portfolios.
Indirect assessment: Indirect assessment gathers perceptions, often through reflection questions, of whether/how well students are achieving/have achieved a program goal. Examples of indirect assessment include surveys of alumni, employers, and current students; exit and focus group interviews; enrollment and retention data (e.g., from Institutional Research and Planning); and job placement data. Indirect assessment complements the data collected from direct measures and cannot stand alone as the sole measure of student performance.
Years/semester of data collection: Programs should identify when (in which year or semester) the data is being gathered.
Major findings: Programs should identify the major findings after analyzing the data collected.
Actions that resulted from findings: Programs should provide evidence that the findings have been used to further develop and improve student achievement of program level Educational Goals (i.e., actions that were taken as a result of data collection and analysis). It is also important to state when findings provide evidence that students are successfully achieving a program level Educational Goal.