Q: Can we look at more than just DFW data when evaluating student success?

Answer: If the process or a program identifies a particular course as an opportunity to improve student success based on its DFW rate, the next level of analysis can disaggregate the data by instructor, day, time of day, and modality, and can include SEI data if the program faculty are comfortable with that approach. Course repeat success rates could also be considered. DFW rates are a starting point for a discussion about student success, not the only metric that will or should be used.

Q: Will programs be able to ask for detailed or record-level data that may help explain the metrics?

Answer: It will depend on the metric. While it will be possible to provide record-level data for student enrollment and completions — for example, which faculty are being counted for the program or which subject codes are included — there may be other data, like salary data, that won’t be shared at the individual record level. Programs will be able to request clarification of their data and will have support through the Morgantown Provost’s Office to get the clarity required to respond sufficiently to the data in the self-study.

Q: Can the workload committee select meaningful peers for their analysis?

Answer: Yes. The committee should collaboratively work with its campus leadership and the Morgantown Provost’s Office to determine the best set of peers for particular comparisons.

Q: Can the order of the self-study form be changed to prioritize and arrange student success questions before questions about faculty cost?

Answer: This change could be made, but the current order of the form does NOT reflect the questions' order of importance. Instead, the form follows the same order as BOG Rule 2.2 simply to ensure that those criteria are clearly met and that each question can be easily tied to one of the required areas. Given that only one comment was received about this, we are going to preserve the current order for the reason stated above.

Q: Should the self-study be more narrative in form or should programs simply provide data?

Answer: Both. The self-study emphasizes the narrative context the program can provide to explain its performance and its on-going initiatives and changes. It is also expected that the program will likely have additional data and evidence to support the claims it makes in its narrative.

Q: Can we improve faculty development regarding student mental health?

Answer: We will ask Associate Provost Melissa Lattimer what resources and trainings in this area can be shared or provided specifically to PSC.

Q: Could PSC develop more targeted technical degrees and, if so, in what areas?

Answer: Eventually, yes, but not until after academic transformation resolves and it is clear what resources might be available for that purpose.

Q: How were the program “groups” identified as many of those groups are actually sub-sets of larger programs at PSC?

Answer: The associate’s “groups” undergoing program review were identified collaboratively by the Provost’s Office at WVU Morgantown and the campus leadership at WVU Potomac State College. There were two criteria for identifying a “group”: 1) two or more majors existed that could potentially be served by a single major, as determined by CIP codes and common practices at peer associate’s-granting institutions; and 2) at least one of the majors in that potential grouping had low enrollment.

Q: How were the ad hoc committees and their charges identified?

Answer: Similar to how the program “groups” were formed, the ad hoc committees and their charges were the result of a collaboration between the Provost’s Office at Morgantown and the WVU Potomac State College leadership. The committees were formed to address the additional goals for academic transformation (beyond those specific to program review) that were also collaboratively determined.

Q: Will the timing of this review take into account on-going program changes and improvements?

Answer: Yes. The self-study process will allow programs and program groups to present for consideration any plans, changes, or improvements currently underway. The appeal process would give programs and program groups another opportunity to explain why a particular recommendation may not be well advised given on-going program changes and improvements.