This post is by Michelle Hodara, Practice Expert - Applied Research and Postsecondary Success at REL Northwest.
Today’s post is written from the researcher perspective. Stay tuned: Thursday we will share the practitioner’s perspective.
Just as Portland Community College (PCC) in Portland, Oregon, is dedicated to helping all its students succeed, REL Northwest is committed to helping clients improve their programs and reach their goals.
It was a good fit, then, when both entities came together for the first external evaluation of Future Connect, a comprehensive scholarship and advising support program at PCC that seeks to change the lives of first-generation and low-income students.
Our work was very much a partnership. We collaborated with the Future Connect staff from the outset and, given the tight timeline (the work began in summer 2016, and the final report was published in July 2017), met regularly and stayed in close communication throughout the evaluation.
The evaluation did three main things:
1. Examined Future Connect’s impact on program participants’ college performance, progression, persistence, transfer, and completion
2. Gathered perspectives of college success coaches, participants, and alumni on the effectiveness of the program and how it might be impacting students’ academic and nonacademic outcomes
3. Provided a more complete understanding of the financial barriers Future Connect students must overcome to achieve college success
We found that the program is having a substantial positive effect on participants’ academic outcomes, including first-year GPA and credits earned, persistence to the second year of college, and three-year completion and transfer rates.
Based on our experience, here are some takeaways for researchers studying educational programs:
Really get to know and understand the program. Outside researchers bring fresh eyes and a new perspective, but you need to familiarize yourself with a program before you can understand what might be working and what needs improvement. One way to do so is to create a logic model, which provides a visual depiction of the activities and goals of a program. This is what we did with Pam Blumenthal and Josh Laurie, the leaders of Future Connect. In one of our first meetings, we spent several hours collaboratively creating a logic model for the program. This provided a solid foundation for the evaluation and ultimately drove the work.
If possible, use a mixed-methods approach. Quantitative and qualitative data contribute different kinds of information about a program. Whenever possible, collect both hard numbers and stories to understand whether a program is working and how. For example, although Future Connect has been quite successful overall, when we examined its effects on academic outcomes, we found that the gains for black students were smaller relative to other student groups. This prompted us to revisit the interview transcripts, and we found that coaches had spoken directly to this issue, noting that—compared with other student groups—some black students seemed to struggle with a sense of belonging. Thus, based on what we found in both the quantitative and qualitative data, we recommended that Future Connect increase its focus on fostering a sense of belonging for black students.
Ultimately, studying Future Connect allowed us to deeply learn about and spotlight a program that is making a real difference in the lives of students in our community. This successful partnership is now leading to additional opportunities for collaboration, and we look forward to watching Future Connect continue to grow and improve.