Learning & Philanthropy: How Adopting a Learning Identity Helps WomenStrong Improve its Activities and Support
When was the last time you read about a process evaluation conducted by a grantmaking organization? If you love nerding out on these topics like I do, maybe it wasn’t so long ago. There are some foundations out there leading the way, like the Headwaters Foundation, which produces and publishes an annual learning book. I’m proud to work for an organization like WomenStrong that places a high priority on learning and transparency, both of which are aspects of our broader efforts to align with the values of trust-based philanthropy. To that end, I’m excited to share the results of our latest process evaluation with you today.
Why conduct a process evaluation now?
About a year ago, WomenStrong began implementing a new three-year Strategic Plan and adapted our program implementation approaches to respond to our learnings from the previous years. Part of this plan included revisions to our capacity strengthening program, which now co-creates tailor-made “action plans” to address each partner’s needs and aspirations for its own strengthening. As of this spring, the partners had been working on their finalized action plans for about six months, so we decided to leverage routine check-in calls and interview staff to understand folks’ experiences and to get everyone’s feedback about our newly expanded program activities. We also wanted to reflect on our theory of change, consider factors that may influence the outcomes we hope to see in the future, and identify areas where we could improve to give us the greatest chance of achieving our desired results.
What methods did we use in this evaluation?
First, our Knowledge & Learning team gathered information to inform our analysis:
- We reviewed all implementation data we had gathered to date. This included quantitative information about the volume of capacity strengthening support we had provided and participation in our Learning Lab activities, as well as qualitative notes captured by staff about these engagements.
- We individually interviewed every staff member.
- We used time during existing check-in calls with each grantee partner organization to ask their staff about their experiences with our support and activities, and to make sure that we were addressing the needs and topics our partners had identified as priorities, both for their individual capacity strengthening programs and in their Learning Lab activities.
Then our Knowledge & Learning team pre-processed all of this information, either quantifying it into charts or transcribing and coding it.
All of this information then fed into an all-staff participatory analysis workshop conducted in New York City in April. Over the course of four days, all of us had the opportunity to collectively draw meaning from the data – digging into the charts and code reports, working in small groups to identify key themes, and then discussing our thoughts with the full group. After we had generated key findings, we used emergent learning practices to create and prioritize a list of the most pressing questions we wanted to answer about our work going forward, and then worked in small groups to generate action hypotheses, or in our parlance, “tweaks.” At the end of that week, we had a list of 25 “tweaks” – concrete actions we felt had the potential to improve our support meaningfully, generated and owned by all staff.
Here’s what one staff member said about her experience:
“[It was an] awesome, great process with meaningful participation of all staff… Being able to jointly identify some key issues/challenges/gap[s] and most importantly ask ourselves some key questions that we are all going [to] work around to find answers to.”
–WomenStrong Staffperson
How have we applied the findings of the process evaluation so far?
Since our April workshops, small groups of interested staff have taken on each tweak and are finding opportunities to implement them in our ongoing work. The tweaks encompass a wide range of tactics and apply both to our internal processes and to our programmatic activities with partners. For example, as a result of the process evaluation findings, we are testing out new introductory activities for our weekly team calls to improve staff communication and build relationships. Simultaneously, we have designed and are currently testing a new way of gathering feedback about our Learning Lab partner calls, which focus on such topics as advocacy, communications, monitoring and evaluation, and feminist leadership, again, as per our partners’ expressed needs and desires.
Going forward, we will continue to document the results of these tweaks and will share our results back regularly with staff and partners. Our hope is that by cultivating an organizational ethos of growth and learning and embracing transparency, we can both improve our work and inspire others to give these approaches a try.
For more information, or to share your own learning approaches and experiences, please reach out to me via email here.