Last week I attended two conferences in Vancouver: the Community University Expo and the Canadian Evaluation Society’s Annual Meeting. Both were engaging and inspiring, and both offered me a chance to chat about some of the work I’ve been doing with a broader scientific and research community.
At the beginning of the week, I attended day one of the Community University Expo’s Community Jam. Held at Simon Fraser University in Surrey, delegates from across the country came together to learn about community partnerships in action in the city of Surrey, and beyond. This also gave the team an opportunity to talk about ICON and how we’ve integrated social innovation, entrepreneurship, community-engaged scholarship, transdisciplinary learning, experiential learning, and training in knowledge translation and transfer and knowledge mobilization into a set of unique undergraduate courses at the University of Guelph.
On Tuesday I joined Nancy Snow (Assistant Professor, OCAD) as part of a panel discussion at the Canadian Evaluation Society’s Annual Meeting. Moderated by Dr. Beth Snow (Program Head – Program Evaluation, Centre for Health Evaluation and Outcome Sciences), we spent a 90-minute session in front of a full house of more than 100 conference attendees, offering ideas on how computer science and graphic design might benefit the domain of program evaluation. Titled A Graphic Designer, an Evaluator, and a Computer Scientist Walk Into a Bar: Interdisciplinary for Innovation, the session was structured around three main questions that Dr. Snow asked (paraphrased):
- Identifying the problem is a priority
One of the challenges that evaluators face is stakeholders holding preconceived ideas about how an evaluation should be conducted or about what the findings of an evaluation should be. What can you share from your disciplines that can help evaluators deal with this challenge?
- Data collection can be difficult
Data collection is a challenge due to limited resources, a limited number of participants, and participant fatigue (among other things). How might computer science and design help evaluators with the challenge of data collection given these limitations?
- Mobilizing knowledge isn’t easy
How can we ensure that our findings and recommendations are used? What insights can evaluators leverage from graphic design and computer science to help promote the translation of our findings into action?
Over the course of the 90-minute session, Nancy and I answered these questions (and more from the audience) based on our experiences and understanding in our respective research domains. It was a fantastic conversation that included discussion of passive data collection tools and methods, participatory design, journey maps, user models, alternative data collection methods, gamification, pulse surveys, and breaking down silos between disciplines earlier in a student’s career.
Based on follow-up questions and popular demand, Nancy, Beth, and I are now going to spend some time writing up the various ideas we presented as part of the panel discussion. Stay tuned!