Selecting an assessment tool is often only half the battle; implementing it effectively to glean valuable data can involve a learning curve that consumes precious time. This webinar series aims to help you get ahead by using the right tools the right way from the start.
Presenters with diverse backgrounds and varied experiences will provide an overview of the tools, discuss how to create instruments, and give examples of how the tools can be used.
Join us to learn:
- An overview of common assessment tools: rubrics, surveys, and focus groups
- The benefits and uses of each tool
- How to best utilize these tools in your own work
Dr. Tisha M. Paredes
Dr. Tisha M. Paredes was the Assistant Vice President for Institutional Effectiveness and Assessment at Old Dominion University. Prior to accepting this position in 2015, she served as the office’s Research Associate, Senior Research Associate, and Director of Assessment.
Tisha helped to promote institutional effectiveness activities at all levels of the University. She has presented workshops and consulted on topics including institutional effectiveness, academic and administrative assessment, general education assessment, QEP assessment, and SACSCOC compliance, and has served on several on- and off-site committees. Her book, Using Focus Groups to Listen, Learn, and Lead in Higher Education, was published in summer 2018. In June 2021, Tisha stepped away from ODU to focus on other pursuits.
Director of Social Science Research Center
Old Dominion University
Dr. Tancy Vandecar-Burdin has been with the SSRC since its inception in 1998. She has more than 20 years of experience with survey research (including phone, mail, and web surveys) as well as conducting focus groups and interviews.
She works with faculty and other research partners to support their research and data collection needs and is responsible for the day-to-day management of the SSRC’s operations, as well as serving as project manager or principal investigator for most of the SSRC’s projects. She has taught undergraduate courses in criminal justice and graduate courses in public policy and survey research. She has managed survey research and other evaluation activities on a variety of topics including: programs/services for university and community college STEM/cybersecurity students, awareness of campus suicide prevention resources, perceptions of mental health and substance abuse services in Virginia and New Hampshire, general quality of life in southeast Virginia, and the experiences of families with receiving early intervention services in Virginia. Dr. Vandecar-Burdin also serves as the chair of the ODU Institutional Review Board and is a member of the American Association for Public Opinion Research (AAPOR), the American Evaluation Association (AEA), and the Association of Academic Survey Research Organizations (AASRO).
Dr. Popp has over twenty years' experience as an educator, beginning with 14 years as a K-12 teacher, followed by 8 years as an Associate Dean for Assessment and Institutional Effectiveness at a small private college, where she was the founding director of the Secondary Education Program.
In her time there she assisted with multiple institutional accreditation self-studies with the Higher Learning Commission (HLC), the Association for Biblical Higher Education (ABHE), and programmatic accreditation with the Iowa Board of Educational Examiners. She formerly served as a peer reviewer and site visitor for ABHE. She holds an EdS in Educational Administration and EdD in Higher Education Leadership, both from Missouri Baptist University. In her two years working at Weave Education she has assisted hundreds of institutions in developing high-quality and sustainable institutional effectiveness practices.
Assessing learning, development, and effectiveness can be difficult.
With so many tools available to faculty and staff, practical questions arise:
- Which are the best tools to use?
- When should we use the tools?
- How do we use them to gather relevant and actionable data?
Looking for help demonstrating learning?
This session will cover ideas and strategies for addressing the following:
- How do you know if learning has occurred or improved?
- Did students grow as a result of a program or event?
- What tool is best for determining the effectiveness of a program or unit?
- How do you develop reliable and authentic tools for the task at hand?
We look forward to seeing you!