Thursday, May 3, 2012

Tracking Attendance at Co-Curricular Activities

Jeff Lail (@jefflail), a Coordinator of Programs in the Office of Campus Activities and Programs at the University of North Carolina at Greensboro, posted a video describing an initiative he is involved in at UNCG to track attendance at university co-curricular events as a step toward measuring students' longitudinal growth.
Jeff asks for feedback on the program, and in that spirit, I’d like to offer a few thoughts.
It’s possible to watch Jeff’s video and come away with some potentially significant assumptions; I’d like to address two of them.
Jeff comments that the learning student affairs aims to facilitate takes place longitudinally, and thus is not conducive to testing as in the classroom.  Jeff’s point is valid and accurate, yet it is a little incomplete; Student Affairs does attempt to facilitate more pervasive improvements in developmental domains, such as identity, values, citizenship, interpersonal relationships, self-management, etc. 
Here is where I see the first assumption.  It unintentionally implies that academic learning is not longitudinal.  It very much is longitudinal, though.  Generally speaking, the fact that the academic domains of knowledge are fairly well defined and there is a structured, progressive curriculum affords the convenience of testing.  But there are clear, longitudinal goals in terms of what skills and knowledge a student should achieve by the time they graduate. 
These involve skills related to cognitive complexity and critical thinking, scientific inquiry, information literacy, communication, etc., as they are applied within that particular field.  All of these are just as complex and intertwined as those we look at in Student Affairs.  Indeed, they are intertwined with each other as well!  One cannot understand one’s identity without engaging a sophisticated degree of cognitive complexity, and attempts to reflect on one’s identity, purpose, values, emotions, and so on are themselves applications of complex cognitive thinking.
Thus, juxtaposing academic and student affairs learning as occurring on different timelines is more misrepresentative than I suspect Jeff intended it to be.  The more significant difference is that academic learning involves more domain-specific knowledge and skills, while student affairs learning involves more domain-general knowledge and skills.
The second assumption, a flip side of the first, is that the domains of learning student affairs attends to cannot be deconstructed into an organized and progressive curriculum.  While I certainly agree that this practice is not mainstream in our field, that does not mean it cannot be done.  Indeed, there are movements within our field to develop more structured and rigorous curricula and assessment methods.  One example is the Residential Curriculum Institute and the various implementations of Learning Reconsidered and Learning Reconsidered 2.  There is even the fresh face of the New Leadership Alliance for Student Learning and Accountability, which aligns institutions committed to improving assessment at all levels of the institution, including Student Affairs, in order to develop ways of improving student learning.
We can and need to reconsider our work through this lens.
There are some significant privacy concerns regarding the way the program is implemented.
Jeff explains that the program gives any student who is running an event a card-reading device that captures student ID numbers through an iPad app.  This is absolutely important data to collect, and I would offer that every institution should strive to track attendance at every possible event; I’ll explain why later.  But the fact that the program provides this to any student running an event means that students are essentially gaining access to other students’ ID numbers, which are protected under FERPA.
There is a knee-jerk argument that by swiping their card, the student is essentially giving permission for the student coordinating the swiping to have access to the ID number.  That argument falls apart quickly though.
  • The student did not give consent in writing
  • The student was not made aware of exactly how that data would be used
  • The student collecting the data was not a University Official
  • The student collecting the data had no educational purpose that would warrant the school granting them access to it
Jeff tweeted to me that UNCG’s IT department approved it, and I am a bit surprised by that.  I suspect there is more context to this that I am not privy to, because UNCG’s University Counsel states the university’s policy pretty clearly:
A final thought on this particular concern, aside from whether or not permission is received: as professionals, I think the question of whether giving students such open access to student ID numbers is appropriate, ethically or professionally, is a very important one to ask.  I am not saying that my response to that question is “right,” but I am saying it’s the right question to ask.
Systematic and extensive tracking of attendance and involvement data is crucial to our field’s ability to show demonstrable impact.
The meta-analysis research by Blimling (1989, 1999) and Pascarella & Terenzini (2005) that is well known in our field was unable to show any consistent impact of student affairs efforts on student learning and success.
In other words, we have essentially been irrelevant.  It pains me to say that, but that is the conclusion of the research.  I know we can point to individual students and show tremendous and inspiring exceptions to that.  I can point to a number of individual students and show that… but here is the problem…
Out of the thousands of students I have interacted with over my fifteen-plus years, I can point to probably fewer than 100 for whom I can effectively and unequivocally demonstrate how I/we impacted them.  When you do the math, it’s no wonder the research has not been able to show that our efforts consistently have an impact.
But some efforts have been shown to have an impact.  The Wabash National Study is a good example, and as Salisbury and Goodman (2009) point out, its findings illuminate that it is not the activity students are engaged in, but rather the quality of the design of that activity and the quality of the interactions within it, that make experiences learningful.
Thus I see three critical needs if we are to start showing a consistent and positive impact:
  1. We need to focus as much on understanding Learning Theory as (and perhaps more than) on Developmental Theory.  Our efforts to design developmental experiences are often ineffectual because we lack critical knowledge from Cognition and Instructional Design about how to consistently effect meaningful learning.
    • As I embarked on my journey into Learning Theory and Educational Psychology, this was one of the most difficult admissions I had to make to myself.  We are not the experts on student learning that we are taught, or like, to believe we are.
    • We need to utilize this knowledge to reconsider our work in the context of a flexible, relevant, and longitudinal curriculum conducive to our settings and the affordances and constraints they offer.
  2. We need to dramatically increase our competence in assessment so that we know what data to collect, how and when to collect it, what inferences we can make from it, and how to act on it.
    • The UNCG tracking program collected the data, but I didn’t get a sense of how it was being used to illuminate the impact of student affairs efforts on student learning.  I got the impression it was hard for them to glean any evidence of learning from the data.
    • That is not surprising; attendance numbers, quantity of programs, and the like are descriptive statistics, and we cannot responsibly or effectively make inferences from them about learning or development.  Any conclusions of that kind rest on assumptions and leaps of logic with no supporting evidence.
  3. We need to systematically track student attendance at events, patronage of services, and so on.  Not because those numbers demonstrate anything beyond operational metrics, but because we need that data to feed better statistics!
    • We need to utilize regression models to determine which behaviors of attendance, involvement, etc. are most correlated with student success on our individual campuses (a rough sketch of what I mean follows this list).
    • Understanding which departments, which programs, etc. show stronger correlations essentially identifies “high impact” programs and services that we can focus our energy and resources on.  (I do not mean to suggest we base elimination decisions strictly on that data, in case anyone reads it that way.)
    • It may also be possible to identify common pathways students take within student affairs that contribute to student success or detract from it.  Path models of student success would be a very innovative application of that particular methodology, and it would require much more sophisticated attendance tracking than most of us do presently.  Jeff and UNCG could be the champions that finally make something like that happen, though!
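To make the regression idea in the third point concrete, here is a minimal sketch in Python (using pandas and statsmodels) of the kind of analysis I have in mind.  Everything in it is hypothetical: the column names, the simulated swipe counts, and the retention outcome are placeholders for illustration, not UNCG’s actual data; on a real campus the data frame would be built from the swipe records and the registrar’s retention flags.

    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    # Hypothetical, simulated data: each row is a student, each count column a
    # type of swipe-tracked involvement, and "retained" a success outcome.
    rng = np.random.default_rng(0)
    n = 500
    df = pd.DataFrame({
        "campus_activities_events": rng.poisson(4, n),
        "rec_center_visits":        rng.poisson(8, n),
        "leadership_workshops":     rng.poisson(1, n),
    })

    # Simulate a retention outcome loosely tied to involvement so the example
    # has a signal to recover; on a real campus this column comes from records.
    log_odds = -1.0 + 0.30 * df["campus_activities_events"] + 0.05 * df["rec_center_visits"]
    df["retained"] = rng.binomial(1, 1 / (1 + np.exp(-log_odds)))

    # Logistic regression: which involvement behaviors are most strongly
    # associated with retention, holding the others constant?
    X = sm.add_constant(df[["campus_activities_events",
                            "rec_center_visits",
                            "leadership_workshops"]])
    result = sm.Logit(df["retained"], X).fit(disp=0)
    print(result.summary())  # coefficients and p-values; correlation, not causation

A path model, as in the last bullet, would essentially chain several regressions of this kind together (for example, attendance predicting engagement, engagement predicting GPA, GPA predicting retention), which is part of why richer attendance tracking matters.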
Those are my thoughts.  Jeff, and others, I hope they are thought-provoking, if not helpful.
