Ashley M. Yopp, University of Georgia, ayopp@uga.edu
Billy R. McKim, Texas A&M University, brmckim@tamu.edu
Yvonna S. Lincoln, Texas A&M University, ysl@tamu.edu
Abstract
A lack of engagement has been reported to contribute to an ever-widening gap between how students develop knowledge, skills, and abilities and how teachers provide instruction. At the outset of this study, the purpose was to understand how depth and sequence of experience influenced student engagement, yet an emergent etic perspective surfaced. Data were collected from hundreds of hours of student interviews and observations, student and teacher reflexive journals, and classroom dialogue. Results of this study included a personal autoethnographic narrative describing the complex and unforeseen realities of (dis)engagement experienced by teachers and students. At the conclusion, it was evident the scope of the study needed to be expanded to describe not only how depth and sequence of experience engaged, and in some cases disengaged, students and teachers alike, but also the role meaningful connection plays in teaching in high stakes learning environments.
Introduction
Engaged students are more motivated to learn, but understanding how to engage students is a complex task (Coates, 2007). Teaching and learning are not mutually exclusive. A teacher's ability to engage students is met with countless extraneous variables and ever-changing policies that continuously disrupt their daily approach. Issues of student engagement become more difficult in high-stakes educational settings. According to the National Research Council (1999), the context and standards of high-stakes environments have unintended consequences that discourage teachers from improving instruction to engage students. Additionally, teachers exhibit more controlling behaviors and are less likely to use practices that support student engagement, including exploration and experimentation (Sheldon & Biddle, 1998; Bain, 2004).
The challenges of teaching today’s student require teachers to adapt to a new reality that is far from the classroom many educators experienced as students. Priority 4 of the American Association for Agricultural Education National Research Agenda (Edgar et al., 2016) included the need to understand “meaningful, engaged, learning opportunities is paramount to future learning environments,” signaling a “paradigm shift” in the way teachers prepare students for the 21st century (p. 38). However, perceptions that foster ideas of “edutainment” and quick fixes to student engagement only create misrepresentations of the problems teachers are facing in their classrooms (Sorathia & Servidio, 2012). Teaching, without a renewed perspective on learning, may create an ever-widening gap between how students develop knowledge, skills, and abilities and how teachers provide instruction. In adapting the learning environment to also “entertain” students, teachers are taking on additional responsibilities in the classroom; teachers’ past experiences and current social, emotional, and mental states can largely affect these additions. To understand engagement, holistic accounts of student and teacher experiences should be considered. Assessing both parties allows researchers to identify levels of engagement in relation to classroom culture and learning expectations.
The complex web of perspectives, approaches, and settings presents the need to understand student and teacher engagement at a basic level. Bain (2004) suggested “the best teaching cannot be found in particular practices… but in the attitudes of teachers, in their faith in students’ abilities to achieve, in their willingness to take students seriously, and let them assume control of their own learning” (pp. 78-79). In line with the learning process, teachers must consider their inevitable impact on student learning – experiences, both positive and negative, will impact student learning as well as the overall classroom environment. Although research has contributed to varied components of teaching and learning individually, a collective and reciprocal understanding could illustrate possible opportunities for teachers and students to engage in any learning environment – including all five disciplinary areas of our profession (agricultural communications, agricultural leadership, school-based agricultural education, extension and outreach education, and agricultural education in university and post-secondary settings).
What is Student Engagement?
Engaged learning practices used to develop students into in-depth learners, instead of passive receptors, have been essential components of educational theory for years (Johnson et al., 2001; National Research Council, 2009). Drawing on constructivism, engaged learning requires that students construct knowledge from their own experiences instead of accepting the experiences of an all-knowing teacher (Piaget, 1976). In higher education, Chickering and Gamson (1987) provided a set of principles to engage undergraduates in learning; principles include student-faculty interaction, student cooperation and reciprocity, and active learning.
Developing a specific definition of student engagement has become increasingly important as researchers and administrators work toward practices to improve student performance. Krause and Coates (2008) defined student engagement as “the extent to which students are engaging in activities that higher education research has shown to be linked with high-quality learning outcomes” (p. 493). Similarly, Hu and Kuh (2001) defined engagement as “the quality of effort students themselves devote to educationally purposeful activities that contribute directly to desired outcomes” (p. 3). Harper and Quaye (2009) argued engagement was a more complex matter that required more than an understanding of time and effort. In their view, involvement without feeling engaged was simply compliance; students must feel an emotional connection to make meaning of their experience.
What are the Benefits of Engaging Students and Teachers?
Dewey (1938) defined the most powerful learning experiences as those that engaged the human mind in meaning-making. Dewey believed the most educative learning experiences allowed learners to solve problems and build understandings through interaction with the world around them. Although students were the primary concern of most researchers in the literature, Magolda (2005) contended students are not the only ones to benefit from increased engagement in the learning process. The reciprocal environment constructed to engage learners fosters increased teacher engagement as well (Magolda, 2005). Although the literature is rarely focused on the benefits increased student engagement has on teachers (at any level), the benefits can be inferred. For example, increased faculty-student interaction resulted in greater job satisfaction (Bensimon & Dowd, 2009) and feelings of connectedness for faculty members (Kuh, 2009).
Although the benefits of incorporating student engagement practices are well documented in the literature, there is little to illustrate the consequences of disengagement beyond mere observation of what teachers may perceive as disengaged behaviors. Further, rarely have both student and teacher data been viewed simultaneously to understand the reciprocal nature of (dis)engagement in teaching and learning.
Purpose
This study is a snapshot of a larger study describing and comparing how and when experience engages students in learning. Although an unanticipated outcome, the phenomenon of teacher disengagement lends insight into the challenges of teaching post-secondary courses in agriculture. Therefore, the purpose of this study was to illustrate, or story, the phenomenon of teacher disengagement as an emergent etic perspective and consequence of implementing deep, prolonged instructional experiences in a post-secondary environment.
Research question: How can implementing deep, prolonged instructional experiences in a post-secondary environment affect student and teacher (dis)engagement?
Theoretical and Conceptual Framework
Shame Resilience Theory (SRT; Brown, 2006) provided structure to understand how individuals experience shame in high stakes environments. Brown’s theory, including the shame web, depicted how human interaction, specifically among women in her research, can best be explained by understanding the variables associated with shame and the relationship between experiences with shame and performance standards. Initially, Social Cognitive Theory (SCT; Bandura, 1986) provided bounds for data collection; SCT allowed data to be categorized by the interaction between people, environment, and behavior. By viewing personal characteristics as reciprocally altered by behaviors and environments, researchers can view people as both creators and products of their experiences and understand the way individual thoughts and feelings affect the different ways people approach the world (Bandura, 1986). In providing detailed accounts of individuals’ personal experiences, researchers can view academic experiences with a more fluid set of expectations, including experiences with shame.
Originally, data collected during student and teacher reflections were framed using SCT (Bandura, 1986) in an attempt to categorize deep, prolonged instructional experiences in a post-secondary environment using personal and environmental determinants as stable concepts or variables. After analyzing the data, Shame Resilience Theory (Brown, 2006) proved to frame the data in a more descriptive and honest way, including additional variables for behaviors recorded in the original study. Admittedly, data from the original study proved more colorful when viewed in the context of SRT. The change from SCT (Bandura, 1986) to SRT (Brown, 2006) allowed for data to be analyzed in full context of the experience with greater understanding of the connection between variables associated with shame and performance expectations within a high stakes learning environment.
Method
This autoethnography was part of a larger study that spanned one calendar year—three academic semesters (spring, summer, and fall). Although the findings were focused on phenomena of teacher disengagement, the context of the course, activities, and students enrolled contributed to the findings. The larger inquiry included four cohorts with varying levels of deep, prolonged experience. Forty-two students (six male, 36 female), between 18 and 25 years of age, agreed to participate after enrolling in one of four sections of an undergraduate social science research methods course. Students represented four majors: agricultural leadership development, agricultural science, agricultural communications and journalism, and animal science.
When this study was conducted, I was a graduate student, and I co-taught the research course with my dissertation advisor and committee chair. Our students were involved as both participants and researchers. Specific learning objectives were aimed at developing students’ abilities to access information, think critically, and present and support reasoned arguments. However, students studied engagement by evaluating theories and collecting data from other populations, while also being introspective about their own engagement in the course.
Design
Although the larger study was an abductive, longitudinal quasi-experiment, the emergent etic perspective storied here was autoethnographic in nature. Autoethnography is a method of rigorous self-reflection and reflexivity that relies on the personal experiences of the researcher to describe and evaluate beliefs, practices, and experiences (Ellis & Bochner, 2006; Adams & Manning, 2015). By nature, “autoethnography is messy, uncertain, and emotional” (Adams et al., 2014, p. 19). The ability to use a research method to both accommodate for and acknowledge the difficult realities of social life helped make meaning of my experiences struggling within a larger hyper-structured research design.
Sources of Data
Data were extracted from more than 200 hours of interviews with four cohorts; six hours of class per week; additional research meetings, conversations, and informal interactions of unrecorded duration; and six weeks of immersive field experience. Additionally, quantitative data were collected from four commercially available instruments and used as artifacts to further increase the credibility of findings through data triangulation (Lincoln & Guba, 1985). Although the findings presented in this study only include data from observations, reflexive journals, student-teacher dialogue, and countless hours of rigorous introspection, the influence other sources of data may have had on my interpretation cannot be untangled.
The Human Instrument
Lincoln and Guba (1985) provided characteristics that “qualify the human being as the instrument of choice for naturalistic inquiry” (p. 193). Unlike most quantitative instruments, human beings are adaptable and “like a smart bomb, the human instrument can locate and strike a target without having been preprogrammed to do so” (Lincoln & Guba, 1985, pp. 193-194). As the primary instrument of data collection, I viewed this process from a nonlinear perspective, but had the flexibility to use quantitative artifacts as sources of data. Data, regardless of method or source, were used to mold, adapt, and continuously calibrate the human instrument.
Observations, Journals, & Dialogue
Observations were made before, during, and after each class and research meeting and during the entire field experience. As an active participant in the experience, I was able to capture interaction, be inductive, and observe behaviors beyond what students would divulge during an interview (Patton, 2015). Observations brought my own perceptions to light as well as the perceptions of students as recorded in their reflexive journals.
In addition to my own journal, the reciprocal nature of the larger study required my students to keep reflexive journals to reflect critically on the “human as instrument” (Guba & Lincoln, 1981). Journals served as a reservoir for thoughts, feelings, observations, and field notes. Together, we chronicled the learning process while calibrating our instruments through self-discovery and interrogation (Lincoln et al., 2011). Journals provided insight to the distinctive voices each of us brought to the classroom and led to a greater understanding of the multiple perspectives that framed the learning process (Alcoff & Potter, 2013).
Students engaged in a constant exchange of thoughts and ideas that served as both a source of data and method of learning. Specific attention was given to Socratic dialogue to help unlock implicit ways of thinking and insights not previously explored by the group (Given, 2008). Many times, our Socratic sessions would occur spontaneously outside of the bounds of class meetings and usually near a white board. Our concepts, models, and brainstorms were captured in photos to visually recall and interpret the experience along the way.
Trustworthiness of Findings
Lincoln and Guba (1985) outlined techniques for establishing trustworthiness to ensure findings are reached in a systematic and disciplined manner. Trustworthiness techniques mirror evaluation criteria found in quantitative research and provide increased “inspectability” of data and findings. I used multiple techniques to enhance trustworthiness of findings including prolonged engagement, persistent observation, triangulation, audit trail, peer-debriefing, member-checks, reflexivity, and thick, rich description. Extensive records (reflexive journals, sketchbooks, pictures of conceptual designs and models, and process and personal memos) were kept for confirmability and constant comparison of significant statements, codes, and emergent themes. A coding structure was used to ensure a detailed audit trail and is as follows:
Student data (e.g., 014_BR2_079):
- 1) Student participant code (001–042); 2) source of data (BR = Black and Red journal, SB = Sketchbook); 3) page number (001–175)
Teacher data (e.g., FN_BR3_104):
- 1) Research activity (OBSV = Observation, FN = Fieldnote, RF = Reflection); 2) source of data (BR = Black and Red journal, SB = Sketchbook); 3) page number (001–175)
Data Coding, Analysis, and Presentation
The task of understanding ethnographic data lies in the ability to condense large amounts of data from multiple sources (Merriam, 1998). I originally approached coding in a very inductive manner, using in vivo coding, descriptive codes, and deductive codes based on the framework of SCT (Miles et al., 2014). However, the use of SCT proved to hinder the analytic process when considering the reflexive nature of my own data. I continued with the coding process despite my frustrations; I inductively analyzed and coded data, developed additional codes to describe unexpected elements that emerged, and placed each into a matrix where they were continuously sorted into primary, secondary, and tertiary themes. It was not until much later that I discovered the process of analyzing data was not an exact science. I then turned to analytic memos, in which I recorded additional elements of how the coding process took shape (Saldaña, 2016). The resulting findings were presented with student data alongside my own using verisimilitude, a literary strategy that captures the researcher’s thinking processes and attempts to realistically convey the intricacies of the experience with thick, rich description, thereby enabling readers to reconstruct the experience for themselves (Creswell, 2009; Lincoln & Guba, 1985).
With consideration for my relationship to the data and the difficulty experienced during analysis, SRT (Brown, 2006) was introduced as an alternative to SCT, and data were analyzed for evidence of emotive response, specifically shame. Data related to shame, in the context of high stakes learning environments, must be considered when studying unintended consequences of student-teacher (dis)engagement. In Shame Resilience Theory: A Grounded Theory Study on Women and Shame, Brown (2006) identified five main concerns of shame: what are the participants describing, what do they care about, what are they worried about, what are the participants trying to do, and what explains their different behaviors, thoughts, and actions (p. 44).
Findings
Words are Hard
Native Language
I began this process in search of a way to make learning research more engaging. After considering various methods of classroom engagement, my teaching partner and I decided to forgo traditional teaching methods by avoiding the use of research terminology in class. Instead, we used common language so students might discover terms on their own and attach those words to experiences as they came about. For me, it was pretty easy to adhere to our native language because, as a graduate student, research terminology was new to my everyday vocabulary. However, my partner had been using research jargon for eight years and the transition was difficult. Words are Hard quickly became a classroom hashtag and constant reminder to communicate in a way our students could understand.
Research as a Second Language
The hashtag, #wordsarehard, became a fun “game” for our students. Our open and transparent process left very few things unsaid in our classroom, and students quickly caught on to the struggle we were experiencing with words. For students, myself included, research was a second language and “unlocking” new words was exciting… at first. For example, after observing other [University] students at various locations on campus, our students began to describe the various behaviors, environments, and personal characteristics they had observed. As one student wrote in his journal, “[Teacher] gets so excited when we figure things out. I need to Google Social Cognitive Theory” (07_BR1_014). My journal entry echoed their observation that day. FN_BR1_029: It’s working! It’s really working! #wordsarehard #proudteacher. I was motivated to provide them with experiences and attach terminology after they understood meaning. It seemed crazy, but research was becoming our second language and after years of learning terms just to pass a test, we were interested in how they became a permanent part of our vocabulary.
Language Acquisition through Experience
As time passed, words including “sample,” “instrument,” and “analysis” started to creep into our classroom discussions. Instead of discussing what might occur during an observation, interview, or face-to-face survey, students experienced issues first-hand and shared their successes and failures with our class. The chance to rifle through their experiences made it easier to share new terminology as we evaluated the process of understanding people. Although students seemed to be refreshed (or maybe just relieved) by the lack of terminology, a few also expressed a bit of confusion and annoyance with the process. One student was hesitant to speak up in class, but wrote, “How is observing some people at the [student center] relevant to any kind of actual research?” (013_BR1_018). Another student wrote, “Just give [the terms] to me. I know how to do research! I’m tired of waiting around for you to give me information” (06_BR1_027). I wanted to understand their point of view but was irritated with their impatience. After returning to interview and preliminary data, I saw the shared connection. Both students were double majors in animal science and predisposed to research in the basic sciences. In a way, they were ahead of the rest of the class (and always would be), but reflections provided more insight as each progressed. One wrote, “Observations seemed like useless collections of information. I now see it was the beginning of understanding a larger process” (013_BR1_018).
Native Language Attrition
Much to my surprise, as students gained efficacy with research terminology, I did too. Soon, my normal contributions to office banter were replaced with “what’s your unit of analysis,” and “what if we used a different conceptual framework?” I noted this transition after reflecting on time back home with friends. FN_BR3_062: When will I realize that not every lunch requires #researchtalk? I’m blabbering. THEY DO NOT CARE. Obnoxious! I found it only got worse as time went on. Research permeated my every interaction from my first cup of coffee in the morning to the text messages I sent before bed. Phone calls with my mom became more difficult and I could no longer explain to her what I had been up to. My “research buds” shared Piled Higher and Deeper (Ph.D.) comics on Facebook poking fun at the phenomenon, but I had a hard time finding humor in our shared experience. FN_BR3_079: So much for being a great communicator! Might as well live under a rock. Because I had surrounded myself with peers in the same situation, the issue didn’t really become a problem until a new crop of students began the second phase of this study. Everything I prided myself on was slipping away.
FN_SB2_012: Why can’t I connect with them? I’m a teacher, damnit! Or am I? 🙁
Lost in Translation
In an almost total inability to remember what it was like to struggle with the research process, I found it more difficult to engage the final cohort of students like the first. FN_BR3_104: There’s a gap between cohorts that I don’t really understand quite yet. They are struggling. How do I make this better? I’m at a loss here. It seemed my newfound connection to research terminology and the process of doing research made it difficult for me to connect student learning to new experiences and new experiences to student learning. The first cohort seemed to embrace new terms because they were anxious to finally get them. They anticipated them. They wanted them. The second cohort, however, didn’t seem to make connections in the same way. In some cases, the words seemed to pass by the experiences as if students were simply going through the motions. More times than I would like to admit, students wrote things like, “is she even talking to me?” or “I’m over trying to understand this class.” It hurt, but they were right. I was speaking a foreign language and oblivious that my connection was lost somewhere in translation. In feeling the loss of my teaching and communication skills, I was forced to reflect on differences emerging in the data, specifically my own – I spiraled into a web of shame (Brown, 2006).
Gut Punch: Cognitive Dissonance & Reciprocal Engagement
“How can you expect me (student) to be engaged when you (teacher) aren’t?” (16_BR1_064)
FN_BR2_084: Stop the bus. What did she just say? Are you kidding me?!
Owww
I have no recollection of what I said in response to [student] that day, but I was completely taken aback by her comment. We had intentionally built an environment where students could feel comfortable saying things like this, but I doubt my response was indicative of that effort. I was angry. FN_BR2_084: I’m giving everything I’ve got over here. Who do they think they are? I spent the next few hours sitting at my desk ruminating on the remark. I started to wonder if we had given students too much power and freedom in the classroom. FN_BR2_085: This is why structure is important. She would never say that to [faculty member]. My rant continued on the next two pages and finally subsided with a final thought.
FN_BR2_087: Oh, wait. I told her to do that.
Cognitive Dissonance
The original remark about my perceived level of engagement resonated in eleven other student journals (all but two students present) that day. Students began to question my general level of interest and motivation in the course. It was pivotal. I spent weeks (and months, really) thinking about how many times I teach students to do one thing, while modeling a completely different behavior. I also considered the times I observed this type of behavior from my own teachers and mentors. This insight became a magnifying glass, of sorts, and I began examining almost all of my interactions. Could something as simple as “walking the walk and talking the talk” be paramount to this study? FN_BR3_012: “Do as I say, not as I do.” Looks like Dad’s old mantra is coming back to haunt me.
Although my reflection may seem trivial, to me it was revelatory. This study was originally designed to understand students and the experiences that engage them in learning, but all the while, I may have been looking in the wrong direction. I literally told them (on the first day of class) I wanted to find a new way. I told them I believed engagement to be a two-way process and I wanted their open and honest feedback. Yet, there I was ignoring my own levels of engagement. Even more, I was racked with fear that others (faculty, mentors, etc.) might discover my less than stellar performance and quietly ask me to pack my things. In retrospect, that was a silly thought, but the stakes seemed so high at the time and I was far from hitting the mark. She [student 16_BR1_064] provided the one piece of information that changed the way I considered this study, twelve little words that haunted my brain for months. It broke my spirit, but enlightened my path.
Autopilot: The Harsh Reality of (Dis)engagement
“You are different, beautifully so, and people will benefit from your perspective.
Your words mean something. This experience is teaching you far more than what can be observed – it’s teaching you to believe in you.” (06_SB1_003)
The excerpt (06_SB1_003), above, was written on a postcard and taped face down into the pages of a student’s sketchbook. I had thumbed through the pages several times, never giving them much thought, but once the tape started to give way, this postcard flipped over. It was one of ten she planned to send as little reminders to herself when she arrived back home. Lucky for me, she forgot to send them, and that afternoon, I sat by myself, read through each one, and bawled my eyes out.
FN_BR2_114: I’m exhausted.
When teachers say, “I’m exhausted”, I don’t really believe that’s what they mean. I’m sure they are tired and may think they are exhausted, but what I really hear them saying is, “I’m not excited about what I’m doing right now.” When teachers are engaged, they ignore being tired; they’re in the zone and running on fumes of passion.
FN_BR2_115: I’m really exhausted.
I recognize the blatant contradiction here, but that doesn’t change the reality of its occurrence. Comments like the one above peppered my field notes during the last six months of this study. I was ashamed to write down thoughts like, “What am I doing?” or “I don’t want to be here,” so I didn’t, but they occurred nearly three times as often. There, I said it. I was on autopilot.
The shame of thinking these things, let alone including them in this study, was paralyzing. The idea of being “called out” for a less than perfect study because I was a less than perfect teacher was more than my pride (and future career) could take. I felt like a big ole’ phony. Surely, I wasn’t the only one to ever feel this way, right? Right? Do you think anyone else knows?
When students say, “I’m exhausted”, I don’t really believe that’s what they mean. I’m sure they are tired and may think they are exhausted, but what I really hear them saying is, “I’m not excited about what I’m doing right now.” When students are engaged, they ignore being tired; they’re in the zone and running on fumes of passion.
“I’m exhausted” (38_BR1_071).
Huh? It was like some form of black magic. My students couldn’t possibly be experiencing the same thing. We’re different. They don’t know what I know. The rare occurrence of this finding in the literature made the connection between my data and my students’ data even more difficult to accept. I needed some reassurance. FN_BR3_099: HELP!! I give up. This is impossible.
Cold Hard Truth
This study took me down a long, circuitous path. Communicating the findings (on paper) has been a monumental task, but I have told this story (to anyone who would listen) every day since it began. I wrestled with my own experiences— both teaching and learning—at every turn. I questioned and resisted what I considered to be “conformity;” I’ve been angry, frustrated, and disenchanted; and I developed a pretty large chip on my shoulder, too. FN_BR3_047: How can I communicate this experience? How do I adequately portray my own disengagement? How do I describe how much I’ve changed? I don’t even feel like a teacher anymore. I’ll never get a job after this.
To this point, the “pieces” or themes were like vignettes that lined the walls of my heart and mind for months, but they remained static without understanding the experience more holistically. The fact is, “words are hard”—hard to articulate, difficult to write, painful to digest, and often lost without the ones around them. The larger study began with specific research questions concerning the influences of experience on student engagement; however, “the path of discovery is not clearly marked, nor should it be” (Thorp, 2001, p. 37). I could have easily described student engagement throughout the entire study, outlined findings of the hyper-focused quasi-experimental design I set out to follow, and provided more direction for others to build on in the future; however, that would have alienated the most glaring pieces of data—my own. Identifying and addressing my experience within a high stakes learning environment provided insight into the concept of shame resilience. In addressing my own professional and personal experience with shame, I moved forward while highlighting what seems to be an emerging reality in high stakes academic environments: honest data reporting. In sharing these data, my feelings of powerlessness and isolation decreased when I invited others in. After letting go of the many ways this piece might be perceived and how that perception might affect my future career, I created more experiences with empathy, connection, power, and freedom than I could have ever expected (Brown, 2006).
Discussion
As is the case with most naturalistic inquiry, the purpose of this study was not to infer to a larger population. Rather, the intent was to understand an unanticipated and, arguably, unfortunate phenomenon: teacher disengagement. Not only is the literature describing this phenomenon vague, it may be nonexistent. While teacher engagement is critical for the learning process, student expectations seem to be the immediate point of discussion. It is important to mention the relationship between expectations and engagement: if expectations for learning or instruction are not met, engagement will suffer for both teacher and student (Mojkowski & Washor, 2014).
Experience was noted in no fewer than five of the seven research priority areas of the National Research Agenda (Roberts et al., 2016). Further, the history of, need for, and value of integrating experience into agricultural education environments was thoroughly noted by Baker, Robinson, and Kolb (2012). Despite the many researchers who have recommended integrating experiences into the educational environment, few have noted the potential unintended consequences of implementing deep, prolonged instructional experiences in a post-secondary environment. The occurrence of these consequences is not likely a new phenomenon. Yet, the implications of presenting ugly data, or of the unintended consequences of a study, are not widely present in the agricultural education literature. Therefore, several elements should be investigated and considered by future studies:
Issues with Unrealizable Objectivity
Although it may seem as if I abandoned the design of this study somewhere along the way, that is not entirely the case. The design was like a too-tight sweater, uncomfortable but difficult to throw away. The truth is, I became so focused on the design that I had a difficult time connecting with the most important and significant part of my study: my students. It was important for me to tell that story, to illustrate the ways in which this study changed because I changed, and to allow readers to come to conclusions on their own. Ignoring the growing pains would have omitted the difficult truths of an unrealizable objectivity—something I’m afraid is all too common in our research, but rarely explored. Reflecting on the power of vulnerability within the context of shame helped untangle this phenomenon; speaking shame is a pivotal opportunity for increased personal understanding and the development of personal and professional strategies for resilience (Brown, 2006). My attachment to design, and to the research process for that matter, made it difficult for me to engage in the very environment I created. My quest to understand the complex nature of people and social interaction was beset by my transition from teacher to researcher. I was no longer the responsive and adaptable educator, but instead a rigid and design-focused researcher. I experienced the shame of foregoing a past pattern of thought by adopting the accepted norms of my high-stakes environment, both losing and acquiring skillsets along the way.
Meaningful Connection in High Stakes Learning Environments
The process of learning new information is only engaging for so long without a personal connection between teachers and students. Harper and Quaye (2009) argued that student engagement requires more than an understanding of the teacher’s time and effort. The findings of this study provide evidence to support the influence of time and effort, but also raise questions of where that time and effort should be placed for effective learning. In this case, I placed the most time and effort on the process of conducting research instead of on the people involved. I lost the connection with students when I stopped being responsive to their needs. There was no meaningful emotional connection to help students connect to their learning experience, thus altering the overall learning environment. My commitment to people, to teaching people, was ignored in my attempt to produce high-quality research (Brown, 2006).
Might the high-stakes environments discussed by the National Research Council (1999) be to blame for the unintended consequences of disengagement by both students and teachers? Could the pressures of producing high-quality research discourage teacher-researchers from improving instruction to engage students? Or is it simply the nature of research to become detached when adhering to focused and structured designs? Future research should consider the environmental and internal factors associated with faculty and graduate student expectations as they relate to student and teacher (dis)engagement in higher education; specifically, experiences with shame in academia and willingness to report honest data. Social scientists should consider the moral and ethical implications of doing so, especially when expressing the realities of praxis in teaching and learning.
Connection Between Student & Teacher Experience
Often, research considers the issues of student engagement independently from teacher engagement, providing a host of strategies to foster a better learning environment. However, rarely are these variables considered side by side in a more holistic way. In doing so, it may be easier to notice that the behaviors exhibited by students are not all that different from those of their teachers. Future research may benefit from observing engagement as a more universal phenomenon affecting teacher and student behavior similarly. Mojkowski and Washor (2014) contend that student disengagement is a deeper issue than previously believed: student expectations are driving disengagement concerns. Students are struggling to fit into restrictive academic environments; therefore, a shift is necessary toward sustained engagement practices: relationships, challenge, play, relevance, authenticity, practice, choice, application, time, and timing (p. 9).
The Problem with Theories
Although SCT was not the guiding force of this study, it served as a point of reference when considering factors of teaching and learning. At a granular level, factors suggested to change engagement were easy to understand, but the bigger challenge required that I consider the way each and every interaction changed the next. It was a sequence of interactions, changes, and behaviors too large for me to see alone, and the belief that a simple formula might uncover one solution was short-sighted. The simplification was intended to guide understanding, but a more complex analysis was needed. The literature identified SRT (Brown, 2006) as a means to narrow gaps in understanding and widen perspectives. SCT provided firm bounds for the study to begin, while SRT provided a fluid construct within which the data could exist.
The static and predictable nature of theories may lead people (especially young researchers) to believe the findings of this study (or any study, for that matter) are merely formulaic: that the same person, doing the same thing, in the same way, would arrive at the same answer every time. However, formulas are rigid and conventional and, being mathematical, function as a way to solve problems—human or otherwise. The sheer number of variables needed to consider the dynamic interaction between students and teachers during the process of learning is overwhelming, but should be considered nonetheless. Might some teachers be restricted by conceptualizing teaching and learning as a formulaic process?
I contend that student and teacher engagement, and thus (dis)engagement, is more like a complex algorithm that adapts and changes. Although key “formulas” may make up an engagement algorithm, those formulas, the way they are arranged, and the many ways in which they change are more complex than what I could understand during the course of this study. Understanding the way this study emerged, and the gravitational-type pull the environment exerted on our findings, would be too complex a task without the consideration of a larger, adaptable algorithm. Future research should consider student and teacher (dis)engagement as an algorithm that stretches and changes in new, more dynamic ways. One potential method of inquiry is an algorithm based in SRT. Establishing a conceptual algorithm based on SRT may provide context for evaluating teachers in a complete emotional, social, and mental context. Understanding engagement with students and teachers demands an adaptable framework for teaching and learning.
New Methods in Agricultural Education
Ok. Hear me out. Many of the struggles of this study, and of my ability to adequately describe my experience, may lie in our profession’s level of discomfort with less commonly used methods. As a graduate student, I worked alongside my mentors to develop a quasi-experimental study to “increase the rigor” associated with Agricultural Education research in the social sciences. All the while, this design held me back from truly understanding the phenomena at play. I struggled. I sought out additional qualitative methods, but rarely saw those methods in the pages of our journals. It seemed (to me) that I must adhere to a more structured design if I wanted to succeed. Might young researchers be hindered by our collective distaste for new methods? How can we mentor young researchers in rigorous methods spanning paradigms? What does the future of Agricultural Education (broadly defined) look like when young and old researchers alike struggle with making sense of a too-tight and seemingly sterile science in a socially constructed discipline? Could the introduction of theories similar to the open-ended study of shame resilience provide greater insight into teacher-student experiences? How do we help? Help.
References
Adams, T. E., Jones, S., & Ellis, C. (2014). Autoethnography: Understanding qualitative research. Oxford University Press.
Adams, T. E., & Manning, J. (2015). Autoethnography and family research. Journal of Family Theory & Review, 7(4), 350–366. https://doi.org/10.1111/jftr.12116
Alcoff, L., & Potter, E. (Eds.). (2013). Feminist epistemologies. Routledge. https://doi.org/10.4324/9780203760093
Bain, K. (2004). What the best college teachers do. Harvard University Press.
Baker, M. A., Robinson, J. S., & Kolb, D. A. (2012). Aligning Kolb’s experiential learning theory with a comprehensive agricultural education model. Journal of Agricultural Education, 53(4), 1–16. https://doi.org/10.5032/jae.2012.04001
Bandura, A. (1986). Social foundations of thought and action: A social cognitive theory. Prentice-Hall.
Bensimon, E. M., & Dowd, A. (2009). Dimensions of the transfer choice gap: Experiences of Latina and Latino students who navigated transfer pathways. Harvard Educational Review, 79(4), 632–659. https://doi.org/10.17763/haer.79.4.05w66u23662k1444
Brown, B. (2006). Shame resilience theory: A grounded theory study on women and shame. Families in Society: The Journal of Contemporary Social Services, 87(1), 43–52. https://journals.sagepub.com/doi/pdf/10.1606/1044-3894.3483
Chickering, A. W., & Gamson, Z. F. (1987). Seven principles for good practice in undergraduate education. AAHE Bulletin, 39(7), 3–7. https://files.eric.ed.gov/fulltext/ED282491.pdf
Coates, H. (2007). A model of online and general campus-based student engagement. Assessment and Evaluation in Higher Education, 32(2), 121–141. https://doi.org/10.1080/02602930600801878
Creswell, J. W. (2009). Research design: Qualitative, quantitative, and mixed methods approaches. Sage. http://fe.unj.ac.id/wp-content/uploads/2019/08/Research-Design_Qualitative-Quantitative-and-Mixed-Methods-Approaches.pdf
Dewey, J. (1938). Experience and education. Macmillan.
Edgar, D. W., Retallick, M. S., & Jones, D. (2016). Research priority 4: Meaningful, engaged learning in all environments. In T. G. Roberts, A. Harder, & M. T. Brashears (Eds.), American Association for Agricultural Education national research agenda: 2016-2020 (pp. 37–40). Department of Agricultural Education and Communication. http://aaaeonline.org/resources/Documents/AAAE_National_Research_Agenda_2016-2020.pdf
Ellis, C. S., & Bochner, A. P. (2006). Analyzing analytic autoethnography: An autopsy. Journal of Contemporary Ethnography, 35(4), 429–449. https://doi.org/10.1177/0891241606286979
Given, L. M. (Ed.). (2008). The Sage encyclopedia of qualitative research methods. Sage Publications.
Guba, E. G., & Lincoln, Y. S. (1981). Effective evaluation: Improving the usefulness of evaluation results through responsive and naturalistic approaches. Jossey-Bass.
Harper, S. R., & Quaye, S. J. (2009). Beyond sameness, with engagement and outcomes for all: An introduction. In S. R. Harper & S. J. Quaye (Eds.), Student engagement in higher education (pp. 1–15). Routledge.
Hu, S., & Kuh, G. D. (2001, April 10-14). Being (dis)engaged in educationally purposeful activities: The influences of student and institutional characteristics [Paper presentation]. American Educational Research Association Annual Conference, Seattle, WA, United States. https://files.eric.ed.gov/fulltext/ED452776.pdf
Johnson, D. W., Johnson, R. T., & Smith, K. A. (1991). Cooperative learning: Increasing college faculty instruction productivity. ASHE-ERIC Higher Education Report No. 4. The George Washington University, School of Education and Human Development. https://files.eric.ed.gov/fulltext/ED343465.pdf
Krause, K., & Coates, H. (2008). Students’ engagement in first-year university. Assessment and Evaluation in Higher Education, 33(5), 493–505. https://doi.org/10.1080/02602930701698892
Kuh, G. D. (2009). The national survey of student engagement: Conceptual and empirical foundations. New Directions for Institutional Research, No. 141, 5–20. https://doi.org/10.1002/ir.283
Lincoln, Y. S., & Guba, E. G. (1985). Naturalistic inquiry. Sage.
Lincoln, Y. S., Lynham, S. A., & Guba, E. G. (2011). Paradigmatic controversies, contradictions, and emerging confluences, revisited. In N. K. Denzin & Y. S. Lincoln (Eds.), The Sage handbook of qualitative research (4th ed., pp. 97–128). SAGE.
Magolda, P. M. (2005). Proceed with caution: Uncommon wisdom about academic and student affairs partnerships. About Campus, 9(6), 16–21. https://doi.org/10.1002/abc.113
Merriam, S. B. (1998). Qualitative research and case study applications in education. Jossey-Bass.
Miles, M. B., Huberman, A. M., & Saldaña, J. (2014). Qualitative data analysis: A methods Sourcebook (3rd ed.). Sage.
Mojkowski, C., & Washor, E. (2014). Student disengagement: It’s deeper than you think. The Phi Delta Kappan, 95(8), 8–10. https://kappanonline.org/student-disengagement-dropout-washor-mojkowski/
National Research Council. (1999). High stakes: Testing for tracking, promotion, and graduation. The National Academies Press. https://doi.org/10.17226/6336
National Research Council. (2009). Transforming agricultural education for a changing world. The National Academies Press. https://doi.org/10.17226/12602
Patton, M. Q. (2015). Qualitative research and evaluation methods (4th ed.). SAGE.
Piaget, J. (1976). Piaget’s theory. In B. Inhelder, H. H. Chipman, & C. Zwingmann (Eds.), Piaget and his school (pp. 11-23). Springer-Verlag. https://doi.org/10.1007/978-3-642-46323-5_2
Roberts, T. G., Harder, A., & Brashears, M. T. (Eds). (2016). American Association for Agricultural Education national research agenda: 2016-2020. Department of Agricultural Education and Communication. http://aaaeonline.org/resources/Documents/AAAE_National_Research_Agenda_2016-2020.pdf
Saldaña, J. (2016). The coding manual for qualitative researchers (3rd ed.). SAGE.
Sheldon, K.M. & Biddle, B.J. (1998). Standards, accountability, and school reform: Perils and pitfalls. Teachers College Record, 100(1), 164–180. https://www.tcrecord.org/Content.asp?ContentId=10304
Sorathia, K., & Servidio, R. (2012). Learning and experience: Teaching tangible interaction & edutainment. Procedia-Social and Behavioral Sciences, 64, 265–274. https://doi.org/10.1016/j.sbspro.2012.11.031
Thorp, L. G. (2001). The pull of the earth: An ethnographic study of an elementary school garden (Doctoral dissertation). Texas A&M University, College Station, Texas. https://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.1075.2874&rep=rep1&type=pdf