Teaching and facilitating are not the same
Online teachers are teachers online. Online teachers are not facilitators, and facilitators are not teachers. Teachers have the job of teaching content, incorporating differentiation and sound pedagogical strategies, and customizing their approaches to meet learner needs. A facilitator merely facilitates learning; they can be seen as more of a formality than a necessity, and if you take away the facilitator, the same learning could still be achieved. While learning can happen through a student-facilitator relationship, a proper online classroom is not composed of students and a facilitator, but of students and a teacher.

Teachers are professional experts in the art of teaching, not just subject matter experts. The "learnification" phenomenon paints teachers as subject matter experts and students as autonomous learners, almost independent of one another. The threat of online learning is placing "teacher subject expertise and critical professional judgment in the background of educational practice" (Bayne et al., 2020). This approach to 'teaching' places the subject matter expert at the center of the classroom, rather than focusing on "the growing expertise of the students" (Morris & Stommel, 2018) and customizing the learning experience for their continued growth.

Facilitators know content and can present it clearly. Facilitation of content is seen in YouTube videos, massive online courses, and podcasts. Teachers, though, are master pedagogues, and "their knowledge of the discipline and their knowledge of pedagogy interact" (National Academies of Sciences, Engineering, and Medicine, 2000) to produce a true learning environment. Due to their structure, online courses can easily face the threat of facilitation, and what could have been a true online classroom can quickly turn into an online dump of knowledge.
Baumgartner’s Active Learning While Physically Distancing chart presents how various communication strategies can be adapted across learning modalities. The chart is not only a wonderful resource for teachers who need to alter or enhance their learning communities, but it also amplifies the ways in which online learning compares to face-to-face learning. It is important to note that switching from a face-to-face to an online community does not make active learning strategies impossible; they can still be implemented with as much importance as in face-to-face learning. This chart is one of many resources available for teachers looking to ensure their online courses are taught, and not just facilitated.

Quality online education is not “one size fits all”
There is a lack of individualization in many online courses. They are often created for the general public, without consideration of unique learners. As with any course designed for a high volume of learners, there is minimal room for individualization. Massified online courses present themselves much like massified in-person courses. While the 500-person lecture can be a convenient way for universities to deliver content to a large number of students in a timely and cost-effective manner, the education these learners receive is not comparable to what they would receive in a smaller 20-30 person classroom. The teachers in these large courses are reduced to facilitators, and students can quickly become just a number.

This Pacansky-Brock infographic touches on the four principles for humanizing online instruction. The third key principle is awareness. Pacansky-Brock explains that “awareness is achieved by learning about who your students are and how you can support them” (Pacansky-Brock, 2020), and in order to learn about your students, there must be some aspect of individualization and/or individualized learning.
To even come close to individualized learning, courses must incorporate smaller recitations, which can be difficult to facilitate in mass online education. In general, “massified higher education cannot, in many instances, be experienced as intimate” (Bayne et al., 2020). Courses developed for the general public can be a wonderful supplement for learners seeking a convenient learning experience, but without the ability to customize the course to the learner’s needs, these online courses fall short for many students. While there may be structured opportunities for engagement with peers and/or the instructor, the overall outline of mass online courses has little room for the flexibility in pedagogy and engagement opportunities that many learners need in order to thrive. So the real issue here isn’t with online education as a whole, but with massified learning. The unfortunate reality is that many online courses are created for public use, and in turn lack the quality of in-person individualized learning. Online courses provided in a smaller environment, mimicking the structure of small in-person courses, can still provide the one-on-one and specialized instruction that many students seek.

Educational technology raises and resolves constraints of education simultaneously
In recent years, the rise of online education has proven to be a tremendous solution to many issues in K-12 and higher education. During the Covid-19 pandemic, the world started to embrace online learning and educational technology as a solution to many of its problems. Students were able to learn from the safety of their homes, as teachers discovered ways to transition their content from in-person to online. While the implementation of online learning and a broader educational technology toolkit resolved the immediate problem at hand, teachers uncovered new problems that were not present before these technologies were implemented.
Teachers found that technology “has great potential to enhance student achievement and teacher learning, but only if it is used appropriately” (National Academies of Sciences, Engineering, and Medicine, 2000). In other words, educational technology comes with its own constraints and isn’t a magical fix for every problem faced in the classroom. Lo and Hew’s (2017) research on flipped classrooms emphasized some of these constraints. As a solution, they proposed 10 guidelines to implement alongside a flipped classroom in order to mitigate these challenges. The guidelines covered student-related, faculty-related, and operational challenges. As this study found, there are truly unique affordances of educational technology and online learning, but these tools must be looked at with a critical eye and adjusted appropriately. “Trying to make an online class function exactly like an on-ground class is a missed opportunity” (Morris & Stommel, 2018). Knowing where your technology falls flat, not ignoring those constraints, and using its affordances to your advantage can provide students with an unmatched educational experience. Educational technology is not meant to replace traditional education. When used appropriately and in combination with proper instruction, teachers and students can maximize its potential.

References
Baumgartner, J. (n.d.). Teaching tools: Active learning while physically distancing. Louisiana State University.
Bayne, S., Evans, P., Ewins, R., Knox, J., Lamb, J., Macleod, H., O'Shea, C., Ross, J., Sheail, P., & Sinclair, C. (2020). The manifesto for teaching online. MIT Press.
Lo, C. K., & Hew, K. F. (2017). A critical review of flipped classroom challenges in K-12 education: Possible solutions and recommendations for future research. Research and Practice in Technology Enhanced Learning, 12(4), 1–22.
Morris, S. M., & Stommel, J. (2018). An urgency of teachers: The work of critical digital pedagogy. Pressbooks.
National Academies of Sciences, Engineering, and Medicine. (2000). How people learn: Brain, mind, experience, and school (Expanded ed.). The National Academies Press.
Pacansky-Brock, M. (2020). How to humanize your online class, version 2.0 [Infographic].
Can you identify four similarities between the first two images and the second two images? When I tasked myself with this exercise, I came up with:
Did you come up with something along the same lines? I wouldn’t be surprised, since there are not a whole lot of similarities here. Now, if you asked me the differences between these two assessments, I could probably hand you an entire list.

To begin, these assessments show the growth of an entire semester. The first sandbox assessment was created before I expanded my knowledge of assessment techniques, technologies, structures, design, and data collection. I was given a technique to center an algebra assessment around, but that was the only constraint in my box. This gave me lots of room to play. What I found interesting is how much flexibility I had in my first assessment, yet how little flavor I gave it - I had the room to play but didn’t take advantage of it. Somehow, having the constraints placed on my assessment made me think outside the box (or inside the ‘sand’box, in this case), and forced me to consider these different aspects of assessment. Because I had to include certain components in my assessment, I had to think about how these categories play an important part in assessment. Adding one category at a time throughout the semester allowed me to shift my attention to a new aspect of assessment each week, and to figure out how I was going to highlight the importance of each.

Knowing what I now know about assessment, I have redone the first sandbox assessment to see what I could come up with! Although having the constraints of each category placed on me made my assessments a little wonky, I now see the potential that my first assessment had. I included technologies, data collection techniques, a clarified response format, and a clear structure for the assessment. I would love to hear your thoughts on the improvements made!

In the last few weeks I have been diving into data and assessment. This is coming off the tail end of my course’s last unit involving AI.
While I was in a synchronous Zoom meeting with some of my classmates, one of them brought up the realization that AI doesn’t know students like teachers know students. This classmate had made a lesson/assessment plan using AI, and had noticed that while the overall structure was good, what the AI couldn’t compete with was the teacher’s ability to know their students on a personal level - their growth, how and where they struggle, how and where they flourish, and what helps them achieve the most success.

With this conversation in mind, I dove into exploring assessment data with a student-centered approach. While I explored, a theme that continued to emerge was how to create assessments that produce data reflective of the whole student. Assessment data can be a really useful tool, but it can also really harm students if used in the wrong way. Many times, administrators pull assessment data from a class and minimize students to mere numbers and labels. This removes all humanization, fairness, and equitable feedback from assessment, and defeats the purpose of assessment. Don’t we assess students to find out how they are doing overall? How can we tell how they are doing as a human from a few true/false or multiple choice problems? The reality is that we can’t.

That said, there is a valid need for this summative assessment data on a larger scale. Even though it may seem frustrating on a small scale, this data is important to school and district growth overall. As a teacher, learning how to approach large assessments with students and what to do with data after assessments can change your relationship with assessment. This infographic, put together by Keown et al. (2021), is a wonderful jumping-off point for teachers. In particular, the third row, “using summative test data to group students,” seemed really important to me. Assigning students labels based on assessment data can be super harmful.
Imagine you missed a few days of school during the height of fraction instruction, and you have not yet been able to catch up on what you missed. Test day comes around and you still do not understand fractions, but the mathematics portion of the assessment is 50% fraction-based. Through no fault of your own, you don’t know how to answer any of those questions, run out of time, and end up nearly flunking the assessment. Because you placed in a certain percentile, you now have to spend an hour of your day in math intervention. This causes you to miss key time with your classmates learning ELA content. You subsequently fall behind in ELA... and the cycle continues.
Just as the inadequate AI lesson highlighted the importance of humanity in education, so does interpreting assessment data. Teachers know their students, and if this student’s teacher looked at their assessment results, they would be able to identify why the student scored the way they did, provide accurate feedback, and prevent labeling.

References:
Keown, K., Fossum, A., Li, J., & Young, J. (2021, March 25). What the data can’t tell us [Image]. Achieve The Core.