Ramblings. Musings. Required coursework.

Month: March 2017

Week Nine Reflection

I found it a bit challenging to incorporate all of the articles into my blog posting this week; however, I learned quite a bit from them, even if they didn’t make it into the original post. Kohn’s (2007) point that “cheating is relatively rare in classrooms where learning is genuinely engaging and meaningful to students and where a commitment to exploring significant ideas hasn’t been eclipsed by a single-minded emphasis on ‘rigor’” really stood out to me. I think we, whether in K-12 or higher ed, tend to place all of the blame for cheating on students and do not take the time to reflect on how the environment may be encouraging the behavior. As an instructional designer, it is easy to see that engaging, relevant courses tend to be far less plagued by cheating. Further, I think collaborative environments that focus on learning for its own sake, take advantage of peer feedback and support, and require students to perform and demonstrate usable skills are those where competition is less of a problem and students excel.

Another take-away came from the Eberly Center’s Grading vs. Assessment (n.d.) article. I’d like to move away from grades that indicate both outcomes and behaviors or activities, such as participation, and focus solely on outcomes. By doing so, my students will receive feedback on specific areas of strength and weakness and I will be able to monitor overall class performance, “[tracking] the impact of instructional or curricular changes on specific learning objectives.”

I was able to interact with Cherie, Jule, and Kendra on their blogs and with Gerald on mine. I love how Cherie is using Google Forms to pre-test her students. I asked whether she intended to use the survey as a performance-based assessment of students’ ability to create a sentence demonstrating proper use of prepositions or prepositional phrases. She is doing a great job of differentiating for her students; however, I suggested she consider how she will use formative assessment to provide necessary feedback along the way. I encouraged Jule to reconsider Popham’s explanation of criterion-referenced assessment, as her description indicated she wanted to move away from a criterion-referenced test. Given her explanation of the activity and the comments she’s made in class, I believe it’s likely that she will, in fact, use a criterion-referenced analysis of the data. Kendra and I interacted about the nature of formative assessments and the concept of extra credit. While I agree with how she uses it, I encouraged her to consider a different term, as “extra credit” often reinforces the grade-centric perspectives of our students and doesn’t really support the concept of learning for the sake of learning. Finally, Gerald and I bemoaned the fact that it’s hard to make the study of assessment interesting. That said, we both like the idea of performance-based authentic assessments; however, these are sometimes difficult to create in environments that are content-driven and tend to rely more on breadth than depth.


Eberly Center for Teaching Excellence. (n.d.). Grading vs. assessment of learning outcomes: What’s the difference? Whys & Hows of Assessment. Retrieved from http://www.cmu.edu/teaching/assessment/howto/basics/grading-assessment.html

Kohn, A. (2007). Who’s cheating whom? Phi Delta Kappan. Retrieved from http://www.alfiekohn.org/article/whos-cheating/

Week Nine: Intentional Assessment in the UbD Unit

Assessments, oddly enough (and perhaps somewhat humorously), are the focus of the course module I selected for the Understanding by Design (UbD) unit assignment. The Assessment module is one of seven in NS 641, Developing Curriculum for Nursing and Other Professions. As part of the Master’s in Nursing Leadership Education track, students explore adult learning theory and teaching styles, curriculum development, technology for teaching and learning, and assessment methods for both courses and program evaluation across five aligned courses. Students in the track are generally professionals interested in gaining the knowledge and practical skills necessary to educate peers within their profession. In addition to the Assessment module, student learning in NS 641 focuses on the context of professional curricula, active learning, modes of instruction, design frameworks (like UbD), course development plans, objectives, assessment, instructional materials, learning activities, course tools, and dissemination of created content. I selected this module for three reasons:

  1. It was my least favorite last year, as I struggled a bit with making assessment interesting.
  2. Evaluation of student performance and feedback indicated that at least two of the objectives were not internalized on the first attempt and therefore needed to be reconsidered.
  3. Frankly, the timing was in tune with the assignment.

As an instructional designer, and someone whose professional career has focused on curriculum development using a backward design model, I will admit a bias toward authentic assessments, both formative and summative. Although a variety of definitions exist for the term, in general, authentic assessments require “students [to] perform real-world tasks to demonstrate meaningful application of essential knowledge and skills” (Mueller, 2016). Grant Wiggins further explains that they ask students to “engage in worthy problems or questions of importance,” using “knowledge to [perform] effectively and creatively…tasks [that] are replicas of…the problems faced by…professionals in the field” (Mueller, 2016). As part of the authentic assessment process, I create opportunities for criterion-referenced formative and summative assessments. Formative, low-stakes assessments provide practical and immediate feedback to students regarding their performance on learning objectives and guide both my immediate intervention responses to students and content revisions at the end of each semester (Eberly, Formative/Summative, n.d.; Shores & Chester, 2009; Popham, 2014). Summative assessments demonstrate student mastery of learning objectives at the end of a module or of the course as a whole; these assessments are evaluated against an aligned, standards-based rubric (Eberly, Formative/Summative, n.d.; Popham, 2014). Using formative and summative criterion-referenced authentic assessments allows my students to demonstrate mastery of the content in a way that is both meaningful and relevant to their future careers.

As such, it should be of little surprise that my authentic assessment in the Assessment module revolves around students demonstrating a practical understanding of the creation and use of aligned, authentic, formative and summative assessments. The module provides several opportunities for students to complete a formative assessment, including checks for understanding, peer feedback and resultant modifications, and a technology-based presentation on the assessment concepts they have studied. The checks for understanding and the presentation create a space in which students can demonstrate knowledge mastery and synthesis after new learning and receive immediate peer and instructor feedback.

The peer feedback and modifications are part of a larger process to create and refine a Course Development Plan, which will be turned in as a summative assessment in Module 6 and graded against an objectives-aligned rubric. In Module 4, students complete the Desired Results section of the UbD template to develop their course and are required to provide feedback to each other. In Module 5, students make corrections to the Desired Results section based on peer feedback and will both complete the Evidence section and provide peer feedback. Through peer and instructor feedback, students will complete a final Course Development Plan that they present to peers in Module 7 and will teach as part of the Capstone course at the culmination of the Education track program.


Eberly Center for Teaching Excellence. (n.d.). Formative vs. summative assessment. Whys & Hows of Assessment. Retrieved from http://www.cmu.edu/teaching/assessment/howto/basics/formative-summative.html

Eberly Center for Teaching Excellence. (n.d.). Grading vs. assessment of learning outcomes: What’s the difference? Whys & Hows of Assessment. Retrieved from http://www.cmu.edu/teaching/assessment/howto/basics/grading-assessment.html

Kohn, A. (2007). Who’s cheating whom? Phi Delta Kappan. Retrieved from http://www.alfiekohn.org/article/whos-cheating/

Mueller, J. (2016). What is authentic assessment? Authentic Assessment Toolbox. Retrieved from http://jfmueller.faculty.noctrl.edu/toolbox/whatisit.htm

Popham, W. J. (2014). Criterion-referenced measurement: Half a century wasted? Educational Leadership, 71(6), 62-68. Retrieved from http://egandb.uas.alaska.edu:2048/login?url=http://search.ebscohost.com/login.aspx?direct=true&db=eft&AN=94925708&login.asp&site=ehost-live

Shores, C., & Chester, K. (2009). Using RTI for school improvement: Raising every student’s achievement scores. Thousand Oaks, CA: Corwin.

Week Eight Reflection

Since taking a cognitive science class as an undergrad, I’ve enjoyed reading about the brain and how it functions. A perfectly functional brain is fascinating; so, too, is how function changes when cognitive processes are impeded (for instance, I love reading Oliver Sacks’ works). Oddly enough, given this interest and the years I’ve spent in education, I’ve not spent much time analyzing how understanding the brain can help us understand our students and assist their learning process. I very much enjoyed reading both of the Jensen books and the articles I self-selected, as they encouraged me to take the pieces I already knew about the brain, along with what I’ve learned about differentiation and design, and synthesize them with the readings to determine best practices for my classroom moving forward.

I love the idea of being able to “hack” our students’ learning (and, frankly, my own) – after all, we spend hundreds of dollars on tracking technology and on health, exercise, and sleep apps to monitor our metrics, sleep patterns, and more, all to ensure health and optimal body performance. Why not take into account the functions inherent to our neural systems and use them to our advantage? While I can absolutely see the relevance to my students, some of the strategies are a bit more difficult to apply when teaching an online course to adults – in many instances, they need to self-regulate the things I might do for a face-to-face class. That said, I’m definitely interested in trying a few on myself – particularly those that focus on information overload, time management, brain breaks, and the like. It’s also timely, as I was late this week on my post and interaction because I just hit a wall. I generally struggle a bit in the spring (and always have); whether it’s seasonal affective disorder or something else, spring has always been a more difficult semester for me. After this last week or two, I’m feeling rather insane to be working full time, taking three classes, and teaching a course. Note to self, and for everyone to see in writing: I don’t plan on doing that again. I do, however, plan to implement a few of the concepts mentioned above to see if they help in the meantime.

Aside from my personal reflection, although I was late, I did comment on several peer blogs to engage with their thoughts around brain-based learning. Several posts resonated with me this time. Cherie’s blog discussed how many teachers still equate poverty with an inability to learn, which I find to be an incredibly frustrating perspective in this day and age. I encouraged Cherie in her work with differentiation to help these students be successful in the supportive learning environment she creates and reminded her that choice in curriculum (which she offers) is one of the best ways to increase motivation. I also asked whether she had read any works by Ruby Payne and how she thought they compared to Jensen.

Gerald’s blog examined the role of novelty in learning. I agreed with him that we need to capitalize on brain science and an understanding of students’ emotions to create optimal learning environments. I supported his premise that novelty in learning is a phenomenal tool, as our brains grow bored with routine. To his comment that he wished he were creative enough to build in novelty, I recommended that he work with his peers, team, or department to brainstorm ways to create novel experiences for students. I also noted that I am enjoying how classes such as this allow us to synthesize learning we already have with new information (a perfect example of brain-based learning!).

Although we weren’t able to interact, Jim and I are on the same page regarding the fact that legislators and policy-makers don’t understand the nature of the brain, asking educators to focus on breadth of study over depth. I shared an example of a standard from my World History course in Texas, one of over 100 I had to meet in a year. I find it incredibly frustrating that what we are asked to do runs counter to what neuroscience says is best for our students. I also encouraged him to think about how we can effect change in our classrooms and noted that we should all consider how we can bring change to the education system in which we operate.

Finally, I loved Kendra’s discussion of Maslow’s hierarchy of needs and shared a personal experience from grad school with her. During a particularly rough patch, I discovered that regardless of how well we know education theory (in this case, Maslow), a personal experience with the inability to focus due to an unmet need really solidifies how difficult life circumstances can make learning for our students. I reinforced her point that we need to slow down and see our students holistically, understanding that what we perceive as a bad attitude, laziness, and so on might be so much more than we know.

Week Eight: Brain-Based Learning and Differentiation

Use the term “brain-based learning” with some people and you’ll potentially be met with skepticism or humor; after all, doesn’t all learning occur in the brain? Given this response, beginning with a solid definition seems wise. According to The Glossary of Education Reform (2013), the term brain-based learning “refers to teaching methods, lesson designs, and school programs that are based on the latest scientific research about how the brain learns, including…how students learn differently as they age, grow, and mature socially, emotionally, and cognitively.” Brain-based learning, therefore, is less about noting the “location” of learning and more about how educators can use an understanding of cognitive development and brain function to design and develop content and activities that increase engagement and motivate their students to learn.

In Teaching with the Brain in Mind, Jensen (2005) describes seven critical factors in the learning process that relate directly to the design of the brain: “engagement, repetition, input quantity, coherence, timing, error correction, and emotional states.” Each of these factors exerts a unique, and sometimes competing, influence on students’ ability to receive, process, and retain information over time (Jensen, 2005). Additionally, sociopolitical and economic status plays a role in the function of students’ brains. Students from low socioeconomic backgrounds “rarely choose to behave differently, but they are faced daily with overwhelming challenges that affluent children never have to confront, and their brains have adapted to suboptimal conditions in ways that undermine good school performance” (Jensen, 2009). These students often experience “emotional and social challenges, acute and chronic stressors, cognitive lags, and health and safety issues” (Jensen, 2009). As a result, educators should “assume [their] students need transition time from their personal lives to their academic lives” (Jensen, 2005).

Beyond the science of how the brain works, studies have shown the importance of students explicitly understanding the brain’s structural and functional changes during learning. “When students see [the learning] process as changing their own brains, the result is a powerful and positive cycle” that allows them to see that “hard work really matters” (Wilson & Conyers, Engaging, 2014). Further, students who practice metacognition, or “the ability to think about [their] thoughts with the aim of improving learning,” are able to think independently and reflectively, taking charge of their learning (Wilson & Conyers, Metacognition, 2014).

The design of the brain, and the resulting model of brain-based learning, strongly supports the use of differentiation in the classroom. A fundamental pillar of differentiation is the idea that a one-size-fits-all model prevents teachers from engaging all students, meeting them where they are, and providing equal access to a quality education. Much like differentiation, brain-based learning acknowledges the “need [for] flexibility in the classroom to accommodate for a wide range of students” (Cozolino, 2013). Educators who understand the role of the brain in learning are able to, essentially, “hack” their students’ learning processes and use differentiation to ensure that learning takes place at the best possible time, in the best possible environment. Differentiation, then, becomes a tool for educators to ensure learning is relevant and meaningful to individual students, encourage depth over breadth, understand the importance of brain breaks, provide strategies that help students both practice metacognition and cope with internal or external stressors, and allow students to see the big picture while making choices about their individual learning experience to increase intrinsic motivation (Jensen, 2005; Desautels, 2016; Cozolino, 2013; Wilson & Conyers, Metacognition, 2014; Wilson & Conyers, Brain, 2014; Jensen, 2009). Educators who understand the science behind brain-based learning and synthesize this knowledge with the strategies of differentiation wield powerful tools to create learning environments in which all students can engage and thrive.


Cozolino, L. (2013, March 19). Nine things educators need to know about the brain. Greater Good. Retrieved from http://greatergood.berkeley.edu/article/item/nine_things_educators_need_to_know_about_the_brain

Desautels, L. (2016, February 23). Energy and calm: Change it up and calm it down. Edutopia. Retrieved from https://www.edutopia.org/blog/energy-calm-change-it-up-lori-desautels

Great Schools Partnership. (2013). Brain-based learning. The Glossary of Education Reform. Retrieved from http://edglossary.org/brain-based-learning/

Jensen, E. (2009). Teaching with poverty in mind: What being poor does to kids’ brains and what schools can do about it. Alexandria, VA: Association for Supervision & Curriculum Development (ASCD).

Jensen, E. (2005). Teaching with the brain in mind. (2nd ed.). Alexandria, VA: Association for Supervision & Curriculum Development (ASCD).

Wilson, D. & Conyers, M. (2014, May 20). Brain moves: When readers can picture it, they understand it. Edutopia. Retrieved from https://www.edutopia.org/blog/brain-movies-visualize-reading-comprehension-donna-wilson


Wilson, D. & Conyers, M. (2014, October 7). Metacognition: The gift that keeps giving. Edutopia. Retrieved from https://www.edutopia.org/blog/metacognition-gift-that-keeps-giving-donna-wilson-marcus-conyers

Week Seven Reflection

I tested PowToon this week. I enjoyed the software and its affordances; it’s a fun way to create animated, professional videos to introduce content to your students, create introductions, and so on. Beyond using the program to create faculty-generated content, PowToon would also be great to offer as a presentation alternative to PowerPoint, Prezi, emaze, iMovie, etc. I did, however, caution that (beyond the templates offered) the technology involves a learning curve for creating personalized presentations. Jim and I had a good discussion around this point, as he commented that the program had potential but seemed better suited to older students. I replied that, while I tended to agree, we should be careful about making assumptions about the skills our students will bring with them, as I’ve seen a four-year-old create short “videos” with the Sock Puppet app.

Jim and I also interacted around his testing of, and commentary on, Dragon Dictate. While he was not sure he could see its uses in his classroom, I encouraged him to consider how it gives students whose physical disabilities prevent them from writing the ability to participate independently in an activity so many of us take for granted. Beyond that, I shared my experiences using Dragon NaturallySpeaking to transcribe archival videos. I actually prefer NaturallySpeaking for making transcripts, as it doesn’t require you to say “Stop. New Sentence” after each sentence (a requirement that means you can’t make a transcript while recording a video).

Because I am Rachelle’s instructional designer, she reached out to me at the beginning of the week for a recommendation on which technology to select; she wanted to test one that we use in our program but that would push her boundaries. Knowing she’s not used Google Drive much, I recommended that she try one of the Google Apps for Education. Although she was not fond of the inability to auto-grade, she created a Google Forms quiz for her students. I recommended that she continue testing the software, particularly how the embed feature allows it to sync with our Blackboard courses. As I commented on her blog, I’m a huge fan of using an embedded Google Form to check for students’ prior knowledge before they begin the instructional materials and then embedding the results using Google Sheets, allowing them both to assess their learning progression in the module and to compare their responses to their peers’.

Kendra and I interacted with questions around her testing of Nearpod; I was interested to know whether she saw relevance in the lessons to higher education, which she did not. I also asked, given that the lesson plans are created by others, whether Nearpod allows peers to review or rate the lesson plans to ensure quality prior to use. From what she was able to access in the peer version, it does not. I find this odd given the movement among (free) open educational resources (OER) toward peer rating and review systems to ensure and demonstrate quality – if a free resource includes this, one would assume a paid service would as well.

Finally, Gerald and I discussed the use of SPSS and SAS; I had specific questions about the free version and whether there was a more advanced purchase option. He shared information about the software’s capabilities; as a result, I’ll definitely look into it as a viable alternative to SPSS when I analyze the data from my research.


© 2022 Heather Marie
