In a perfect world, journalism education would be fully responsive to a learner’s individual needs. Whether teaching Introduction to Reporting or Advanced Data Visualization, instructors could ascertain students’ exact understanding and then support them to their next level.
Of course, like any utopian vision, fully differentiated instruction runs up against reality: there will always be a range of skill levels in any class. The challenge grows even more complex when the course is online.
Yet in online journalism education, and particularly with college-level or adult learners who know their capabilities and goals, there is a real opportunity for highly responsive learning. In my experience teaching online journalism courses, I’ve forged a path into individualized learning through repetitive needs assessment, and although it’s time-consuming, I think it’s worth a look for journalism educators as more coursework moves online.
What Is Needs Assessment?
As part of most instructional design models, needs assessment (sometimes called front-end analysis) is an attempt to understand students’ skills and knowledge before instruction is designed and delivered. It’s essentially a pre-test, but in this case, I employed it repeatedly.
Although some keen instructors can understand their students’ abilities intuitively during the first few days of class or after the first major assignment, online learning includes a literal and philosophical distance, making intentional needs assessment valuable. And it makes sense to do it throughout a course; more input means better instructor decision-making and the ability to make instant adjustments for learners.
Teaching Journalism Advising Online
I kept this concept of repetitive needs assessment in mind as I remade three graduate online courses for scholastic journalism advisers through Eastern Illinois University. When these courses were first taught in the 1990s, the professor who designed them was on the cutting edge of distance education, but much has changed. When I endeavored to resurrect them as asynchronous classes in the summer of 2017, I had to reinvent them. Educational technology had evolved, and my students were a diverse group. Some were seasoned journalism advisers from high schools throughout the country. Others were brand-new advisers, or just beginning to build a scholastic media program in their schools. Still others came from only the education side or only the journalism side of the curriculum, without experience in the other half.
My needs assessment of these learners was a purposeful and recursive plan of information-gathering. At the beginning, middle and end of each three-week, online course, students were asked to take a short online survey (no more than five minutes), built in Typeform, explaining their background knowledge, expectations, preferences and fears/reactions regarding both the content and online learning.
Structural Changes From the Outset
A number of responses to the needs assessment questions led to alterations in the structure of the courses:
- When students identified clarity of instructions as an online learning concern, I put explicit learning outcomes on the syllabus and within each online module. I also sent emails at the beginning of each module regarding expectations and provided formative assessment feedback within 24 hours.
- When students noted readings and discussions were how they learned best, I commented on each post, expanded and revised the reading selections and connected students to each other’s areas of expertise to propel brainstorming and collaboration.
- Because students, working at a distance, overwhelmingly wanted specifics, I added extra assignment sheets and artifacts from other programs for reference and additional ideas.
- After a student noted “the discussions weren’t much of a discussion” (their peers’ posts lacked interaction and cohesion), I quickly inserted three suggested topics for responses that created commonality.
- When students expressed concerns about a lack of face-to-face interaction with the instructor, I created more videos, beyond an introductory and concluding multimedia presence, with the comments or questions I would typically pose in a classroom, along with my own advising experiences.
Making Content Changes Throughout the Course
I also turned my attention to the needs assessment data on perceived strengths and weaknesses regarding journalism advising, so I could make instant adjustments to the content of the course. Then, I went back into my LMS (learning management system) once again:
- When I saw trends in content weaknesses, I added more readings (some optional) on blind spots. I also provided specific examples of projects as a starting point for those who were unfamiliar with a concept.
- Keeping the readings in the course fluid with open educational resources, I added and subtracted articles up to a day before the next module started as I read response posts.
- Some students mentioned that a few readings didn’t apply, so I gave them the ability to add one or two readings of their own to a module of choice, allowing the learners to drive content, too.
- From the beginning, I kept notes on which learner was interested in which learning outcome, and then researched and provided resources tailored to those goals through individualized discussion responses.
Summative Assessments Individualized
At first, I had more academic-style capstone assignments, as would befit a graduate-level course, but as I returned to the needs assessment data, I noticed reflection and practical application were more valuable for these learners. I adjusted the summative assessments to include both scholarly and reflective components. Then, learners shared their work for other advisers to use as desired, adding to everyone’s bank of practical resources.
The Heavy Lifting Was Worth It
This experiment using a repetitive, needs-assessment-based strategy to provide instant adjustments to an online course was far from easy. It was quite time-consuming; most online courses demand a high level of investment in design at the outset, but once they begin, the intensity lessens. This model was the opposite. I spent at least two to three hours on adjustments and upkeep every day, and that did not include grading time.
However, there were also moments of responsiveness that allowed students to engage in a way they did not expect in distance learning. “Personally, I thought that this course was amazing; thoughtfully conceived; and well-executed,” one student said. Although comments like these cause instructor elation, the feedback also underlined how little some students feel they take away from typical learning experiences:
“Wow, what a great group of readings! Thank you. I have become that teacher when, upon hearing that a PD is planned for what our school calls Professional Learning Mornings (PLMs for short), my brow furrows and I wonder what I will be able to take away from the session and apply to my classroom. Sadly, PLMs often disappoint, leaving [me] feeling frustrated because I don’t always feel that I have learned anything. Give me something I can apply and I will try it out the next day. I blather on about this because, Module 5—like what I perceive to be a “successful” PD—has given me much with which to ruminate.”
This taught me that my plan for developing an online course would have to be frequently revised as the course went along if I wanted the best possible learning experience for my students. There’s still much to learn from learners, and here’s hoping that as online learning tools evolve, so do the opportunities to repeatedly find out what our students need next, and then try to deliver it.
Amanda Bright is a former professional journalist who later spent a decade as a scholastic journalism adviser. Currently, Bright is the Education Editor for MediaShift, a journalism instructor and adviser at Eastern Illinois University and the Media Content Coordinator for Indiana State University Online; she also serves as the Social Media Director and Web Co-Administrator for the Illinois Journalism Education Association.