Why did you choose the analysis methods that you chose?
I chose to use a Google Form with a mixture of multiple-choice questions that I could assign a quantitative value to, as well as one open-ended question whose answers I could evaluate as qualitative data. I thought the quantitative data would be much easier to analyze since it was numerical, but I have been struggling with the sheer amount of numerical data I have to sort through. I had 140 students complete both the pretest and the posttest, and that alone has been very challenging. With so much data, I am having a harder time than I expected finding the right graphs to include in my research paper. I definitely want to compare students' pretest and posttest scores in a graph. I also asked students to self-rate their ability to analyze a website for its credibility. I found this incredibly interesting, and I am hoping to compare each student's posttest score with their self-evaluation score. (I sketch one possible way to chart all of this at the end of this post.)

How did each method that you chose answer your research question?

I was amazed at the answers I saw on the pretest for the short-answer, open-ended question. I really thought I would get a lot more variety in the answers to "How do you know if a website is trustworthy?", but the students all seemed to answer the same way. After a class discussion I found out that in 5th grade, many of the students received a single-day lesson from their teacher on how to read the URL of a website to analyze its credibility. Almost ALL of my students mentioned this in their pretest answer. I was happy to see that some added to their skill set in their posttest answer, but I was shocked at how many still stuck with just that one piece of information: "check to see if the URL has .gov in the address." I think the questions I asked students will help answer my research question. I asked students to report which methods they had used in the past to evaluate websites and which ones they will use in the future, and I think that quantitative data will be really helpful in answering my research question.
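Since I have been wrestling with which graphs to make, here is a minimal sketch of one way I could approach it in Python with pandas and matplotlib. It assumes the Google Form responses were exported to a CSV file, and the file name and column names (student_id, pre_score, post_score, self_rating) are hypothetical placeholders, not my actual form fields:

```python
# A sketch for exploring 140 students' pre/post scores and self-ratings.
# Assumes the Google Form results were exported to scores.csv with
# hypothetical columns: student_id, pre_score, post_score, self_rating.
import pandas as pd
import matplotlib.pyplot as plt

df = pd.read_csv("scores.csv")

# Bar chart of the average pretest vs. posttest score.
means = df[["pre_score", "post_score"]].mean()
means.plot.bar(rot=0, title=f"Average pretest vs. posttest score (n={len(df)})")
plt.ylabel("Score")
plt.savefig("pre_post_means.png")
plt.close()

# Scatter of self-rated ability against the actual posttest score,
# to see whether students' self-evaluations track their performance.
df.plot.scatter(x="self_rating", y="post_score",
                title="Self-rating vs. posttest score")
plt.savefig("self_vs_post.png")
plt.close()

# Simple paired comparison: how many students improved?
improved = (df["post_score"] > df["pre_score"]).sum()
print(f"{improved} of {len(df)} students improved from pretest to posttest")
```

The bar chart gives the pre/post comparison I want for the paper, and the scatter plot should make it easy to see whether students' self-evaluations line up with their actual posttest performance.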
As I continued my research into the subject of media literacy, I kept hitting dead ends. I realized that many of the articles I found were quoting other research conducted more than 10 years ago. I began to see a large divide in the material: on one side, older research focused on traditional print media; on the other, newer writing about digital and social media that was not yet backed by much formal study.
I began to realize more and more that there is a growing need to develop strong curriculum and evaluation tools to assess whether students have the critical skill of evaluating internet sources, news and media found online, and even images found online. As more and more content is created every day at an exponential rate, we need to be preparing students to be savvy searchers.

One of the most interesting research articles I found dubbed what I have been researching, and want to implement in my classroom, "new media literacy," because it is different from the media literacy that has been researched and taught within our standards for a long time. This "new media literacy" incorporates Web 2.0 and social media. Beyond that, I did not find much correlation between the articles and the authors' views, other than a recurring theme of researchers recommending that this skill be further studied and evaluated in our students. This supports my research question: "How does a digital media literacy unit affect students' ability to evaluate digital media?"

When researching the topics of "digital literacy" and "media literacy," I had a significant challenge finding articles with relevant and recent research. Some of the studies I was able to find were from almost 10 years ago, and although still relevant, their statistics seem outdated now given the exponential growth of social media use among adolescents and the capabilities of the common smartphone. Many studies were conducted between 2000 and 2005 on media literacy and students' ability to understand the messages in print media, but little research has been done combining digital literacy and media literacy to assess students' ability to analyze digital media and its messages.
One such study was conducted by Greenhow (2009) with children in an afterschool program for low-income students who were striving to attend college. In a survey of over 800 low-income high school students, she found that many students turn to the web to seek out a variety of information, from college information to information for school research reports. Students identified the internet as "essential" to their lives. Greenhow concluded that even though we label our youth digital natives, there are still critical internet evaluation skills that need to be taught to all students before they graduate, to make sure the gap between low- and high-income students does not broaden.

The final results from a 2009 study of undergraduate students by Arke and Primack made many suggestions for future media literacy studies. Their study primarily analyzed students' ability to critically evaluate traditional forms of media. It suggested that improved measurement of media literacy will help advocates learn what challenges still remain for students. The researchers also suggested at the end of their study that additional measurement of students' media literacy skills will be needed with regard to Web 2.0 content. Their suggestions for future media literacy evaluation in 2009 are very similar to what I will be researching with my own students.

A more recent research project by Admiraal (2015), on game-based learning and internet skills, demonstrated a gender effect in its results. The game was designed to evaluate students' reflective internet skills. The questions in the game were based on the concept of digital judgement and asked about students' ability to acquire, process, and produce digital information. Of the 7th grade students who participated in the study, the boys significantly increased their scores between the pretest and the posttest, while the female students increased their scores only slightly.

In 2009, Wiley et al. conducted research on a science inquiry task and undergraduate students' ability to evaluate websites. The study consisted of providing students with the "top 6" Google results for a search related to a volcanic eruption; students only had access to the links the instructor provided. Some of the links were credible sources and others were not. The study also involved giving students an instructional unit on evaluating web sources. The results supported prior research in finding that students do not seem to have a coherent understanding of how they should evaluate sources: 79% of the students responded that they had never received instruction in assessing the reliability of an internet source.

References

Greenhow, C., Walker, J. D., & Kim, S. (2009). Millennial learners and net-savvy teens? Examining internet use among low-income students. Journal of Computing in Teacher Education, 26(2), 63-68.

Arke, E. T., & Primack, B. A. (2009). Quantifying media literacy: Development, reliability, and validity of a new measure. Educational Media International, 44(1), 53-65.

Admiraal, W. (2015). A role-play game to facilitate the development of students' reflective internet skills. Educational Technology & Society, 18(3), 301-308.

Wiley, J., Goldman, S. R., Graesser, A. C., Sanchez, C. A., Ash, I. K., & Hemmerich, J. A. (2009). Source evaluation, comprehension, and learning in internet science inquiry tasks. American Educational Research Journal, 46(4), 1060-1106. Stable URL: http://www.jstor.org/stable/40284747
Summary of Research

A recent study by Zhang and Zhu (2016) was published after data was collected from 5th and 6th grade students in Beijing. The study was conducted because measuring digital media literacy is a difficult challenge. It involved creating a rating scale on which students self-reported their skills in four declared areas of digital media literacy: technical skills, critical understanding, creation and communication, and citizenship participation. The questionnaire included specific items such as "I am able to judge the reliability of the information and news on the internet" and "I am able to detect differences in the information I receive from different search engines." The results suggested a high level of digital media literacy skills among the students, but the study was limited by the fact that students were self-reporting, which was not always accurate with regard to their actual skill.

Julie Coiro and her colleagues (2015) published a study that sought evidence of what types of evidence seventh-grade students use to judge the quality of online information, and which patterns of evidence they use to justify their reasoning. The study found that students could easily identify the author and the author's point of view, but evaluating the author's expertise and the website's overall reliability posed more of a challenge.

Hutchison et al. (2016) additionally conducted research on what digital skills preadolescent students have and on students' perceptions of those skills. Their study of nearly 1,200 fourth and fifth grade students led them to discover that preadolescent students are moderately skilled at online search, evaluation, and communication tasks. Students showed high interest in using the internet but reported that it was more difficult to read online than in a book or textbook. The study also found that students' perceptions of their digital skills did not match their scores on a digital skills test.

References

United States Department of Education, Office of Educational Technology. (January 2017). tech.ed.gov/

California Department of Education. (April 2014). Empowering learning: A blueprint for California education technology. www.cde.ca.gov/ls/et/dc/

Zhang, H., & Zhu, C. (2016). A study of digital media literacy of the 5th and 6th grade primary students in Beijing. The Asia-Pacific Education Researcher, 25(4), 579-592.

Coiro, J., Coscarelli, C., Maykel, C., & Forzani, E. (2015). Investigating criteria that seventh graders use to evaluate the quality of online information. Journal of Adolescent & Adult Literacy, 59(3), 287-297.

Hutchison, A., Woodward, L., & Colwell, J. (2016). What are preadolescent readers doing online? An examination of upper elementary students' reading, writing and communication in digital spaces. Reading Research Quarterly, 51(1), 435-454.