
Exploring Impacts of Interactive Whiteboard Use in Elementary School

Jessie Shipman
April 2011

Introduction
Interactive Whiteboards (IWBs) have become a prevalent classroom technology throughout the world
and consume a major share of the technology budget in many school districts across the country. This
holds true at Vega Elementary School in McKinney, TX, where almost every classroom has an IWB at
the front of the room. However, the use of the IWBs varies considerably from teacher to teacher, and
only a few teachers are using them every day, or even most days, as an interactive learning tool.

According to most of the literature on IWBs that I have reviewed so far, students report that they
enjoy learning more, or feel that they have learned more, when IWBs are used interactively in their
classroom. However, I have not yet found any significant studies reporting notable learning
differences between using an Interactive Whiteboard and not using one – or rather, between using
one interactively and using one merely as an expensive projection device.

The purpose of my inquiry is to discover whether Interactive Whiteboards make a marked difference in
student learning and to explore how that knowledge might help a school or individual teachers decide
whether to adopt the tool more completely.

I have three years of teaching experience in a school where there was an IWB in every classroom,
and many of them were either not used at all or used only as an expensive projector system. As I
talked to teachers at other schools, it seemed they were having a similar experience. In August I
moved to Texas and began to look for employment but was unable to obtain a position. As a result I
did not have a classroom of my own in which to conduct an action research project from which I
would personally benefit, so I had to find a project in which I already had an interest: Interactive
Whiteboards. I first noticed that Vega Elementary had an IWB in every classroom when I went in to
do some volunteer work. As I began to observe their use, I noticed, as in many of the classrooms I
had seen before, that interactive use was not widespread. I saw an opportunity to contribute my
expertise to Vega, and for Vega to offer me a place to conduct action research and answer for me,
at least on a small scale, whether IWBs are worth purchasing or whether the many schools that
have bought them have made a huge monetary mistake.

Discovering whether IWBs have an impact on learning is an important inquiry because it affects not
only how and why teachers use them in the classroom, which directly affects students, but also the
budgetary decisions made in districts, since each IWB costs between $2,000 and $5,000. These
budgetary decisions matter because, in this economy, districts are being forced to make hard choices
about where to spend their money. If IWBs are not having an impact on learning, perhaps that money
could instead go toward hiring teachers or purchasing other technologies with more impact.
Research Questions
1. What kinds of learning impacts happen when an Interactive Whiteboard is used interactively vs.
non-interactively?
a. Learning as measured by unit assessment
b. Student reports of learning and engagement with the tool
c. Teacher observation/reports of teaching, learning, and student engagement
2. Does using an IWB interactively create positive feelings about learning in students?
3. Do positive feelings or negative feelings about the use of an IWB interactively have any
correlation with information retention in students?

Methods

Participants
I worked with four classes of 20 to 25 fifth-grade students each, comparing pre-test and post-test
results for a lesson taught with the IWB used non-interactively and a lesson taught with it used
interactively. I collaborated with their teacher, Ms. Jo Ann Holman, to develop age- and
content-appropriate lessons, as my specialty is not elementary education.

Data Collection
On Day One, all four 5th grade classes were given a pre-assessment about the parts of a book
(see Appendix A). Once the students completed the pre-assessment, I presented information on the
parts of a book using the document camera connected to the SMART Board system in the classroom.
We held a class discussion about the parts of a book, and students took notes while I wrote
definitions on the regular whiteboard. We then looked online, using the IWB, at what an MLA citation
looks like; I switched back and forth between the book on the document camera and the website, and
we wrote down the MLA citation for that book. Then I used the IWB to show them how to access
Google Books, how to tell which books are complete and which are previews, and how to use and cite
them. The students then completed a worksheet activity in which they used books to do a scavenger
hunt. The students saw, but never interacted with, the IWB. Each class had approximately an hour and
fifteen minutes of direct instruction time.

On Day Two, the students took a post-assessment about the parts of a book they had learned about
on Day One (see Appendix B). They then used the remainder of the period to look at books and
Google Books for a research project on the American Revolution.

On Day Three I began the interactive lesson. Due to a scheduling conflict I had each class for only
approximately 50 minutes of direct instruction time. I started the lesson with a pre-assessment
about Internet safety (see Appendix C). I broke the students into four groups and rotated through
the groups to decide which student would be up at the IWB; rotating through small groups ensured
that every student would have an opportunity to interact with the board. I presented an interactive
lesson built with SMART Technologies Notebook software. I started by having a student circle what
they believed to be the correct answer in a series of multiple choice questions with several
possible correct answers. Once the student used the IWB to answer the question, we had a class
discussion about why the students believed that was the correct answer. After that, students
evaluated different websites to determine whether or not they were bogus. Again, we rotated through
the groups for each slide or activity, and the students manipulated the IWB. Although I sat at the
back of the classroom and guided the discussion, the focus at the front of the room was the
students using the IWB.

On Day Four we continued the interactive lesson. We watched a video produced by Answers.com
about Internet safety and about using and citing good sources for research. The students followed
along with the video using a fill-in-the-blank worksheet. After the video ended, students took turns,
rotating through their groups, writing the answers to the blanks on the SMART Board. This let the
students review the video content through discussion while also interacting with the SMART Board.
After we finished reviewing the content, we played an interactive "hangman"-style game in which the
students had to identify words that helped them review the information from the entire lesson.
Students worked in teams and could go up to the SMART Board and select a letter to fit into the
word. The students appeared to really enjoy going to the board and working in teams, and they were
very successful in completing the activity.

On Day Five the students took their final post-assessment (see Appendix D) and then got on the
computers in the library computer lab to continue doing research using the methods they had
learned throughout the interactive IWB lesson.

Data Analysis
When I collected the data I looked at the improvement from pre-assessment to post-assessment
scores. I took the average score for each assessment, both pre and post, and then found the
difference between the post-assessment and pre-assessment averages. In addition, I looked at the
difference between post-assessment and pre-assessment scores for each student and compared that
difference with their responses to the student survey (see Appendix E) to determine whether students
who felt they learned more by using the IWB interactively did in fact score higher on the
post-assessment for the interactive lesson.
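
To make this computation concrete, the short Python sketch below shows one way the pre/post differences and the survey comparison could be carried out. It is a minimal sketch only: the file name (scores.csv) and column names (parts_pre, parts_post, internet_pre, internet_post, felt_learned_more) are hypothetical stand-ins for the actual gradebook and survey export, not part of the study materials.

# Minimal sketch of the analysis described above; file and column names are hypothetical.
import csv
from statistics import mean

with open("scores.csv", newline="") as f:
    rows = list(csv.DictReader(f))

# Per-student gain (post minus pre) for each lesson.
noninteractive_gain = [float(r["parts_post"]) - float(r["parts_pre"]) for r in rows]
interactive_gain = [float(r["internet_post"]) - float(r["internet_pre"]) for r in rows]

print("Mean gain, non-interactive lesson:", round(mean(noninteractive_gain), 2))
print("Mean gain, interactive lesson:", round(mean(interactive_gain), 2))

# Group interactive-lesson gains by each student's survey response to
# "I felt I learned more because I was able to touch the SMART Board".
gains_by_response = {}
for r, gain in zip(rows, interactive_gain):
    gains_by_response.setdefault(r["felt_learned_more"], []).append(gain)

for response, gains in gains_by_response.items():
    print(response, "- mean gain:", round(mean(gains), 2), "(n =", len(gains), ")")

Grouping the per-student gains by survey response is what allows the comparison described above between students' perceived learning and their actual score change.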

Question x Methods Table

Research Question 1: Are there any significant learning impacts when an Interactive Whiteboard is used interactively vs. non-interactively?
  Primary Data Source: Pre-Test/Post-Test Assessment and Data Analysis
  2nd Source: Student Survey
  3rd Source: Literature Review

Research Question 2: Does using an IWB interactively create positive feelings in students?
  Primary Data Source: Student Survey
  2nd Source: Literature Review

Research Question 3: Do positive feelings influence how much information students retain?
  Primary Data Source: Pre/Post Assessment Analysis
Schedule

Last Day of TAKS Testing: April 4, 2011
Non-Interactive Lesson (Pre-Test and Post-Test): April 5, 2011
Non-Interactive Post-Test: April 8, 2011
Interactive Lesson (Pre-Test and Post-Test): April 11, 2011
Student Survey: April 12, 2011
Interactive Lesson Post-Test: April 14, 2011

Ethical/Stakeholder Impacts
The potential positive impacts of this research project's findings are many. Because the findings
show an improved learning impact when IWBs are used interactively, there may be a big push for more
training in designing interactive lessons and getting students involved. The hoped-for result would
be an overall improvement in the school's performance on statewide mandatory testing. In addition,
if interactive IWB use is shown to improve learning, its use could be mandated more widely in
classrooms, which could increase positive feelings toward the school learning environment. More
positive feelings toward the learning environment could, in turn, lead to a longer-lasting, more
impactful learning experience for individual students throughout their time at McKinney ISD.

This research project involves a great deal of student participation, but the way it was conducted
is not far removed from what the students currently do, or have previously done, in their
educational experience. They already use the IWBs more in some classes than in others, and I took
precautions to use best practices in both the interactive and the non-interactive lesson so that
students received the information they were required to have in both cases. This study suggests
that interactive use of IWBs is the better practice. There is some risk, then, that students in the
non-interactive lesson did not have the best experience they could have had, but again, this was
not unlike their current educational experience.

Checks for Rigor


I integrated my research into lessons that this group of students would have done anyway, so the
assessments I gave them were not outside the ordinary realm of their educational experience; this
helped ensure normal participation among the students. Each assessment also counted toward their
achievement points in the class, which further ensured buy-in.
When I analyzed the data I used the pre- and post-assessments to compare the interactive and
non-interactive lessons and see whether either showed a greater increase in the retention of
information. I tried to eliminate all other factors so that I was isolating the variable of
interactivity. Of course, because I am working with humans and humans are fallible, I made note of
any factors I observed that could have skewed the data and have noted those variables in the
discussion section of this report.

I created the assessments to closely match assessments the students had seen before, and they were
multiple choice to ease data collection and analysis. All of the assessments used similar question
types, but the questions, or their wording and order, differed slightly, especially between the
pre-test and post-test.

Findings:

Literature Review:

Many of the articles regarding the use of IWBs in classrooms reflect the same overall theme: the
Interactive Whiteboard is a wonderful tool that has taken up a great portion of technology budgets
but is not particularly effective at increasing student learning. The research indicates that most
of the evidence supporting the continued purchase and use of IWBs in classrooms is anecdotal. Two
major arguments emerged from the literature. The first is that IWBs do benefit and enhance student
learning.

In a Point/Counterpoint article published by ISTE, Jocelyn Johnson states that IWBs facilitate
student-directed learning because they allow the students to manipulate the tool, thus creating
buy-in. She also states that the multimodal nature of IWBs allows for a variety of resources that
are not available without them (Johnson, 2010). She indicates that, because of these features,
students and teachers are better prepared for higher order thinking skills and are able to
collaborate via verbal responses (Johnson, 2010). Many of these claims are based solely on
observation and anecdotal evidence from teachers and students. Education Week's Digital Directions
notes that the research to support the claim that IWBs increase student learning is limited
(Education Week, 2010). In one study, IWBs were used for sight word recognition and literacy with
autistic students. Being able to manipulate the IWB drastically increased the students' ability to
learn their sight words, as demonstrated by four assessments directly following their lessons with
the IWB. It should be noted that, without constant reinforcement, the students were not able to
retain this information, as evidenced by a long-term assessment 35 days after the experiment
(Mechling et al., 2007). In addition, a study was done with kindergartners to determine the
effectiveness of SMART technology on the teacher's ability to assess science recording skills. The
study found that because students were able to manipulate objects on the board, rather than writing
down their observations, they were better able to understand science concepts (Preston & Mowbray,
2008). Marzano conducted a large-scale study of 200 classrooms, published for Promethean, which
found that when IWBs were used for at least 75% of class time and teachers were confident in their
ability to manipulate the tool, student motivation was boosted; motivation was the biggest benefit
of the tool (Marzano & Haystead, 2009). A 2005 literature review of IWBs found that most of the
information available at the time was in the form of surveys, questionnaires, and interviews (Smith
et al., 2005). In these interviews, many of the reports indicated that students were eager to
participate in classroom activities if they were able to manipulate the board (Smith et al., 2005).
There was also, however, a body of evidence suggesting that IWBs are not a solid investment and
that their purchase does not always positively affect student learning.

In the same Point/Counterpoint article, Jim Hirsch argued that no published study has been able to
show that it is the interactivity of the hardware, rather than the projection, that accounts for
increases in student achievement. He argues that a smaller amount of money could be spent to gain
the same results, and that it is not the IWB that engages students and increases learning but the
effectiveness of the teacher in using the tool (Johnson, 2010). Marzano also points to the teacher
as the biggest factor affecting student achievement: teachers who were more familiar with the tool
and had used it repeatedly were getting better achievement results (Marzano & Haystead, 2009). Much
of what IWBs offer goes hand in hand with solid teaching skills. It is not solely the whiteboard
that increases student achievement but the use of best practices when using the IWB in a classroom
setting (Education Week, 2010). Whether you are for or against the use of IWBs, the bottom line is
that there is very limited research indicating that they are a solid investment for schools.

While there is an abundance of literature on the subject of Interactive Whiteboards, there are very
few solid studies that focus on the single variable of interactivity and its impact on student
learning. Much of the literature is anecdotal, stating that teachers and students "feel" a certain
way about IWBs. Even the Marzano study, which showed the effectiveness of IWBs on student learning
at a broad scale, was performed for Promethean, Inc., and therefore carries an inherent bias. The
literature indicates that more research must be done to specifically isolate the interactive nature
of IWBs and measure the effect of that particular function on student learning and achievement.

The action research I have performed begins to fill that gap in the literature, in particular the
lack of direct findings on whether the interactivity of IWBs is the biggest single factor in
increasing student learning. My findings indicate that simply using the IWB as a projector is not
enough to engage students and increase retention. Rather, it is the novelty of interactivity that
engages and motivates students, which in turn allows for better retention.

Question 1: What kinds of learning impacts happen when an Interactive Whiteboard is used interactively
vs. non-interactively?

There was not a significant difference between the average pre-assessment and post-assessment scores
for the non-interactive lesson. In fact, the average score went down from 6.86 to 6.38 (out of 10).

In the pre- and post-assessments for the interactive IWB lesson, the average student score went up by
more than two points, from 4.68 to 6.82 (out of 10), a mean difference of 2.14. This is a notable
increase, especially compared to the mean difference for the non-interactive lesson, which was -.55.
Although the average post-assessment score was not significantly different from the average score for
the non-interactive lesson, the increase from pre-test to post-test is large enough to warrant
discussion.
Table 1: Scores and Findings (mean scores out of 10)
  Parts of a Book Pre-Assessment: 6.93
  Parts of a Book Post-Assessment: 6.4
  Difference: -0.550
  Internet Research Pre-Assessment: 4.68
  Internet Research Post-Assessment: 6.82
  Difference: 2.14

Table 1.2: Responses to "I enjoyed using the SMART Board on this lesson"
  Strongly Agree: 35 | Agree: 34 | Disagree: 8 | Strongly Disagree: 1

Table 1.3: Responses to "I felt I learned more because I was able to touch the SMART Board"
  Strongly Agree: 19 | Agree: 26 | Disagree: 22 | Strongly Disagree: 10

Table 1.4: Responses to "I would like to use the SMART Board more often if I was the one who got to touch it"
  Strongly Agree: 25 | Agree: 25 | Disagree: 19 | Strongly Disagree: 8

Question 2: Does using an IWB interactively create positive feelings about learning in students?

Sixty-nine of the 80 participating students indicated that they Strongly Agreed or Agreed with the
statement "I enjoyed using the SMART Board on this lesson."
[Pie chart: responses to "I enjoyed using the SMART board" – Strongly Agree 45%, Agree 44%, Disagree 10%, Strongly Disagree 1%]

Question 3: Do positive feelings or negative feelings about the use of an IWB interactively have any
correlation with information retention in students?

Of the 29 students whose score increased by 3 points or more from pre-assessment to post-assessment
on the interactive lesson, only one indicated that they did not enjoy interacting with the SMART
Board. However, of the 80 students, only 9 indicated that they did not like using the IWB
interactively, and of those 9 students, only four showed no gain or a decline in score from
pre-assessment to post-assessment.
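
The cross-tabulation behind these counts can be expressed with the same hypothetical data file used in the Data Analysis sketch; the enjoyed_smartboard column name is likewise an assumed stand-in for the survey export rather than an actual field from the study.

# Sketch of the enjoyment-versus-gain cross-tabulation; reuses the hypothetical
# scores.csv and column names from the Data Analysis sketch.
import csv
from collections import Counter

with open("scores.csv", newline="") as f:
    rows = list(csv.DictReader(f))

counts = Counter()
for r in rows:
    gain = float(r["internet_post"]) - float(r["internet_pre"])
    enjoyed = r["enjoyed_smartboard"] in ("Strongly Agree", "Agree")
    counts[(enjoyed, gain >= 3)] += 1

for (enjoyed, big_gain), n in sorted(counts.items()):
    print(f"enjoyed={enjoyed}, gain of 3+ points={big_gain}: {n} students")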

Discussion:

Summary of Findings:
What kinds of learning impacts happen when an Interactive Whiteboard is used interactively vs. non-
interactively?

When I asked the students, in an informal class discussion, whether they enjoyed the non-interactive
lesson, many of them indicated that both the subject and the way it was presented were boring. One
student said she did not feel engaged because all they do all day is write on paper with pencil, and
they just wanted a break from the same old thing. This indicates to me that students were not
engaged in the material. Because the students were not engaged in the presentation or the material,
there was no ownership of the information, and thus the average on the post-assessment was lower
than on the pre-assessment. Because a majority of the students had answered certain questions on the
pre-assessment correctly, I chose to omit those questions from the post-test and gave them new
questions based on the information covered in class. The students were not given a chance to review
their pre-assessment scores before they were given the post-assessment.

Another possible explanation for the lack of improvement from pre-assessment to post-assessment is
that I was a new face, and my teaching style and classroom management differ from those of their
everyday teacher. I do not normally teach in a fifth-grade classroom, and the management skills I
learned in the high school classroom did not serve me well in managing a class of 10- and
11-year-olds. Because students were distracted both by a new teacher and by the urge to test my
boundaries, this could have affected their ability to learn the information from the non-interactive
lesson. That said, this lesson would not have been presented by their everyday teacher anyway; it is
a special unit typically taught by the Library and Media Specialist, and the students may have had a
similar reaction even though they know the LMS teacher and are familiar with her.

The scores for the interactive IWB lesson were more striking: they indicate that using the IWB
interactively, rather than non-interactively, does have a positive impact on student learning.

Other variables that may have played a role in the increased average include the fact that the
interactive lesson came later in the week, when students had a better handle on how I teach and what
my classroom expectations are. Student behavior improved over the course of the five days, and my
ability to manage the class also adapted.

In addition, the post-test for the interactive lesson had far fewer changes to its format and
questions, because scores on the pre-test were so low. The smaller number of changes could have
affected how well the students did on the post-test.

Does using an IWB interactively create positive feelings about learning in students?

The student survey indicated that a majority of students enjoyed using the IWB for the lesson. Of
those who strongly agreed or agreed, an overwhelming majority indicated that they enjoyed using the
IWB for the interactive lesson because it was "fun." The rationale of those who disagreed or
strongly disagreed varied widely; some of them did not seem to understand the Likert scale, and a
few indicated that they did not believe they actually had to manipulate the IWB themselves in order
to learn from its use. A smaller number of students who agreed or strongly agreed indicated that
they learned better when they were able to manipulate the information on the board, and that it was
actually getting out of their seat and touching the IWB that helped them learn.
Do positive feelings or negative feelings about the use of an IWB interactively have any correlation with
information retention in students?

The findings indicate that positive feelings about the use of the IWB do not have a statistically
significant correlation with information retention. So although students indicated that they enjoyed
using the IWB, not all students had a significant increase in scores from pre-assessment to
post-assessment.

Because there is no significant correlation between positive feelings about using the IWB and
information retention, I am led to believe that the biggest impact of IWBs on student learning is
motivation. Students find that the novelty of using the IWB keeps them motivated.

Recommendations for Action:


There are several recommendations that should be considered to improve the use of IWBs in classrooms
at Vega Elementary. The first is to distribute these findings and highlight the areas of most
interest, namely that using the IWB interactively is likely to increase student learning. These
findings give teachers a data-driven case that using the IWBs interactively will aid their students'
learning and that the burden of finding and using interactive lessons for the IWB does in fact make
a difference.

The second recommendation is to offer more training to teachers on how to use the IWB interactively
in their classrooms. Since so much money was spent to put an IWB in every classroom, and the
findings indicate that using them interactively increases student learning, there should not be a
single teacher in the school who is not using their IWB interactively at least once per unit, if not
more. After additional training is offered, there should be no excuse for using the IWB simply as a
projection device. Students should be able to manipulate the board as often as possible, but not so
often that they no longer see it as a novelty, since motivation appears to be the key factor linking
IWB use and increased learning.

The third recommendation is to create a collaborative work area on the web where teachers at Vega
can share the interactive IWB lessons they have created or used. Sharing the lessons themselves,
along with resources on where to find lessons, ensures that no teacher is stuck creating something
from scratch unless they want to. The hardest part of implementing new ideas is creating the lesson;
if lessons already exist, there is no need to reinvent the wheel, and students get more
opportunities to learn this way.

Next Steps:
Once these recommendations have been carried out, the next step may be to conduct a follow-up
inquiry with many classrooms across multiple grade levels to see whether the results of this inquiry
can be repeated. There may also be an interesting inquiry into the novelty of interactivity: if
motivation is driven by novelty, then using the IWB interactively all the time may actually decrease
motivation over time. A logical next question is how often an IWB should be used interactively in
order to maximize motivation, and therefore retention.
Reflection/Conclusion:
In conducting this research I put to rest my concern that schools and districts had spent a great
deal of money on a technology that was not particularly useful. I now better understand that IWBs,
when used interactively, really do have a positive impact on student learning. After doing the
literature review I felt very cynical about the purchase of IWBs and their use in classrooms. After
completing this research, I have learned the importance of using the tool to its full capacity by
using it interactively. There is a lot to be said for ensuring that teachers are aware of these
findings and that they use their IWBs interactively as often as possible, so that students are given
the best opportunity to learn and retain information. I still believe that interactivity can be
achieved through less expensive purchases for those schools and districts that cannot afford an IWB.
However, for those classrooms that already have them installed, it is imperative that they be used
interactively.
Appendices:
Appendix A: Non-Interactive Pre-Test

Parts of a Book Pre-Test          Name______________________

1. Where is the table of contents located in a book?


a. At the Beginning
b. On the Cover
c. In the Back
d. In the Middle between chapters 3 and 4
2. What information does a table of contents tell you?
a. Definitions for Keywords
b. Where to find Specific information
c. Where to find broad topics or chapters
d. Information that is not directly related to the whole book
3. Information on the copyright page allows you to create what?
a. A copy of the book
b. A citation
c. A research paper
d. An Appendix
Using the image below, answer questions 4 and 5.
4. In which chapter would you find information about how frogs live together?
a. Chapter 1
b. Chapter 4
c. Chapter 5
d. Chapter 8

5. On what page would you start looking if you were trying to figure out how frogs are
important?
a. The Cover
b. Page 68
c. Page 131
d. Page 15

6. What part of the book would you use if you needed to find the definition of a
Keyword?
a. Index
b. Appendix
c. Glossary
d. Table of Contents

Match the following parts of a book with their functions:

7. Index
8. Appendix
9. Glossary
10. Table of Contents

a. Defines keywords found throughout the book
b. Tells where specific information can be found throughout the book
c. Tells where broad information and the chapter beginnings can be found
d. Gives more information about the subject but doesn't fit in with the rest of the book well
Appendix B: Non-Interactive Post Test

Parts of a Book Post-Test          Name__________________________________

1. What information does a table of contents tell you?


a. Where to find Broad Information
b. Where to find Specific information
c. Definitions for Keywords
d. Additional Information that does not fit into the body of the book
2. Information on the copyright page allows you to create what?
a. A copy of the book
b. A citation
c. A research paper
d. An Appendix
3. What does a citation tell the reader of our research?
a. How much information we found
b. Where we found our information
c. Who wrote our paper for us
d. That we can copyright
Match the following parts of a book with their functions:

4. Index
5. Appendix
6. Glossary
7. Table of Contents

a. Tells where broad information and the chapter beginnings can be found
b. Tells where specific information can be found throughout the book
c. Defines keywords found throughout the book
d. Gives more information about the subject but doesn't fit in with the rest of the book well

Use the following illustration to answer question 8.

8. If you are looking for information about how frogs swim, what page would you look at?
a. 6
b. 24
c. 14
d. 17

9. If we are looking for additional information that does not fit in the body of the book, in what
part of the book would we find it?
a. Glossary
b. Appendix
c. Table of Contents
d. Index

10. The Glossary tells us what kind of information?


a. Additional Information that doesnʼt fit in the body of the book
b. Where to find broad information
c. Where to find specific information in the book
d. Definitions of keywords that are found in the book
Appendix C: Interactive Pre-Test

Internet Research Pre-Test          Name _____________________

1. Who writes internet sites?


a. Adults
b. Kids
c. Teachers
d. Anyone
2. When you use an Internet site, what piece of information should you know?
a. If the person who wrote it is nice
b. How old the person who wrote it was
c. How many videos are on the site
d. Who wrote it
3. What does .com stand for?
a. Commercial
b. Company
c. Cancer Organization
d. Commander
4. How do you recognize bias on an Internet site?
a. Information can be cross referenced across many sites
b. The website is trying to sell you something
c. The website has many authors who are experts
d. The website was created for educational use
5. When you do research on the Internet, what three things should you do?
a. Explore, Copy, Write
b. Search, Scribe, Cite
c. Search, Find, Copy
d. Find, Answer, Cite
6. What is a search engine?
a. A website for finding accurate information
b. A tool that collects information based on your keywords
c. A tool that does your research for you
d. A website that you can write questions and answers on
7. What is a database?
a. A search engine that is unreliable
b. A wiki in which lots of people contribute to the information
c. A reliable internet reference source that may require a subscription
d. A computer that holds lots of numbers and information
8. What is Plagiarism?
a. When you use your own ideas in somebody elseʼs work
b. When you use information in your research and you give credit to where you got
that information
c. When you use information in your research and you donʼt give any credit to where
you found that information
d. When you give people computer viruses through email
9. Where can you find accurate internet reference sources here at school?
a. Search Engines
b. Vega Elementaryʼs Website
c. Your Blog
d. Teacherʼs wiki
10. Why is a search engine NOT a good place to start your research?
a. There is too much information
b. They have fancy graphics
c. They are trying to sell you something
d. They are run by the government

Appendix D: Interactive Post Test

Internet Research Post-Test          Name _____________________

1. Why is a search engine NOT a good place to start your research?


a. There is too much information
b. They have fancy graphics
c. They are trying to sell you something
d. They are run by the government
2. When you use an Internet site, what piece of information should you know?
a. If the person who wrote it is nice
b. How old the person who wrote it was
c. How many videos are on the site
d. Who wrote it
3. What does .com stand for?
a. Commercial
b. Company
c. Cancer Organization
d. Commander
4. Who writes internet sites?
a. Adults
b. Kids
c. Teachers
d. Anyone
5. When you do research on the Internet, what three things should you do?
a. Explore, Copy, Write
b. Search, Scribe, Cite
c. Search, Find, Copy
d. Find, Answer, Cite
6. What is a search engine?
a. A website for finding accurate information
b. A tool that collects information based on your keywords
c. A tool that does your research for you
d. A website that you can write questions and answers on
7. What is Plagiarism?
a. When you use your own ideas in somebody elseʼs work
b. When you use information in your research and you give credit to where you got
that information
c. When you use information in your research and you donʼt give any credit to
where you found that information
d. When you give people computer viruses through email
8. How do you recognize bias on an Internet site?
a. Information can be cross referenced across many sites
b. The website is trying to sell you something
c. The website has many authors who are experts
d. The website was created for educational use
9. What is a database?
a. A search engine that is unreliable
b. A wiki in which lots of people contribute to the information
c. A reliable internet reference source that may require a subscription
d. A computer that holds lots of numbers and information
10. Where can you find accurate internet reference sources here at school?
a. Search Engines
b. Vega Elementaryʼs Website
c. Your Blog
d. Teacherʼs wiki

Appendix E: Student Survey

Circle the option that best fits how you feel about each of the following statements:

I enjoyed using the SMART Board in this lesson

Strongly Agree -----------------Agree------------------Disagree-----------------Strongly Disagree

Why?

I felt I learned more in this lesson because I was able to touch the SMART board

Strongly Agree -----------------Agree------------------Disagree-----------------Strongly Disagree

Why?
I would like to use the SMART board more often if I could be the one touching it

Strongly Agree -----------------Agree------------------Disagree-----------------Strongly Disagree

References:

Education Week's Digital Directions. (2010, January 8). Whiteboards' impact on teaching seen as
uneven. Education Week. Retrieved March 13, 2011, from
http://www.edweek.org/dd/articles/2010/01/08/02whiteboards.h03.html

Johnson, J., & Hirsch, J. (2010, June 15). Point/Counterpoint: Are interactive whiteboards worth the
investment? Learning & Leading, International Society for Technology in Education. Retrieved
March 13, 2011, from http://www.iste.org/learn/publications/learning-and-leading/issues/
Point_Counterpoint_Are_Interactive_Whiteboards_Worth_the_Investment.aspx

Marzano, R., & Haystead, M. (2009, May 1). Evaluation study of the effects of Promethean
ActivClassroom on student achievement. Promethean World. Retrieved March 13, 2011, from
www.prometheanworld.com/upload/pdf/Preliminary_Report_on_ActivClassroom_20090417112310.pdf

Mechling, L., Gast, D., & Krups, K. (2007). Impact of SMART Board technology: An investigation of
sight word reading and observational learning. Journal of Autism and Developmental Disorders,
37, 1869-1882.

Preston, C., & Mowbray, L. (2008, June 1). Use of SMART Boards for teaching, learning and assessment
in kindergarten science. Teaching Science, 54, 50-53.

Smith, H., Higgins, S., Wall, K., & Miller, J. (2005). Interactive whiteboards: Boon or bandwagon? A
critical review of the literature. Journal of Computer Assisted Learning, 21, 91-101.
