
The Texas Science Teacher
Volume 40, Number 2, November 2011

Changing Instructional Practice Through Coaching in the Beginning Teacher Induction and Mentoring

Notable High School Chemistry Concepts Not Mastered Prior to Entering General Chemistry

Using a Force Meter to Measure an Object's Mass: A Potential Misconception

Using Science Teaching Case Narratives to Assess the Effectiveness of a Scientific Inquiry Elementary Science Methods Course with Hispanic Preservice Elementary Teachers

Official Publication of the Science Teachers Association of Texas






How can I motivate my students to love science?

Teachers submitting the most team projects win a Toshiba Tablet!

The Science of A-ha!


ExploraVision, the world's largest K-12 science competition, offers teams of students an opportunity to create and explore their visions of future technologies. Up to $240,000 in savings bonds is awarded each year, plus expense-paid trips to Washington, DC, for national winning students and their parents. Schools, coaches, and mentors win too! Your students can reach that incredible A-ha! moment when all their real-world learning comes together in problem-solving, critical thinking, collaboration, and recognition.

Celebrating 20 years of ExploraVision
Through Toshiba's shared mission partnership with NSTA, the Toshiba/NSTA ExploraVision competition makes a vital contribution to the educational community.

Visit www.exploravision.org/texasscience for details.

1-800-EXPLOR-9 exploravision@nsta.org

www.Facebook.com/ToshibaInnovation

@ToshibaInnovate


TST1110

Succeed with Science


Our science solutions integrate the right web-delivered curriculum, the right teacher tools, and the instructional expertise of the best implementation, PD, and support team in the business. We can help you overcome the major challenges facing science educators today:
- Leverage your existing resources
- Identify new resources, many of them free
- Provide focused, effective professional development (PD)
- Motivate, engage, and excite students
- Improve science test scores
- Create a truly integrated, project-based learning program

Contact me today to get going!


Jim Wheat jwheat@learning.com 512-913-5765


The Texas Science Teacher


Volume 40, Number 2, November 2011

Contents

Changing Instructional Practice


by Terry Talley

Notable High School Chemistry Concepts


by Anna B. George and Diana Mason

Using a Force Meter to Measure an Object's Mass


by Andrzej Sokolowski

Using Science Teaching Case Narratives


by Ron and Amy Wagler

Cover Photo: A Potential Horizon. All Rights Reserved. Image Credit: Ismael Ramon, student at Palo Duro High School.
The Texas Science Teacher, official journal of the Science Teachers Association of Texas, is published semiannually in April and October. Enumeration of each volume begins with the April issue. Editorial contents are copyrighted. All material appearing in The Texas Science Teacher (including editorials, articles, letters, etc.) reflects the views of the author(s) and/or advertisers, and does not necessarily reflect the views of the Science Teachers Association of Texas (STAT) or its Board of Directors. Announcements and advertisements for products published in this journal do not imply endorsement by the Science Teachers Association of Texas. STAT reserves the right to refuse any announcement or advertisement that appears to be in conflict with the mission or positions of the Science Teachers Association of Texas. Permission is granted by STAT for libraries and other users to make single reproductions of The Texas Science Teacher for their personal, noncommercial, or internal use. Authors are granted unlimited noncommercial use. This permission does not extend to any commercial, advertising, promotional, or any other work, including new collective work, which may reasonably be considered to generate a profit. For more information regarding permissions, contact the Editor: jpalmer59@gmail.com


Changing Instructional Practice through Coaching in the Beginning Teacher Induction and Mentoring
by Dr. Terry Talley
Mentoring Science Teachers in the Galveston County Regional Collaborative

Teaching is possibly the only profession that tries to give the impression that all who enter the classroom know all instructional best practices and can handle any situation starting on day one. It is only after several years of trial and error that the novice teacher learns to appreciate the collaborative gestures of her peers and learns to ask for ideas when she does not have the knowledge, skills, or resources needed. The Texas Regional Collaborative (TRC) offered a grant funded by the Texas Education Agency (TEA) to establish Beginning Teacher Induction and Mentoring (BTIM) programs through the Regional Collaboratives. The grant provided training through Mentoring Texas in using research-based practices. The grant began in October 2009 and follows new science teachers through their first two years in the classroom, with the grant period ending in April 2011. Although the BTIM programs throughout Texas have different settings and address novice teachers from various programs, the underlying premise is the same: providing academic coaching and supportive relationships. Most importantly, this model includes a professional, collegial relationship that welcomes and bolsters a self-doubting and often isolated neophyte into the world of teaching.

Rationale for BTIM Program Mentor/Coaches

A 2003 meta-analysis by the Rand Corporation found that teachers in the fields of science and mathematics were more likely to leave teaching than teachers in other fields. Summarizing research on in-service policies that affect teacher retention, the Rand study also stated that schools that provided mentoring and induction programs, particularly those related to collegial support, had lower rates of turnover among beginning teachers; that schools that provided teachers with more autonomy and administrative support had lower levels of teacher attrition and migration; and that schools with fewer disciplinary problems, or those that gave teachers discretion over setting disciplinary policies, had lower levels of teacher attrition and dissatisfaction (Rand, 2003). The Rand research (2003) went on to state that schools with high percentages of minority students are difficult to staff, and that teachers tend to leave these schools when more attractive opportunities present themselves. It is also evident, however, that factors that can be altered through policy can have an impact on the decisions of individuals to enter teaching and on teachers' decisions to migrate to other schools or quit teaching. The Rand research (2003) also offers information on the effectiveness of a number of different options in the areas of compensation, pre-service policies, and in-service policies, although rigorous research evaluating the latter two types of policies is relatively scarce. The data used in the Rand study are from the nationally representative 1999-2000 Schools and Staffing Survey. The results indicate that beginning teachers who were provided with mentors from the same subject field and who participated in collective induction activities, such as planning and collaboration with other teachers, were less likely to move to other schools and less likely to leave the teaching occupation after their first year of teaching (Rand, 2003).




The training provided by the Texas Regional Collaborative is based on the research of the Professional Development Group in Birmingham, Alabama. For the training, two books by Paula Rutherford were provided: Why Didn't I Learn This in College? Teaching and Learning in the 21st Century (2009) and The 21st Century Mentor's Handbook: Creating a Culture for Learning (2005). In establishing a rationale for the BTIM program, a quote from the foreword of Rutherford's 2009 book gives the TRC-BTIM training a clear focus. The quote is from Frank McDonald's A Study of Induction Programs for Beginning Teachers:
It is a truism among teachers, and especially teacher educators, that within the first six months of the first experience of teaching, the teacher will have adopted his or her basic teaching style. Experience indicates that once a teacher's basic teaching style has stabilized, it remains in that form until some other event causes a change, and at the present time, there are not many such events producing change. If the style adopted is a highly effective one and is the source of stimulation to continuous growth, there would be no problem. But if teachers abandon their ideals and become cynical, see management at any price as essential, constrict the range of instructional alternatives they will try or use; if they become mediocre teachers or minimally competent, then the effect of the transition period on this is a major concern and a problem that needs direct attention. (McDonald, 1980)

The Components of the BTIM Program: A Three-Tiered Approach

Professional Learning Communities for Collegial Support

The first component is providing for professional discourse in a structured setting with specific outcomes and goals in mind. The first structure incorporated into the BTIM was the Professional Learning Community (PLC). Meeting monthly as a community of learners, the BTIM teachers gathered to learn more, reflect on successes and struggles, and share resources centered on a common learning theme. Further discourse was encouraged and facilitated through the TOLC (Texas Online Learning Community) site for professional discussion and the posting of resources for sharing. PLC topics included:
- Using the Walls as Instructional Tools
- Misconceptions that Interfere with Learning Science
- Questions, Wait Time, and Classroom Discussions
- Inquiry, Labs, Data Tables, Graphs, and Charts
- Science Literacy and Notebooks
- Using Models in Science and Moving Learning from Concrete to Abstract

Follow-up discussions on the Texas Regional Collaborative TOLC site (www.theTRC.org) were established for the GCRC-BTIM for after-hours collaboration and sharing of resources among the teachers in the program.

Campus and Classroom Interactions

The second component, Campus and Classroom Interactions, includes observations both scheduled and unscheduled, coaching, and providing resources, as well as offering assistance through model teaching, co-teaching, lesson planning, and listening. Classroom Walk-Through Visits (CWT) are based on the model in Carolyn Downey's book, The Three-Minute Classroom Walk-Through: Changing School Supervisory Practice One Teacher at a Time (2004), in which the mentor visits a classroom for a short period of time, sitting in the back of the room to observe how the students respond to the teacher's planned lesson for the day. Often, students would share what they were learning or involve the observer in a lab they were doing.



During these observations the mentor/coach would look for artifacts of learning: student work, student engagement with the lesson, journaling, work and words on the walls, posters students constructed, and models about the room.

Data Collection Observations are also campus interactions, each encompassing an entire science class period. These monthly scheduled observations include the collection of data concerning student engagement throughout a lesson as well as the interactions between the teacher and students in the room.

Coaching Sessions are 30 minutes in length and are scheduled monthly during a teacher's planning period the week after a scheduled observation. The focus of the session is to share the data collected during the scheduled observation concerning student engagement and teacher interactions with students. The session ends with the determination of which data are to be gathered during the next scheduled observation, and the date and time for that observation is placed on the calendar.

Planning, assisting, and modeling lessons occur during a one-hour visit. The mentee decides which activity the mentor is to do. The mentor may be asked to assist with a lab, or to model a lesson so the mentee can watch the flow or pacing. Within the same session, student and materials management could be addressed. Many mentees request assistance in planning a future lesson or a unit of study that incorporates resources and ideas the mentor has provided in previous sessions; the mentor may also be asked to help locate appropriate resources, or to assist in differentiating a lesson as a Response to Intervention (RTI) for a special needs student or for meeting the English Language Proficiency Standards (ELPS) for an English Language Learner.

Professional Development for Content Knowledge

The third component is Professional Content Learning. Often first- and second-year science teachers come to the classroom with a general understanding of their grade-level content, but gain self-confidence from an opportunity to learn more specific and detailed content prior to instruction. Well-researched and standards-based science content is easily accessed through the online NSTA resources provided to all BTIM participants. Sustained learning opportunities are offered through many avenues, such as the BTIM three-day Best Practices in Science Mini-Conference, which provides an in-depth study of the BSCS 5E Lesson Model (BSCS, 2006), an infusion of the high-yield strategies discussed by Marzano, Pickering, and Pollock in their meta-analysis Classroom Instruction that Works (2001), and student-based technology such as force and motion probes and computer simulations. Another sustained learning program for the BTIM participants is free access to the summer professional development offered by the Galveston County Regional Collaborative (GCRC) through the UTMB Office of Educational Outreach and the Southeast Regional T-STEM (SRT-STEM) Center. The GCRC sponsored a three-day Introduction to Inquiry Institute based on the training from the Exploratorium Museum's Institute for Inquiry. The SRT-STEM offered a two-day Lego Robotics Academy and many other T-STEM biotechnology opportunities.

Observing for Implementation of Best Practices




Based on the research on instructional practices that yield high levels of student achievement, reported as effect size (Marzano, 2001), data were collected during observations throughout the year from January to May, using an adapted observation checklist designed by the Charles A. Dana Center for their Instructional Leadership Academies (2009) and the Downey Classroom Walk-Through Protocol (Downey, 2004). Using these tools, the collected data provide opportunities to analyze implementation of the instructional practices and the ways teachers modified instructional materials as discussed during coaching sessions and as part of the Professional Learning Communities. To determine the effectiveness of coaching and professional learning communities (PLC) as key factors in a mentoring program, before and after observations will be compared to determine the levels of implementation of the key components of effective instruction as identified by the two observation checklists. The focus of the observation protocol was the collection of data in four main areas, which reveal teacher growth toward a transformed classroom where the student, rather than the teacher, is the focus of instruction:
- Focus on the Curriculum: Were the instructional goals and state standards noted by the teacher, evident to students, and on grade level?
- Focus on Instruction: Did the lesson plan incorporate high-yield strategies that were student-centered?
- Focus on the Student: Were the students engaged in meaningful work that was cognitively appropriate?
- Focus on the Environment: Was the classroom environment set up for student success with meaningful artifacts and structures?

Focus on Curriculum

In a comparison of the initial observation in January and the final observation in May, there was only a slight change in the number of teachers who posted their objectives, based on district requirements that daily learning objectives be presented to the students. These data do not reflect a general change in practice among BTIM teachers, but rather whether teachers posted daily objectives in the board area labeled as the agenda as opposed to including them only in oral introductory routines for classes. Alignment of objectives to grade-level objectives improved when focus was brought to the level of the verbs in the objectives. See Figure 6.
Figure 6. Focus on Curriculum: the objective of the lesson (1a objective known, 1b evident to students, 1c on grade level), January vs. May.

Focus on Instruction

Initial observation data indicate an overdependence on PowerPoint-based lectures, packaged computer software for instruction, and teacher questions to check for understanding. In a majority of classrooms the most frequently observed model of instruction was direct instruction. Most often this model displayed an absence of student engagement; students were not given time to discuss the content or make meaning of new knowledge, and teachers did not provide time for closure to the lesson.




Teacher-asked questions were the dominant engagement activity: one teacher question to one student answer. There was little or no discussion. Oftentimes the teacher would begin directly with the lecture without engaging prior knowledge or creating student interest in the subject matter. In the data table identified as Figure 7, the data show the planned instructional strategies of the 20 secondary BTIM mentees in January compared to May. There is a decrease in the use of lectures and an increase in discussions, modeling, and providing opportunities for students to practice with the information through hands-on experiences, student-to-student discussions, and teacher coaching with facilitating questions. One area of concern that materialized was a major decrease in the use of feedback at the end of the lesson based on the daily objectives, although an increase did occur in recognition of effort to increase student motivation. Teachers appeared to be rushed at the end of the class period and sacrificed closure and feedback on the lesson objective in favor of more class time for the activities.

Figure 7.

Another aspect of lesson design is the way the teacher plans for the students to interact with each other and with the materials. Based on a comparison of the observations in January and May, as seen in Figure 8, there was a decrease in the selection of whole-group activities and greater use of activities in smaller groups and pairs.
Figure 8. Planning for Instruction: grouping format (whole group, small group, paired, individual); number of classrooms, January vs. May.




As part of the lesson planning process, teachers make instructional decisions based on knowledge of research-based instructional practices. As teachers gain experience in lesson planning and make conscious decisions to select more effective strategies, a greater number of instructional strategies is incorporated into the lesson design and a greater variety is employed throughout the lesson. As seen in Figure 9 below, among the 20 secondary BTIM teachers observed, there was a decrease in the use of only scaffolded advance organizers, such as note-taking worksheets, and an increase in the use of a variety of other meaningful strategies, such as nonlinguistic representations, summarizing and note taking, and identifying similarities and differences, being incorporated throughout the lesson. It is important to note that with the increase in lab activities came an increase in the generating and testing of hypotheses. For both observations, every strategy successfully incorporated into a lesson was recorded, so the number of strategies used totals more than the twenty secondary teachers represented in the study.

Figure 9. Planning for Instruction: high-yield strategies (none; questions, cues, and advance organizers; generating hypotheses; setting objectives/providing feedback; cooperative learning; nonlinguistic representations; homework and practice; reinforcing effort and recognition; summarizing/note taking; similarities and differences); number of classrooms, January vs. May.

Focus on the Learner

The third area of focus in the observation is on the student and what the student does during the lesson. The primary student activities changed dramatically from the initial observation in January to the final observation in May. From the data represented in Figure 10, there is a marked decrease in the time the teacher spends speaking (lecturing and giving directions) with the students listening, and an increase in the time in which students are working with hands-on materials, speaking and listening to each other concerning their learning. It appears that later in the year the students spend a more balanced amount of time speaking, listening, writing, and working with hands-on materials.




Figure 10. Focus on the Learner: student actions (listening, reading, speaking, writing, hands-on materials, none), January vs. May.

Figure 11. Focus on the Learner: instructional materials (worksheets, published print materials, textbook, lab/activity sheets, manipulatives, real-world objects, student-created materials, OH/board/flip chart, computers, websites, hand-held technology, video, oral, none), January vs. May.

Another aspect of focusing on the student is looking at what the student is given to use for instructional materials. In the data comparison represented by Figure 11, there was a marked change in the types of instructional materials prepared for students in May compared to January. There was a decrease in the use of worksheets and published print materials and an increase in the use of real-world materials, student-created materials, and lab activity sheets. This change followed several coaching sessions and several Professional Learning Community (PLC) discussions, where attention was given to the quality and depth of teacher expectations and student products during the passage of the school year.

The use of well-designed, student-centered instructional materials became more evident in the BTIM teachers' classrooms in May compared to January. Teachers were learning how to ask questions requiring more cognitive processing, and this change shows as higher cognitive rigor in the student work. Figure 12 compares observation data based on the highest level of Bloom's taxonomy encountered during the observed lessons and student work in January and May. There is a significant difference between the student-created materials and the worksheets provided by textbook ancillary materials, such as guided reading workbooks or blackline masters downloaded from the Internet; these materials did not reach the rigor of the TEKS standards. As teachers became more sophisticated in their understanding and selection of instructional materials, students became more engaged in the learning process. Student collaboration became common, and student-created graphic organizers and folding organizers were found in student journals.

Figure 12. Focus on the Learner: levels of student work, by the highest level of Bloom's taxonomy observed (knowledge-recall, comprehension, application, analysis, synthesis, evaluation); number of classrooms, January vs. May.



In addition to preparing student work that is respectful of their time and challenging to their intellect, the teacher is mindful that the students should be able to become fully engaged in the lesson and the materials. Based on the Dana Center checklist (2009), the criteria for engagement are as follows:
- Highly engaged: most students are authentically engaged.
- Well managed: students are willingly compliant, ritually engaged.
- Disengaged: many students actively reject the assigned task or substitute another activity. (Charles A. Dana Center, Window on the Classroom, 2009)

Classroom observations regarding the levels of student engagement, conducted in January and May, reveal significant changes. In conjunction with the changed lesson format, engaging instructional materials, and structures for student interactions, Figure 13 represents data that reveal an overall higher level of student engagement compared to the well-managed or disengaged classrooms earlier in the school year.
Figure 13. Focus on the Learner: level of class engagement (highly engaged, well managed, disengaged); number of classrooms, January vs. May.

Focusing on the Learning Environment

The final focus of the observations included the learning environment. The arrangement of classroom materials and desks, as well as the items on the walls, plays a major role in setting the stage for learning and for supporting retention of learning for greater gains in achievement. Based on the data collected during the two compared observations, and as displayed in Figure 14, there is a marked increase in the use of the walls as an important part of the learning environment as well as for the display of exemplars, models, and student work. Posting directions for expected routines, protocols, and behavior became more evident in the BTIM teachers' classrooms as they began to discuss the advantages of these reminders in PLC sessions as well as in online discussions.





Figure 14. Focus on the Classroom Environment; number of classrooms, January vs. May.

Upon analysis of observation data after five months in the BTIM program, which included five PLC sessions and three coaching sessions, the comparative data reveal an overall significant change in the quality of planning for instruction, with the inclusion of a greater variety of high-yield instructional strategies, in preparing and using high-quality instructional materials, and in providing a classroom environment where learning is evident and supports student achievement. The data collected through observation protocols were focused in three main areas: 1) Instruction: instructional practices, grouping format, and instructional strategies; 2) Learner: student actions, instructional materials, and levels of student work; and 3) Environment: the walls, desk arrangements, and support materials. The conclusions that can be drawn by comparing the data from the initial observation in January and the final observation in May lead to an understanding of the potential that structures such as Professional Learning Communities (PLC) and coaching can have toward impacting classroom practice, especially among first- and second-year science teachers.

Conclusions Based on the First Year of BTIM

As indicated in the research by Ingersoll and Smith (2001, 2003) and the Rand Corporation (2003), there is a need for administrative support for beginning science teachers. Administrative support was gained through the letters of support provided by the school districts the GCRC-BTIM is serving within Galveston County. BTIM mentors received support and encouragement from the campus and district administration as the program mentors continued to visit teachers in their classrooms, provide resources, facilitate PLC meetings, and provide additional professional development through the Regional Collaboratives. As we visited teachers regularly, we also met with campus administrators to keep the lines of communication open. As we evaluate the successes and missteps from our first year and begin the new school year, we have had requests from the administration of these districts to continue and expand the support we are providing. At a time of diminishing budgets, grant-funded projects are prized and utilized. There are strong indicators of the effectiveness of the BTIM program, such as nearly perfect attendance at each of the PLC meetings, mentee requests for more frequent visits, district personnel requests for service to more novice teachers, and the data analysis comparing observations in January and May.




The next step in the evaluation process will be an analysis of achievement gains on state assessments such as the TAKS, STAAR, and EOC exams for students taught by teachers in mentoring programs such as BTIM, compared to students whose teachers are mentored in other programs and those whose teachers are not mentored at all. We would also like to evaluate new-teacher retention based on teacher participation in a mentoring program such as the BTIM compared to other mentoring programs and to no mentoring at all. Finally, a follow-up comparison will be conducted to determine whether there is continued improvement in teacher effectiveness as novice teachers begin their second year of teaching while being supported by a mentoring program, compared to those who continue teaching without mentoring support.

Terry Talley, Ed.D., earned her doctorate in Curriculum and Instruction from the University of North Texas. She recently retired after 25 years in public education from Lewisville ISD, where she served as Secondary Science Supervisor. Terry is past president of the Texas Science Education Leadership Association (TSELA) and a member of the Science Teachers Association of Texas (STAT) and the National Science Teachers Association (NSTA). Dr. Talley lives on Galveston Island, where she is active in the education community by consulting, serving as Project Manager and mentor for the BTIM Program, and working part-time as the Co-Director of the SRT-STEM Center, both sponsored by the UTMB Office of Educational Outreach.

Resources:
ACT. (2007). Rigor at Risk: Reaffirming Quality in the High School Core Curriculum. Iowa City, IA: ACT.

Bybee, Rodger W., Taylor, Joseph A., Gardner, April, Van Scotter, Pamela, Powell, Carlson, Westbrook, Anne, and Landes, Nancy. (2006). The BSCS 5E Instructional Model: Origins, Effectiveness, and Applications. Colorado Springs, CO: BSCS.

Charles A. Dana Center. (2010). Instructional Leadership. Austin, TX: The Charles A. Dana Center.

Downey, Carolyn J., Steffy, Betty E., English, Fenwick W., Frase, Larry E., and Poston, Jr., William K. (2004). The Three-Minute Classroom Walk-Through: Changing School Supervisory Practice One Teacher at a Time. Thousand Oaks, CA: Corwin Press.

Ingersoll, Richard M. (2001). Teacher turnover and teacher shortages: An organizational analysis. American Educational Research Journal, 38(3), 499. Retrieved July 17, 2010, from ABI/INFORM Global.

Ingersoll, Richard M., and Smith, Thomas M. (2003). The wrong solution to the teacher shortage. Educational Leadership, 60(8), 30-33. Retrieved July 17, 2010, from EBSCO (AN 9722710).

Marzano, Robert J., Pickering, Debra J., and Pollock, Jane E. (2001). Classroom Instruction that Works: Research-Based Strategies for Increasing Student Achievement. Alexandria, VA: ASCD.

Rutherford, Paula. (2009). Why Didn't I Learn This in College? Teaching and Learning in the 21st Century. Alexandria, VA: Just Ask Publications.



Science....Blast Off!

STAAR Aligned

Scopes
Fun and Complete!

Elementary: K-5; Middle School: 6-8

Comprehensive online student learning experiences, tightly aligned to TEKS and STAAR, that enhance teaching for beginning through master teachers with hands-on science activities using the research-based 5E lesson cycle.

Approved resource for grades 5-8 in the Texas Supplemental Science Adoption!
- Hundreds of evaluation and intervention tools
- 5E lessons for all STAAR Science Readiness and Supporting Standards, K-8

To see for FREE all that STEMscopes has to offer and to view sample scopes for each grade level:

STEMscopes
Rice University 713-348-5433 STEMscopes@rice.edu

1. Go to: sample.stemscopes.com
2. Enter ID: guest & Password: guest
3. Click on Scopes and explore!

Center for Technology in Teaching and Learning

www.stemscopes.com



Notable High School Chemistry Concepts Not Mastered Prior to Entering General Chemistry
by Anna B. George and Diana Mason
Abstract ith the advent of the end-ofcourse (EOC) State of Texas Assessment of Academic Readiness (STAAR) exams in chemistry, it is necessary to hone in on specific topics that need targeted attention. In this study 286 postsecondary students enrolled at a large north Texas public university were evaluated as to their retention of typical first semester general chemistry concepts using the nationally recognized American Chemical Society (ACS) California Chemistry Diagnostic Exam 1997 (CA Dx). The five most common misconceptions held by these general chemistry students were identified as: bond polarity, use of significant figures in laboratory procedures, Lewis dot structures, nomenclature, and algebraic relationships in gas laws. In addition, possible sources of these errors and suggestions for correction are discussed. Keywords: high school chemistry standards, college readiness, general chemistry, misconceptions, mastery Introduction What is learned in high school chemistry is important to students future success. General chemistry, a known gateway course to several STEM degrees including biology, biochemistry, engineering, and chemistry ultimately impacts future STEM careers. The Texas Education Agency (TEA) sets the standards for public education from first grade to high school in Texas. High school teachers are supposed to base their curricula on the Texas Essential Knowledge and Skills (TEKS). The TEKS were initially adopted in July 1997 and have been revised many times since. The TEKS are tested on the Texas Assessment of Knowledge and 16

The TEKS are tested on the Texas Assessment of Knowledge and Skills (TAKS), a test that students must pass in order to graduate from high school (Texas Education Agency and Pearson, 2009). The State of Texas Assessments of Academic Readiness (STAAR) program, which consists of 12 end-of-course exams (EOCs), will replace the TAKS test as a graduation requirement for students in the ninth grade during the 2011-2012 school year, according to the House Bill 3 Transition Plan (Texas Education Agency, 2010a). Since the Chemistry STAAR has yet to be instituted, this study can only assess the knowledge of students who were required to sit for the generic high-stakes TAKS Science exam. This study also serves to document persistent problem areas that need concentrated attention for current secondary students who choose to matriculate to postsecondary opportunities.

Be SMART. Think SMART. iSMART.

Integrated Science, Math And Reflective Teaching!


Fully Funded M.Ed. for Texas Area Middle School Math & Science Teachers!

Come visit our booth at CAST.


Dallas Convention Center November 17-19, 2011

iSMART
Email: ismart@uh.edu http://www.coe.uh.edu/academic-programs/ismart




The Texas Higher Education Coordinating Board (THECB) works to ensure the quality of postsecondary education for Texas students. Texas is among the first states to develop a set of readiness standards. These standards have been published as the Texas College and Career Readiness Standards (TCCRS), adopted in January 2008 (Texas Education Agency, 2010b). The TCCRS for chemistry include specific competencies for the following concepts: matter and its properties, atomic structure, the periodic table, chemical bonding, chemical reactions, chemical nomenclature, the mole and stoichiometry, thermochemistry, properties and behavior of gases, liquids, and solids, basic structure and function of biological molecules, and nuclear chemistry (THECB and TEA, 2008). These standards have played an influential role in the current revised TEKS of 2010. What is college readiness?

College Readiness Assessment in High School

Mastery of the TEKS is currently measured by performance on the TAKS. The TAKS test was mandated by the Texas Legislature in 1999 and was first administered in the spring of the 2002-2003 school year to students in grade 11 (Texas Education Agency and Pearson, 2009). The exit-level TAKS given in grade 11 became a high school graduation requirement for students who were in grade 8 as of January 1, 2001 (Texas Education Agency, Pearson Educational Measurement, Harcourt Educational Measurement, and BETA Inc., 2004). This test is now being phased out and replaced with the STAAR EOC exams, one of which will be in chemistry. The graduating class of the 2014-2015 school year will be the first cohort of students required to take and pass STAAR exams as part of their graduation requirements, pending any legislative changes, according to the House Bill 3 Transition Plan (Texas Education Agency, 2010a). As of now, Texas Administrative Code (TAC) 74.62, which discusses graduation requirements, states that students must meet state assessment requirements as well as complete and pass several courses, including a minimum of three credits of mathematics (including one year of Algebra I and one year of Geometry) and two credits of science (including one year of Biology and one year of Integrated Physics and Chemistry (IPC) or one year of a separate Chemistry course) (Texas Administrative Code, 2010). The Exit Level TAKS test includes four sections: English Language Arts, Social Studies, Mathematics, and Science. The TAKS measures statewide curricula in Reading at grades 3-9; in Writing at grades 4 and 7; in English Language Arts at grades 10 and 11; in Social Studies at grades 8, 10, and 11; in Mathematics at grades 3-11; and in Science at grades 5, 10, and 11. A student must have satisfactory performance on all sections of the TAKS tests administered in grade 11 to be eligible for a high school diploma in the state of Texas. If a student does not pass the test during this administration, the student has other opportunities to retake and pass the test in order to successfully complete high school (Texas Education Agency and Pearson, 2009). It is assumed that students at the University of North Texas (UNT) who enroll in General Chemistry for Science Majors (gen chem I) have met the prerequisite requirements for course enrollment. According to the 2010-2011 UNT course catalog, students are required to take and pass College Algebra (or equivalent) before they are allowed to register for this course.




The prerequisite for College Algebra is two years of high school algebra and one year of geometry, or the consent of the mathematics department indicating that the equivalent of the College Algebra level has been acquired (University of North Texas, 2010).

Assessment of College Readiness in College Level Chemistry

Noncognitive Predictor: Motivation

According to Zusho, Pintrich, and Coppola (2003), students' views of themselves as chemistry students and their impressions of the subject of chemistry impact their level of achievement in college chemistry courses. This study found that as students received feedback from their examinations, their confidence levels fell, with the exception of the students characterized as high achievers. The authors' conclusions emphasized the importance of maintaining self-efficacy levels and observed that successful students began using self-regulatory and organizational strategies as the course progressed. This study pointed out that in addition to students who typically achieve higher scores in postsecondary chemistry, motivated middle achievers did well in this course (Zusho et al., 2003). According to a recent student evaluation in gen chem I, prior knowledge is the most important factor that can be used to predict success in this course (Manrique, 2010). This is consistent with the Unified Learning Model (ULM) of Shell, Brooks, Trainin, Wilson, Kauffman, and Herr (2010), and suggests how important it is for high school teachers to successfully teach chemistry material to students. A student's logic skills were also shown to be very important for success in the chemistry classroom; a scientist needs logic skills to solve complex problems. The ULM focuses on the basic components of learning that are common among all learning theories. It is a simple model that can be used to explain all observed learning phenomena (Manrique, 2010). The main components of this model are prior knowledge, working memory, and motivation. The working memory is the location where new knowledge is temporarily stored and processed. Knowledge is defined as everything we know stored in long-term memory, or our prior knowledge. This prior knowledge includes everything from facts and skills to behaviors and thinking processes. Motivation is the catalyst to learning. If a student is not motivated to learn a new concept, the new knowledge will not even be temporarily stored in the working memory. Motivation directs the working memory to learn a new task (Shell et al., 2010).

Cognitive Predictor: Prior Knowledge

The California Chemistry Diagnostic Test 1997 (CA Dx) was originally designed to be used as a screening tool for students interested in enrolling in college-level general chemistry in California and has evolved into a useful diagnostic tool (Russell, 1994). It was validated in 1995 as a predictor of academic success (Karp, 1995). This study focused on the use of the CA Dx as a tool for assessment of college readiness for students enrolled in gen chem I. The CA Dx requires that 44 questions be answered in 45 minutes; any question left blank is counted as a wrong answer. The CA Dx has been given as a diagnostic pre-test by the second author since fall 2001, generating a mean (standard deviation) of 18.41 (6.29) with a range of 5 to 42 for a student population of n = 1,638, which is below the national mean of 20.45 (7.56). A copy of the CA Dx exam may be ordered from http://chemexams.chem.iastate.edu/order/index.cfm (American Chemical Society Division of Chemical Education, 2009).


Some schools use the CA Dx as an optional test that allows students to enroll directly into general chemistry when a preparatory course is available. Students at Winthrop University in South Carolina, the University of Nevada, Las Vegas, and Santa Monica College in California can enroll directly into general chemistry and avoid taking introductory chemistry by passing the CA Dx (Santa Monica College, 2007; University of Nevada, Las Vegas; Winthrop University). UNT does not have this option, so all students who enroll in a science major sequence must take General Chemistry for Science Majors. Another option is to score a 3, 4, or 5 on the College Board Advanced Placement (AP) Chemistry exam, which usually places students into the second semester of general chemistry (University of California, Riverside, 2010). Not all universities offer an introductory chemistry course, nor will all universities accept AP credit. At UNT, students who have completed the published prerequisites are allowed to enroll in gen chem I and are expected to acquire any deficient background knowledge on their own.

Problem

Despite the national and state standards required to graduate from high school, there will always be concepts that are not retained by students between the time they are evaluated on the TEKS and when they enter general chemistry at the postsecondary level. Students enrolled at UNT have been shown to lack knowledge of foundational general chemistry concepts such as significant figures (especially those needed to employ rules for adding/subtracting), chemical structure (such as bond polarity and Lewis structures), basic chemical nomenclature, and algebraic relationships (such as those used in gas law calculations). Students are also making careless errors such as not paying attention to accepted definitions or not using their allotted time wisely. The purpose of this investigation is to identify the most common concepts not retained by postsecondary students (i.e., misconceptions of students enrolled in entry-level gen chem). After identification, the approach evolves to identifying the most commonly chosen wrong answers on the most commonly missed questions on the CA Dx and attempting to give supporting explanations for these persistent misconceptions that directly relate to students' prior chemistry content knowledge.

Method

The Students

The students involved in this study had been admitted to one of the four largest universities in Texas. Students enrolled in gen chem I are mostly science majors, as the title of the course implies, but some are engineering majors and a few others (e.g., education and psychology majors) are enrolled. Data from the CA Dx were used to assess the prior chemistry content knowledge of the 286 students who gave IRB consent. Responses of these students were chosen based on their enrollment in the course during one of three consecutive semesters. All of these courses were sections of gen chem I during the long-term semesters (i.e., no summer sessions were included).

The Test

The means (standard deviations) for the students who participated in this study are listed in Table 1. These means are slightly below what was reported above for the entire sample. In general, fall-semester students (n = 1111) available for study outperform the spring students (n = 527) by 1.70 points of the 44 total points on the CA Dx instrument. The general consensus for this discrepancy is that the spring students usually do not have the required mathematics (i.e., successful completion of college algebra) or have a negative perception of studying chemistry, which has delayed them from beginning the required courses for their respective science and engineering degrees. In this particular sample (n = 286), there was no significant difference in the CA Dx means. The item analysis results of these tests were combined to determine the top five missed questions on the CA Dx exam and the most common incorrect answers for these questions in order to examine misconceptions held by entering gen chem I students.

Table 1. Student Averages on the ACS California Diagnostic Exam

              N     CA Dx Mean (SD)
Semester 1    101   18.23 (6.00)
Semester 2     43   18.40 (6.60)
Semester 3    135   18.39 (6.35)
Combined      286   18.33 (6.26)

This test is given to students at the beginning of the semester as a pretest to assess prior content knowledge. The students are told that the results of this test will not impact their course grade. The instructions on the test indicate that only one answer choice is correct and that the final score is based on the number of correct responses. A periodic table of the elements and a table of abbreviations/symbols are available as part of the CA Dx exam; the use of a non-programmable calculator is permitted.

Data Analysis

The responses provided by each student were entered into a Microsoft Excel spreadsheet to determine the number of responses for each answer choice on each question. For each question, the number of responses to the most commonly chosen wrong answer and the number of correct responses were compiled. A z score was calculated for the most commonly chosen wrong answer and for the correct answer, and the occurrences of each were tested to determine whether a statistically significant difference existed at the 95% level of significance. The z critical value for this sample size for a two-tailed hypothesis test with an alpha of 0.05 was +/- 1.96. Common wrong answers with positive z scores above +1.96 were considered choices that were chosen more frequently than they would have been if all answers were chosen randomly; an interpretation of this situation is that many students thought these were the correct answers, in addition to the random guesses. Correct answers with negative z scores below -1.96 were considered choices that were chosen statistically less often than they should have been, based on a 25% chance of being chosen at random (i.e., each of the 44 questions has 4 possible choices); an interpretation of these results is that another answer choice was a successful distractor indicative of a misconception. The 44 questions were ranked from most correct to least correct to identify the five questions that produced a negative z score below -1.96 for the number of correct responses along with a positive z score above +1.96 for the most commonly chosen wrong answer (see Table 2).


The five questions whose correct answers produced negative z values below -1.96 were the top five most missed questions, and their most commonly chosen wrong answers showed positive z scores above +1.96. The calculated z values for these five questions indicate that students chose the most common wrong answers more often than randomly predicted and the correct answers less often than randomly predicted. These results are most likely due to misconceptions or wrong concepts that students held at the time of the test.

Table 2. Most Common Misconceptions on the CA Dx Exam (n = 286)

Question (least to most missed): Topic        z Wrong   z Correct   Most Common Wrong Response   Correct Response
19: Bond Polarity                              5.12     -2.94       109                          50
34: Significant Figures                       16.18     -2.94       190                          50
24: Lewis Dot Structures                      15.64     -3.89       186                          43
 2: Nomenclature                              16.73     -4.57       194                          38
44: Algebraic Relationships in Gas Laws        6.62     -5.67       120                          30
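The z values in Table 2 are consistent with comparing each observed count (out of 286 responses) to the count expected under pure random guessing, 25% per choice, using the normal approximation to the binomial. The authors' spreadsheet calculations are not published, so the short Python sketch below, with a function name of our own choosing, is offered only as an illustration that reproduces the tabled values.

    import math

    def z_for_choice(observed_count, n_students=286, p_random=0.25):
        # Expected count and standard deviation under random guessing
        # (normal approximation to the binomial distribution).
        expected = n_students * p_random
        sd = math.sqrt(n_students * p_random * (1 - p_random))
        return (observed_count - expected) / sd

    # Question 19 (bond polarity): 109 students chose the most common
    # wrong answer and 50 chose the correct answer (see Table 2).
    print(round(z_for_choice(109), 2))   # 5.12
    print(round(z_for_choice(50), 2))    # -2.94
    # |z| > 1.96 is significant at the 0.05 level for a two-tailed test.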

Results

The fifth most commonly missed question ranked in the top five most missed questions for each administration of the test. This question has a z value of 5.12 for the most popular wrong answer and a z value of -2.94 for the correct answer. In other words, 109/286 or 39.1% of the students tested chose the same wrong answer. This question asked the student to choose the bond with the highest polarity from a list of bonds. The most commonly chosen wrong answer was a pure covalent nonpolar bond, the exact opposite of what the question was asking. Fifty-three students may not have seen the more electronegative element on the periodic table. Sixty-nine students chose the least polar of the polar bonds given. Five students left this question blank, and only 50 chose the correct answer. It appears that these students do not know the definition of a polar bond or how elements differ in electronegativity. This concept corresponds to TEKS Chemistry 5C, which states that students are expected to use the periodic table to identify and explain periodic trends, including atomic and ionic radii, electronegativity, and ionization energy (Texas Administrative Code, 2009a). Students should be able to determine if a molecule is polar, according to the TCCRS (THECB and TEA, 2008).

The fourth most commonly missed question had the second largest z value for the commonly chosen wrong answer out of all of the questions, at 16.18. In other words, 190/286 or 68.1% of the students tested chose the same wrong answer. The z score for the students who chose the correct answer was -2.94. The prevalence of this most commonly chosen wrong answer indicates that the concept tested is either a common misconception or a concept that failed to be retained. This question asked about a laboratory technique using a balance, and reported the measurement of the weighed container with and without the mass using different numbers of significant figures. One hundred ninety students chose the answer that indicated an understanding of the procedure but disregarded the add/subtract rule for significant figures when reporting answers.
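As a hypothetical illustration of the add/subtract rule (the numbers below are ours, not the actual CA Dx item), suppose the container-plus-sample reading is recorded to thousandths of a gram while the empty-container reading is recorded only to tenths:

    % Hypothetical balance readings, not the values from the exam item.
    \[
      26.732\ \mathrm{g} \;-\; 24.5\ \mathrm{g} \;=\; 2.232\ \mathrm{g} \;\approx\; 2.2\ \mathrm{g}
    \]
    % The difference is reported to one decimal place because the least
    % precise reading (24.5 g) is known only to the tenths place.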

Thirty-nine students chose the distractor that failed to take into account the combined mass of the container and object, and only gave the container's mass. Three students chose the other distractor, and four left this question blank. The results of this question show that students were not aware of significant figure rules at the time of the test. According to Benchmarks for Science Literacy: Project 2061, students by the end of the 8th grade should know that calculations (as on calculators) can give more digits than make sense or are useful, and should be able to decide what degree of precision is adequate and round off the result of calculator operations to enough significant figures to reasonably reflect those of the inputs (American Association for the Advancement of Science, 1993). This also corresponds with TEKS Chemistry 2F, which states that students are expected to collect data and make measurements with accuracy and precision, and 2G, express and manipulate chemical quantities using scientific conventions and mathematical procedures, including dimensional analysis, scientific notation, and significant figures (Texas Administrative Code, 2009a). Significant figures are listed in the TCCRS under the Geometry standards and under the Foundation Skills: Scientific Applications of Mathematics section of the Science standards (THECB and TEA, 2008), and will supposedly be stressed on the upcoming Chemistry STAAR exam.

The third most missed question, with the third most commonly chosen wrong answer, had a z value of 15.64; the correct response had a z value of -3.89. The most commonly chosen wrong answer for this question was in the top five most commonly chosen wrong answers for each administration of the test and produced a most common wrong answer rate of 186/286 or 66.7%. Based on these z values, this is another concept that needs to be looked at more closely in order to improve the quality of chemistry instruction. This question tested the understanding of Lewis dot structures. The most commonly chosen wrong answer misinterpreted the dots on the diagram as the atomic number, as opposed to the number of valence electrons. This question involves knowledge of the structure of an element, specifically the Lewis dots, which represent valence electrons. This knowledge corresponds to TEKS Chemistry 6E, which states that the student is expected to express the arrangement of electrons in atoms through electron configurations and Lewis valence electron dot structures (Texas Administrative Code, 2009a). The TCCRS state that students should be able to draw Lewis dot structures for simple molecules (THECB and TEA, 2008). The American Association for the Advancement of Science states that by the end of the 12th grade, students should know that atoms are made of a positive nucleus surrounded by negative electrons; an atom's electron configuration, particularly the outermost electrons, determines how the atom can interact with other atoms; and atoms form bonds to other atoms by transferring or sharing electrons (American Association for the Advancement of Science, 1993). Under the National Science Education Standards by the National Research Council (1996), students in grades 9-12 in physical science are to master the following related concepts:

Atoms interact with one another by transferring or sharing electrons that are furthest from the nucleus. These outer electrons govern the chemical properties of the element. An element is composed of a single type of atom. When elements are listed in order according to the number of protons (called the atomic number), repeating patterns of physical and chemical properties identify families of elements with similar properties. This Periodic Table is a consequence of the repeating pattern of outermost electrons and their permitted energies. (pp. 178-179)

The second most commonly missed concept regarded formula writing for ionic compounds. The most commonly chosen wrong answer for this question used the symbols for the ions in the compound, but disregarded the impact of the charges of the individual ions in determining the subscripts. This response had a z score of 16.73, with 194/286 or 69.5% of the students choosing this response. The answer choice that involved using the charge of the cation to determine the subscript of the anion, without using the charge of the anion, was chosen by 29 students. Seventeen students chose the answer in which the charge of the ion was used as the subscript for that ion, and eight failed to respond. The expectation of writing a chemical formula is also expressed in the TEKS. This concept corresponds with TEKS Chemistry 7B, which states that students should be able to write the chemical formulas of common polyatomic ions, ionic compounds containing main group or transition metals, covalent compounds, acids, and bases (Texas Administrative Code, 2009a). According to Benchmarks for Science Literacy: Project 2061, students should know that atoms combine with one another in distinct patterns (American Association for the Advancement of Science, 1993).
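The charge-balancing step that most students skipped can also be demonstrated in a few lines of code. The sketch below is an illustration only, not part of the original study: it takes hypothetical ion charges (for example a 2+ cation with a 1- anion) and reduces the crossed charges by their greatest common divisor to obtain the subscripts of a neutral formula unit.

```r
# Illustrative only: subscripts for a binary ionic compound from ion charges
ionic_subscripts <- function(cation_charge, anion_charge) {
  gcd <- function(a, b) if (b == 0) a else gcd(b, a %% b)
  d <- gcd(abs(cation_charge), abs(anion_charge))
  # Each ion's subscript is the other ion's charge magnitude, reduced to lowest terms
  c(cation = abs(anion_charge) / d, anion = abs(cation_charge) / d)
}

ionic_subscripts(2, -1)   # e.g., Ca2+ with Cl-  -> cation 1, anion 2 (CaCl2)
ionic_subscripts(3, -2)   # e.g., Al3+ with O2-  -> cation 2, anion 3 (Al2O3)
```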
Under the National Science Education Standards by the National Research Council (1996), students in grades 9-12 in physical science are to master the following related concepts:

Bonds between atoms are created when electrons are paired up by being transferred or shared. A substance composed of a single kind of atom is called an element. The atoms may be bonded together into molecules or crystalline solids. A compound is formed when two or more kinds of atoms bind together chemically. (p. 179)

The question that was missed the most overall was also either the most or second most commonly missed question for each administration of the test. This question had the lowest z score for the correct answer of all of the items included on the test. For this question, 120/286 or 43.0% of the students tested selected the same wrong answer. The z score for the correct answer was -5.67, with the z score for the most commonly chosen distractor being 6.62. The wrong answer for this question was the eighth most commonly chosen wrong answer overall. The question asked students to consider a formula and answer a conceptual question regarding how a relationship would change, in light of maintaining a constant, if two variables were changed (i.e., increasing one by a factor of X and decreasing another by a factor of Y). In order to arrive at this incorrect answer, the students failed to take into account that the direction of change in the numerator increased and the direction of change in the denominator decreased, along with the fact that a constant must be maintained. The second most common incorrect answer (i.e., 73 responses) reported the correct overall direction of change but did not take into account that the denominator's variable was decreasing and needed to be compensated for by increasing the numerator by that factor. The answer choice for the third most common wrong answer (i.e., 46 responses) had the correct magnitude of change but the opposite direction, indicating the students may have understood the magnitude of change but not the concept of a constant. This question was left blank by 17 students and answered correctly by only 30 students (just over 12% of the student responses evaluated). This question may have thrown students off because it is a question concerning gas laws without any reference to gases, corresponding to TEKS Chemistry 9A, which states that the student is expected to describe and calculate the relations between volume, pressure, number of moles, and temperature for an ideal gas as described by Boyle's law, Charles' law, Avogadro's law, Dalton's law of partial pressure, and the ideal gas law (Texas Administrative Code, 2009a). The TCCRS state that students should be able to solve for gas temperature, pressure, or volume using algebraic symbols and formulae (THECB and TEA, 2008). This question was the last question on the exam and mathematically the most challenging, since changes in different directions of multiple variables were involved. However, prior chemistry knowledge was not important to finding the answer to this question; only good algebraic skills were needed! This question had the third most responses left blank out of all of the questions, further supporting how important algebraic skills are to success in general chemistry and the importance of teaching gas laws from a conceptual standpoint.
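Because the actual item cannot be reproduced here, the sketch below illustrates the same reasoning with a hypothetical combined-gas-law scenario (PV/T held constant for a fixed amount of gas): if the volume is doubled while the absolute temperature is halved, the pressure must fall by a factor of four, since both changes push the pressure in the same direction.

```r
# Hypothetical example of the proportional reasoning the item required
# For a fixed amount of ideal gas, P * V / T is constant, so P2 = P1 * (V1 / V2) * (T2 / T1)
P1       <- 2.0   # atm, arbitrary starting pressure
V_factor <- 2.0   # volume is doubled
T_factor <- 0.5   # absolute temperature is halved

P2 <- P1 * (1 / V_factor) * T_factor
P2 / P1   # 0.25: the pressure drops to one quarter of its original value
```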
Discussion

Possible Sources of Error

One cannot determine the intentions of the students beyond their responses on the answer sheet, and so all of the answer sheets that had any responses on them counted toward these results. It is possible that students may not have taken the test seriously, knowing that the results of this test would not affect their grade in the course, but most students do take this exam seriously since it is usually the first test they have ever taken in college and they desire to get off to a good start.

Explanation of Findings

Students entering gen chem I are expected to be proficient on the topics tested on the CA Dx upon entry into the course. There are a few explanations as to why these students had not mastered these concepts before entering this course. The concepts targeted in these results were bond polarity, significant figures in laboratory procedures, Lewis dot structures, nomenclature, and algebraic relationships in gas laws. All of these concepts are indicated as college readiness standards as of fall 2010 (THECB and TEA, 2008). At the time of this study several of these concepts had not been tested on the TAKS test because the TAKS test was designed to ask chemistry questions based on the more basic IPC course. Since current graduation plans still allow for IPC to count as a year of science, this provides a loophole that allows students to graduate high school without a full year of chemistry (Texas Administrative Code, 2010). In light of the recent changes to the state standards, high school teachers are now making changes to their course curricula that reflect the new expectations. It may also be possible that the revisions that have been made to support the new standards need more work in order to be effectively received by students.

Conclusions and Suggestions
Students are not retaining, or lack knowledge of, general chemistry concepts that are expected of a student entering gen chem I, such as polarity, significant figures, periodicity, naming, and algebraic manipulations. Students are making careless errors such as not paying attention to the definition of a constant, or failing to apply skills that should have been acquired before entering college, such as manipulation of fractions and decimals (Texas Administrative Code, 2006) and proportional reasoning (Texas Administrative Code, 2009b). The next generation of the TEKS assessment is the STAAR program which, according to the House Bill 3 Transition Plan, is designed to increase the rigor of course assessment so that students will know when they meet a higher level of academic knowledge and skills needed to meet the challenges of the 21st century (Texas Education Agency, 2010a). However, since the STAAR results on individual subject tests can be combined to determine a student's eligibility for graduation, this still leaves room for vital chemistry concepts to fall through the cracks. These topics (bonding, significant figures, Lewis dot structures, nomenclature, and gas laws) are basic concepts that a student should not leave high school chemistry without. Our data also indicate that mastery of mathematical understanding is very important to student success, even on a conceptual chemistry exam. Finally, it is important that chemistry instructors at all levels make chemistry relevant to their students. The relevance of chemistry in everyday life helps students identify and grasp some concepts more readily than others. Students should therefore be given the opportunity to practice these concepts and to delve more deeply into more complex concepts at different cognitive levels, so that they are aware of what is expected of them now and in the future. At the very least, assessments, assignments, and lectures should be designed to complement each other and provide students with the foundational knowledge they need to excel in gen chem I. Students need to meet educators halfway, but educators need to be prepared to guide their students through possible roadblocks that may thwart their success in the courses. The material presented in the high school classroom needs to provide the student with a basis to continue their education, whether at a postsecondary institution, in a career, or through independent study beyond the course. The guidelines set up for high school teachers to follow need to adequately reflect the purpose of these courses. This will aid in maintaining the students' academic self-image, assuming that they are motivated to succeed in the course.



Anna George is currently pursuing a PhD in Chemistry with an emphasis in Chemical Education at the University of North Texas. She has taught chemistry in the north Texas area for the past five years at the high school and university levels.

Dr. Mason is an Associate Professor of Chemistry at the University of North Texas. She received her BA in Zoology from UT, Austin, holds an MS in Zoology from Texas A&M, Commerce, and earned her PhD in Science Education from UT, Austin. Her research interest lies in how freshman chemistry students learn to learn chemistry, including the effectiveness of electronic homework systems. She is on the Board of Trustees for the Fort Worth Regional Science and Engineering Fair, a Regional Director of the Associated Chemistry Teachers of Texas, and a member of the 2011 Class of ACS Fellows.

References
American Association for the Advancement of Science. (1993). Benchmarks for science literacy: Project 2061. New York: Oxford University Press.
American Chemical Society Division of Chemical Education. (2009). Examinations Institute. Retrieved August 1, 2011, from Ordering Information: http://chemexams.chem.iastate.edu/order/index.cfm
Bunce, D., & Hutchinson, K. J. (1998). The use of the GALT (Group Assessment of Logical Thinking) as a predictor of academic success in college chemistry. Journal of Chemical Education, 70(3), 183-187. doi: 10.1021/ed070p183
House, J. D. (1995). Noncognitive predictors of achievement in introductory college chemistry. Research in Higher Education, 36(4), 473-490. doi: 10.1007/BF02207907
Karpp, E. (1995). Validating the California Chemistry Diagnostic Test for local use (Paths to success, Volume III). Glendale Community College, CA: Planning and Research Office.
Manrique, C. (2011). Effects of using logic and spatial cybergames to improve student success rates in lower-division chemistry courses. (Unpublished doctoral dissertation). University of North Texas, Denton, TX.
National Research Council. (1996). National science education standards. Washington, DC: National Academy Press.
Russell, A. A. (1994). A rationally designed general chemistry diagnostic test. Journal of Chemical Education, 71(4), 314. doi: 10.1021/ed071p314
Santa Monica College. (2007). Chemistry challenge exam. Retrieved April 22, 2011, from Santa Monica College: http://www.smc.edu/apps/pub.asp?Q=58
Shell, D. F., Brooks, D. W., Trainin, G., Wilson, K. M., Kauffman, D. F., & Herr, L. M. (2010). The unified learning model: How motivational, cognitive, and neurobiological sciences inform best teaching practices (Vol. 1). New York, NY: Springer Science + Business Media.
Texas Administrative Code. (2006, August 1). 19 TAC Chapter 111, Texas Essential Knowledge and Skills for Mathematics, Subchapter A, Elementary School. Retrieved September 29, 2011, from Texas Administrative Code Index: http://ritter.tea.state.tx.us/rules/tac/chapter111/ch111b.html

Texas Administrative Code. (2009a, August 4). 19 TAC Chapter 112, Texas Essential Knowledge and Skills for Science, Subchapter C, High School. Retrieved April 22, 2011, from: http://ritter.tea.state.tx.us/rules/tac/chapter112/ch112c.html
Texas Administrative Code. (2009b, February 23). 19 TAC Chapter 111, Texas Essential Knowledge and Skills for Mathematics, Subchapter B, Middle School. Retrieved September 29, 2011, from: http://ritter.tea.state.tx.us/rules/tac/chapter112/ch111b.html
Texas Administrative Code. (2010, August 23). 19 TAC Chapter 74, Subchapter F. Retrieved April 22, 2011, from Texas Administrative Code Index: http://ritter.tea.state.tx.us/rules/tac/chapter074/ch074f.html
Texas Education Agency. (2010a). House Bill 3 transition plan. Austin, TX: Texas Education Agency. Retrieved April 22, 2011, from Texas Education Agency: http://www.tea.state.tx.us/student.assessment/hb3plan/
Texas Education Agency. (2010b, February 23). Texas College and Career Readiness Standards more comprehensive than national standards. Texas Education Agency News. Retrieved April 3, 2011, from Texas Education Agency: http://www.tea.state.tx.us/index4.aspx?id=8061
Texas Education Agency and Pearson. (2009). Technical digest from academic school year 2007-2008. Austin, TX: Texas Education Agency. Retrieved April 22, 2011, from Texas Education Agency: http://www.tea.state.tx.us/index3.aspx?id=4326&menu_id3=793
Texas Education Agency, Pearson Educational Measurement, Harcourt Educational Measurement, and BETA Inc. (2004). Texas student assessment program technical digest for the academic year 2002-2003. Austin, TX: Texas Education Agency.
Texas Higher Education Coordinating Board (THECB) and Texas Education Agency (TEA). (2008). Texas college and career readiness standards. Austin, TX: Texas Education Agency.
University of California, Riverside. (2010, June). 2010-2011 University of California, Riverside general catalog. Retrieved April 22, 2011, from UCR Catalog: http://catalog.ucr.edu/catalog.html
University of Nevada, Las Vegas. (n.d.). Chem 121 placement exam. Retrieved April 22, 2011, from UNLV Department of Chemistry: http://sciences.unlv.edu/Chemistry/policy.htm
University of North Texas. (2010, July 1). 2010-2011 Undergraduate catalog. Retrieved April 22, 2011, from University of North Texas: http://www.unt.edu/catalog/undergrad/index.htm
Winthrop University. (n.d.). Chem 105 placement. Retrieved April 22, 2011, from Winthrop University: http://chem.winthrop.edu/chem105_placement.htm
Zusho, A., Pintrich, P. R., & Coppola, B. (2003). Skill and will: The role of motivation and cognition in the learning of college chemistry. International Journal of Science Education, 25(9), 1081-1094. doi: 10.1080/0950069032000052207

Using a Force Meter to Measure an Object's Mass: A Potential Misconception


by Andrzej Sokolowski
In the process of strengthening high school physics programs, most of the emphasis is placed on the curriculum content (Texas TEKS for Physics, 2011). The physical instruments that students use for data gathering seem to be a secondary concern in this process. The discussion that follows is meant to signal that verifying these instruments for adherence to the principles of physics might also be needed to provide students with sound physics inquiry. Commonly used single-spring force meters are dually calibrated; they measure the amount of an object's substance expressed in kilograms (or grams), and simultaneously they can measure the object's weight (or force) expressed in newtons.

Fig. 1. Force meters calibrated in grams and newtons. Source: www.sargentwelch.com

Although they are convenient to use and provide relatively accurate data for classroom analysis, using them to measure an object's weight and mass might create in students' minds a misconception that an object's mass depends on the intensity of the gravitational field. The goal of this paper is to show physics colleagues the weakness of such a force meter design and, consequently, to alert students in order to prevent the likelihood of the misconception from occurring. Following is a problem whose solution leads to contradictory data. I conduct the thought experiment with my advanced physics students. Placed in the role of assessors of the measuring instruments, they also learn that simplifications might sometimes lead to faulty designs.

Is mass dependent on gravity? The answer to this question is apparent; mass is independent of the intensity of the gravitational field. Mixed responses to this question can be generated when a dual spring scale is used to verify the answer. After posing this question to students, I conduct the following thought experiment. Students are given objects (for example, 100 g density blocks) and a dual force meter (see Fig. 1). I formulate the following problem: Suppose we want to measure the object's weight and its mass on the Earth and on the Moon using the same spring force meter. What readings will this force meter show on the Earth and on the Moon? Students will find the readings of the object's weight and mass on the Earth easily. They will also correctly hypothesize the object's weight on the Moon, taking the Moon's gravitational field intensity to be about 1.6 N/kg (Serway, 2005). Estimating the measurement of the mass using the same force meter on the Moon will puzzle them. They will predict that the lower gravitational field on the Moon will produce a shorter stretch of the spring of the force meter. Since the same spring simultaneously measures the object's mass, the amount of mass of the object will appear to be less than that on the Earth! They arrive at a contradiction; they realize that mass should not change, yet using the force meter they conclude that mass depends on the gravitational field. This serious misconception, confusing the concepts of mass and weight, might have a negative impact on students' further studies of dynamics. Table 1 below shows the expected readings. According to this thought process, the 100 g mass will show as a mass of 16 g on the Moon, which is not correct. The amount of substance is still 100 g, regardless of the intensity of the gravitational field.

Table 1. Force meter readings on the Earth and predicted readings on the Moon.

                                                      Mass       Force of gravity
Measurements on the Earth                             100.0 g    0.981 N
Measurements predicted by using a dual force
meter on the Moon                                     16.0 g     0.160 N

We conclude that the dual force meter has certain limitations that we need to be aware of. An afterward discussion can focus on identifying the conditions under which the device shows correct readings. We conclude that the force meter properly measures an object's mass and weight under the condition that it is calibrated at the same place where it is used.
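To make the calibration argument concrete, the short sketch below (not part of the original article) models a dual-calibrated spring scale: the instrument actually senses force, and its gram markings simply divide that force by the gravitational field strength assumed at calibration (taken here as 9.81 N/kg). Running it for the Moon reproduces the predicted, and incorrect, 16 g reading in Table 1; changing g_calibration to 1.6 N/kg sketches an answer to the first of the additional questions that follow.

```r
# Sketch: what a dual-calibrated spring scale displays at different locations.
# The scale senses force; its gram markings assume the field strength at calibration.
scale_reading <- function(true_mass_kg, g_local, g_calibration = 9.81) {
  force_N <- true_mass_kg * g_local                 # what the spring actually feels
  displayed_mass_g <- 1000 * force_N / g_calibration
  c(force_N = force_N, displayed_mass_g = displayed_mass_g)
}

scale_reading(0.100, g_local = 9.81)   # on the Earth: 0.981 N, reads 100 g
scale_reading(0.100, g_local = 1.6)    # on the Moon:  0.160 N, reads about 16 g
```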

As a verification of the different spring stretches due to gravity, the physics simulation Masses and Springs can be utilized. The simulation was created by the PhET Interactive Simulations project at the University of Colorado and can be found at: http://phet.colorado.edu/sims/mass-spring-lab/mass-spring-lab_en.html.

Fig. 2. Screen shots of the simulation showing different stretches of a spring scale; left, on the Earth; right, on the Moon. Source: PhET Interactive Simulations, University of Colorado, www.phet.colorado.edu. The different amounts of stretch indicate different gravitational fields. The hung mass in both locations is 100 g.

A discussion of how a triple beam balance scale works, and whether it provides correct mass measurements independent of the gravitational field, can also be included here. Along with it, the idea of torque and the conditions for stable equilibrium can be brought up.

Additional Questions

1. Suppose that a force meter is designed and calibrated on the Moon and brought to the Earth. What readings (mass and weight) will it show if a 100 g mass is attached to it on the Earth?

2. Suppose that a force meter calibrated on the Equator of the Earth to measure mass and weight is used at the North Pole. Will it show correct measurements? Consider mass and weight separately. What is the associated percent error for measuring mass and weight?


Reflections

The idea of designing springs that measure mass and weight might originate from the unit of 1 gram-force, which is equal to 0.0098 N (NIST, 1995). Gram-force, though, is not equivalent to gram-mass, and the designers of the spring force meter do not make this distinction. In addition, in the current International System of Units (SI), gram-force and kilogram-force are not listed as units of force (The International System of Units, 1960). In order to decrease the likelihood of inducing this misconception in physics classes, I use a triple beam balance to measure mass. The force meter is used only to measure force or weight.


Andrzej Sokolowski is a doctoral student in the Department of Math Education at Texas A&M University, College Station, TX. He holds a Master's degree in physics from Gdansk University, Poland. He is a full-time mathematics and physics teacher at Magnolia West High School, Magnolia, TX, and a math adjunct professor at LSC-Tomball, TX. His research interests include the contextualization of mathematics concepts through scientific representations.


References

Serway, R. A., Moses, C. J., & Moyer, C. A. (2005). Modern physics (3rd ed.). Belmont, CA: Thomson Brooks/Cole.
NIST. (1995). Guide for the use of the International System of Units (SI) (Special Publication 811), p. 51.
Texas TEKS for Physics. Retrieved July 22, 2011, from http://www.tea.state.tx.us.
The International System of Units. Retrieved July 22, 2011, from http://www1.bipm.org/en/si.

Using Science Teaching Case Narratives to Assess the Effectiveness of a Scientific Inquiry Elementary Science Methods Course with Hispanic Preservice Elementary Teachers
by Ron and Amy Wagler

Abstract

The Positive Science Teaching Case Narrative (PSTCN) and the Negative Science Teaching Case Narrative (NSTCN) were developed to evaluate preservice elementary teachers' acceptance of scientific inquiry teaching as defined by the National Science Education Standards. The purpose of this study was to validate the use of the PSTCN and NSTCN for a population of Hispanic preservice elementary teachers and to assess the impact of a scientific inquiry elementary science methods course on the target population's level of acceptance of scientific inquiry teaching. Findings include that the PSTCN can be modified for Hispanic populations of preservice elementary teachers and remain a valid and reliable instrument for evaluating preservice elementary teachers' acceptance of scientific inquiry teaching; that the elementary science methods course had a statistically significant positive effect on the PSTCN scores of the preservice elementary teachers; and that the elementary science methods course had no effect on the NSTCN scores of the preservice elementary teachers. Elementary science methods course characteristics are presented that can guide instructors of elementary science methods courses with a scientific inquiry teaching component.

Key Words: Case Narratives; Hispanic; Preservice Elementary Teachers; Science Methods Course; Scientific Inquiry.

Introduction

The National Science Education Standards (NSES) formally define scientific inquiry as a set of interrelated processes by which scientists and students pose questions about the natural world and investigate phenomena (NRC, 1996, p. 214). The concept of scientific inquiry is infused into all parts of the NSES and is central to science learning (NRC, 1996, p. 2). When students perform these processes they acquire knowledge and develop a rich understanding of concepts, principles, models, and theories (NRC, 1996, p. 214) about the natural world and scientific phenomena. The NSES further state that inquiry is a critical component of a science program at all grade levels and in every domain of science, and designers of curricula and programs must be sure that the approach to content, as well as the teaching and assessment strategies, reflect the acquisition of scientific understanding through inquiry. Students then will learn science in a way that reflects how science actually works (NRC, 1996, p. 214).

Processes of Scientific Inquiry

These flexible processes of science (NRC, p. 105, 1996) children perform when engaging in scientific inquiry include:

- Ask a question about objects, organisms, and events in the environment
- Plan and conduct a simple investigation
- Use appropriate tools and techniques to gather and interpret data
- Use evidence and scientific knowledge to develop explanations
- Communicate investigations, data, and explanations to others (Wagler, 2010, p. 216)



In the kindergarten through fourth grade classroom (i.e., students approximately five to ten years of age), the National Science Education Standards (NRC, 1996, 2000) suggest the use of progressively more complex scientific inquiry investigations based on a child's fine motor skills and cognitive abilities. In the context of the kindergarten through fourth (i.e., K-4) grade classroom, these scientific inquiry activities begin in the early elementary classroom with science investigations that are centered on observations. These investigations, in the later elementary classroom, develop into scientific inquiry investigations where only one variable at a time is changed (i.e., a fair test) (NRC, 1996, p. 122).

Case Narratives

A case (i.e., case narrative) has been defined as a narrative organized around a key event and portraying particular characters that is structured to invite engagement by participants in a discussion (Miller & Kantrov, 1998, p. 2). Case narratives, in the context of a science methods course, are used to introduce preservice teachers to events that they have experienced or have a high probability of experiencing during their field experiences, student teaching internship or during their teaching career (Koballa & Tippins, 2004). Help! How Can I Teach without Supplies? (Howe & Nichols, 2001) is a case narrative pertaining to an elementary science teacher who is trying to get money to buy equipment and science materials for her classes. Help! How Can I Teach without Supplies? addresses this problem, explains how the elementary science teacher solved this problem, and ends with another problem the teacher is attempting to solve. After the preservice teachers read the case narrative and reflect upon the events described, they then discuss the case narrative with their science education methods instructor and their peers in the course. Other assignments, beyond the discussion, can also be aligned to the case narrative, such as field experiences or writing assignments.

The PSTCN and NSTCN2

The Positive Science Teaching Case Narrative (PSTCN) and the Negative Science Teaching Case Narrative (NSTCN) were constructed to evaluate preservice elementary teachers' level of acceptance of scientific inquiry teaching as defined by the National Science Education Standards (Wagler, 2010). The PSTCN and the NSTCN (Wagler, 2010) offer a simple, efficient and effective tool for evaluating preservice elementary teachers. The science teaching case narratives are short, easy to read and allow for quick administering and gathering of the data. The analysis of the science teaching case narrative data is simple whether you are trying to assess a specific preservice elementary teacher or a class of preservice elementary teachers. In the original (Wagler, 2010) study the PSTCN and NSTCN were administered at a single time (i.e., at the end of an elementary science methods course) and no randomization or control group was used in the design. This study builds upon that research. The purpose of this study was to validate the use of the PSTCN and NSTCN for a population of Hispanic preservice elementary teachers and to utilize these instruments in a pre/post randomized design with a control group in order to assess the impact of a scientific inquiry (NRC, 1996) elementary science methods course (i.e., treatment) on the target population's level of acceptance of scientific inquiry teaching (NRC, 1996).



The term acceptance, in this study, is defined as the preservice elementary teacher believing scientific inquiry (NRC, 1996) is an efficacious way to construct a highly effective science teaching and learning environment. In a practical sense this occurs when the individual preservice teacher rates the positive science teaching case narrative as a very good (4) or excellent (5) teaching event (Wagler, 2010, p. 217).

Methodology

Research Questions

Two research questions defined the study:

Research Question One: Can the PSTCN and the NSTCN be modified for a Hispanic population of preservice elementary teachers and remain valid instruments for evaluating preservice elementary teachers' level of acceptance of scientific inquiry teaching?

Research Question Two: Will the elementary science methods course affect the Hispanic preservice elementary teachers' scores on the PSTCN and the NSTCN?

Study Participants

Treatment Group

The participants for the treatment group (i.e., enrolled in an elementary science methods course with a very strong emphasis on scientific inquiry [NRC, 1996]) consisted of 156 K-4 preservice elementary teachers enrolled in the last year of their bachelor's degree program at a midsized urban southwestern United States of America (USA) border region university with a predominantly Hispanic population. The treatment group predominantly consisted of Hispanic females. Of the 156 preservice elementary teachers in the treatment group, 149 were female (95.5%) and seven (4.5%) were male. The preservice elementary teachers' mean age was 26.1 years. Of the 156 preservice elementary teachers, 145 were Hispanic (93.0%), seven were White, three were Black and one was Asian/Pacific Islander. All of the preservice elementary teachers in both the treatment and control group were participating in their senior level university public school teaching internship and were simultaneously enrolled in two university education courses that consisted of an elementary science methods and an elementary social studies methods course. The preservice elementary teachers in the study did not choose which sections of their senior level university education courses they were enrolled in. They were placed in these sections by the university. This is taken into consideration in the statistical analysis.

Control Group

The participants for the control group (i.e., enrolled in an elementary science methods course with extremely minimal or no emphasis on scientific inquiry [NRC, 1996]) consisted of 86 K-4 preservice elementary teachers enrolled in the last year of their bachelor's degree program at a midsized urban southwestern USA border region university with a predominantly Hispanic population. The control group predominantly consisted of Hispanic females. The treatment and control groups were homogeneous with respect to gender and ethnicity. Of the 86 preservice elementary teachers, 83 were female (96.5%) and 3 (3.5%) were male. The preservice elementary teachers' mean age was 26.4 years. Of the 86 preservice elementary teachers, 81 were Hispanic (94.2%), three were White and two were Black.

Randomization of Study

For the purposes of data collection all senior level university education course sections were randomized.



Based on the outcome of these random numbers, a random selection of sections (i.e., treatment and control groups) was chosen from which to gather data. During the pretest the preservice elementary teachers were randomly assigned either the PSTCN or the NSTCN.

Elementary Science Methods Course

The majority of the elementary science methods course (i.e., treatment) was focused on scientific inquiry teaching lessons, class discussions and public school teaching internship experiences (i.e., observations and teaching). Each preservice elementary teacher enrolled in the course was required to develop three scientific inquiry teaching lessons. All three lessons were based on the processes of science (NRC, p. 105, 1996; Carin, Bass & Contant, 2005, p. 21) modeled in the PSTCN (Wagler, 2010). The criteria and components of each lesson progressively increased in grade level, complexity, point value (i.e., Lesson 1: 5% of total grade; Lesson 2: 15% of total grade; and Lesson 3: 25% of total grade) and instructor expectation. For example, scientific inquiry teaching lesson one was developed for use with kindergartners or first graders and focused on the scientific inquiry skill of observation. Scientific inquiry teaching lesson two was developed for either second or third graders and required the preservice elementary teachers to utilize at least two components of the processes of science. The third and final lesson of the course, scientific inquiry teaching lesson three, was developed for fourth graders and required the preservice elementary teachers to utilize all components of the processes of science and a fair test (a test in which only one variable at a time is changed). This was based on the NSES recommendation that the idea of a fair test is possible for many students to consider by fourth grade (NRC, 1996, p. 122). All three lessons were taught, by the preservice elementary teachers who developed them, to their peers in the elementary science methods course. All three lessons were aligned to the grade appropriate state science process/content standards and were given both oral and written peer feedback. Lastly, after each lesson was taught it was formally assessed by the developer of the lesson (i.e., the preservice elementary teacher) for: age appropriate science content; processes of science (NRC, p. 105, 1996; Carin, Bass & Contant, 2005, p. 21) performed by the students; alignment to the grade appropriate state science process/content standards; type of assessment and effectiveness of the assessment utilized; a reflection component identifying ineffective components of the lesson; and how these ineffective components could be modified to be made effective. After this process each preservice elementary teacher taught the age appropriate scientific inquiry teaching lesson (either lesson one, two or three) in the public school classroom where they were interning. A very small percentage of the course was focused on course lectures. All course lectures were conducted via PowerPoint (with multimedia enhancement), were no longer than fifteen minutes and concluded with a full class participation scientific inquiry (NRC, 1996) activity (i.e., systematic observations and/or a fair test) that reinforced and modeled the topic that was discussed in the lecture. Many of these K-4 observational and fair test scientific inquiry (NRC, 1996) activities were insect (i.e., Madagascar hissing cockroach) activities (i.e., Wagler & Moseley, 2005; Wagler, 2009; Wagler, 2010a).
The course lecture topics included scientific inquiry (NRC, 1996), national (NRC, 1996) and state science teaching standards, instructional strategies, science education safety standards, assessment in science education, classroom management, equity in science education, using animals in your classroom and local science education resources. The NSES Content Standards: K-4 (NRC, 1996, p. 120-141) was one of the assigned readings for the course. Three reflection assignments were associated with this document. Other written assignments and a science journal were also part of the course. The preservice elementary teachers' science journals consisted of all the course's lecture notes, periodic spontaneous written class reflections, and observations (i.e., systematic observations) and data collected (i.e., fair test) during the in-class scientific inquiry activities the preservice elementary teachers performed. The data the preservice elementary teachers collected during these activities were used to construct tables and charts in their science journals. Lastly, the science journals were also used by the preservice elementary teachers to record written observations that occurred during their public school teaching internship.

Instruments

The Positive Science Teaching Case Narrative (PSTCN) and Negative Science Teaching Case Narrative (NSTCN) (Wagler, 2010) are science teaching case narratives that are used to evaluate the level of acceptance of scientific inquiry (NRC, 1996) teaching in preservice elementary teachers. The PSTCN reflects a classroom teaching event modeled after the processes of scientific inquiry (see the scientific inquiry section) as defined by the National Science Education Standards (NRC, 1996). The NSTCN reflects a classroom event where human cooperation, human exchange of ideas and human group participation were not encouraged or allowed by the teacher. None of the processes of scientific inquiry as defined by the National Science Education Standards (NRC, 1996) occur within this science teaching case narrative. The PSTCN and the NSTCN have the same six science teaching case narrative rating questions. These questions are answered after the science teaching case narratives are read twice. The science teaching case narrative rating questions were developed so the preservice elementary teachers could evaluate the scientific inquiry events that occurred in the classroom environment. See Wagler (2010) for a thorough review of the development and use of the PSTCN, NSTCN and the science teaching case narrative rating questions. It is suggested (Wagler, 2010) that the attributes of the teacher presented in the science teaching case narratives can be modified to be similar to the attributes of the specific preservice elementary teachers in the course you are teaching (pp. 224-225), but this was not assessed in the original study (Wagler, 2010). Based upon this suggestion, both the PSTCN and NSTCN were modified (see Appendix) so that the attributes of the science teacher presented in both the PSTCN and NSTCN were similar to the most common attributes of the preservice elementary teachers in the elementary science methods courses that were being evaluated as part of the study. Preliminary demographic data were collected on the preservice teachers before they completed the PSTCN and NSTCN pretest. These data included the preservice elementary teachers' first name, last name, age, grade level they planned to teach, gender and ethnicity.



The preservice elementary teachers were also asked to provide the three most common girl/boy first names of the students in the main public school internship class they observed/taught in. The most common first name among the study's preservice elementary teachers was Maria, the most common last name was Garcia, the average age was 26, the most common grade level the preservice elementary teachers planned to teach was third, the most common gender was female and the most common ethnicity was Hispanic. As in the original (Wagler, 2010) study, the fictitious teacher presented in this study's science teaching case narratives was based on the collected demographic data of the preservice elementary teachers enrolled in the elementary science methods courses. For example, the teacher's name in the original science teaching case narratives (Wagler, 2010) was changed from Jennifer Lewis to Maria Garcia, as were the grade she taught (i.e., third) and her age (i.e., 26). These changes were based on the most common demographic data of the preservice elementary teachers participating in the study. Some of the students' names in the case narratives were also changed to common Hispanic first names (e.g., Juan, Ana or Carlos) based on the most common student first names in the main public school internship class the preservice elementary teachers observed/taught in. For example, Kelly was changed to Ana and Jeff was changed to Carlos. These changes were based on the researchers' knowledge that all of the preservice elementary teachers in the study were participating in public school internships that allowed them to observe/teach in classrooms with predominantly Hispanic students. All of the changes that were made to the original PSTCN and NSTCN (Wagler, 2010) were performed so that the preservice elementary teachers were able to better relate and find commonality with the teacher presented in the case narratives (Kazdin, 1974; Suls & Miller, 1977) (Wagler, 2010, p. ). The PSTCN, NSTCN and the case narrative rating questions (see Appendix) reflect these changes.

Study Procedure

Treatment Group

The PSTCN and the NSTCN were administered on the first day of the elementary science methods course before any information had been presented (i.e., pretest). During the pretest the preservice elementary teachers were randomly assigned either the PSTCN or the NSTCN. The positive and negative science teaching case narratives were randomized and, based on this randomization, were distributed to the preservice elementary teachers. Through this process 76 (n=76) preservice elementary teachers received the PSTCN and 80 (n=80) preservice elementary teachers received the NSTCN. The students were then asked to read their science teaching case narrative twice, either the PSTCN or the NSTCN, and answer the six science teaching case narrative rating questions. The PSTCN and the NSTCN were administered again on the last day of the elementary science methods course after all information had been presented (i.e., posttest). The preservice elementary teachers who had received the PSTCN for the pretest (n=76) received the PSTCN for the posttest, and the preservice elementary teachers who had received the NSTCN for the pretest (n=80) received the NSTCN for the posttest. For both the pretest and the posttest the preservice elementary teachers were unaware that there were two versions of the science teaching case narrative and that one was positive (i.e., PSTCN) and one was negative (i.e., NSTCN).

Control Group

The study procedure for the control group was the same as for the treatment group. Based on the random assignment of the PSTCN or the NSTCN, 42 (n=42) preservice elementary teachers received the PSTCN and 44 (n=44) preservice elementary teachers received the NSTCN. The preservice elementary teachers who had received the PSTCN for the pretest (n=42) received the PSTCN for the posttest, and the preservice elementary teachers who had received the NSTCN for the pretest (n=44) received the NSTCN for the posttest.

Results

Validity and Reliability of the PSTCN and NSTCN

The validity of the PSTCN and NSTCN was evaluated using exploratory factor analytic (EFA) models estimated in the open source software package R (R Development Core Team, 2008). For both the PSTCN and NSTCN, the eigenvalues were estimated based on a polychoric correlation matrix (a correlation matrix more suitable than the Pearson correlation matrix for ordinal data) (Uebersax, 2009). The first eigenvalues were 4.09 and 4.25 for the PSTCN and NSTCN, respectively. All remaining eigenvalues were smaller than 1 for both instruments. The uni-dimensional exploratory factor analytic models, estimated in R using factanal and based on the polychoric correlation matrix, exhibited good fit for both the PSTCN and NSTCN. For the PSTCN, the loadings ranged from 0.627 to 0.918 and the one-factor model accounted for 61.2% of the score variability. For the NSTCN, the loadings ranged from 0.561 to 0.907 and the one-factor model accounted for 65.6% of the score variability. According to these results, the PSTCN and NSTCN appear to be valid measures of Hispanic preservice teachers' level of acceptance of scientific inquiry. The internal consistency of the PSTCN and NSTCN was assessed via Cronbach's alpha based on the polychoric correlation matrix. This resulted in lower bounds on reliability of 0.906 and 0.915 for the PSTCN and NSTCN, respectively. These statistics provide adequate evidence that the PSTCN and NSTCN are reliable measures of the levels of acceptance of scientific inquiry.
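The analysis described above can be reproduced in outline with a few lines of R. The sketch below is only an illustration of the approach, not the authors' original script: it assumes a hypothetical data frame named ratings whose six columns hold the 1-5 responses to the rating questions, and it uses the psych package for the polychoric correlation matrix and Cronbach's alpha together with base R's factanal for the one-factor model.

```r
# Illustrative outline of the validity/reliability analysis (hypothetical data frame 'ratings')
library(psych)

pc <- polychoric(ratings)$rho     # polychoric correlation matrix for the six ordinal items

eigen(pc)$values                  # eigenvalues; a dominant first value suggests one factor

efa1 <- factanal(covmat = pc, factors = 1, n.obs = nrow(ratings))
efa1$loadings                     # loadings of the uni-dimensional model

alpha(pc)                         # Cronbach's alpha computed from the polychoric matrix
```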
Table 1 presents the pretest, posttest, and overall Likert scale means and standard deviations for each of the questions associated with the PSTCN Treatment Group (n=76), NSTCN Treatment Group (n=80), PSTCN Control Group (n=42) and NSTCN Control Group (n=44).

Table 1. Likert Scale Pretest and Posttest Means (Standard Deviations) for Questions 1-6 and the Overall Mean

Group                              1            2            3            4            5            6            Overall Mean
PSTCN Treatment Pretest (N=76)     3.658 (1.21) 3.039 (1.41) 3.171 (1.46) 3.947 (1.06) 3.658 (1.22) 3.987 (1.01) 3.577 (1.28)
PSTCN Treatment Posttest (N=76)    4.447 (0.79) 4.461 (0.81) 4.421 (0.88) 4.461 (0.90) 4.500 (0.77) 4.566 (0.79) 4.476 (0.82)
NSTCN Treatment Pretest (N=80)     1.363 (0.80) 1.250 (0.65) 1.325 (0.67) 1.275 (0.57) 1.650 (0.90) 1.375 (0.62) 1.373 (0.72)
NSTCN Treatment Posttest (N=80)    1.238 (0.53) 1.225 (0.55) 1.238 (0.51) 1.200 (0.56) 1.275 (0.48) 1.188 (0.51) 1.227 (0.52)
PSTCN Control Pretest (N=42)       3.761 (1.19) 3.142 (1.30) 3.357 (1.39) 4.119 (1.02) 3.810 (1.17) 4.143 (1.07) 3.722 (1.24)
PSTCN Control Posttest (N=42)      3.833 (1.25) 3.214 (1.52) 3.286 (1.49) 3.881 (1.15) 3.571 (1.40) 3.905 (1.08) 3.615 (1.34)
NSTCN Control Pretest (N=45)       1.31 (0.73)  1.222 (0.60) 1.288 (0.63) 1.266 (0.58) 1.666 (0.95) 1.288 (0.59) 1.340 (0.70)
NSTCN Control Posttest (N=45)      1.222 (0.42) 1.200 (0.50) 1.178 (0.39) 1.200 (0.55) 1.178 (0.39) 1.156 (0.47) 1.189 (0.45)

In order to assess the impact of the science methods course, the data were analyzed using a repeated measures model on the mean scores for the PSTCN and NSTCN in which the treatment/control groups are considered a fixed factor. Additionally, a mixed repeated measures model was utilized in which the treatment/control groups are regarded as a random factor. The mixed model is employed because the university placed these students into the groups in a manner not likely to affect the results but ultimately using a non-random procedure. First, the conventional repeated measures model is examined. Table 2 presents the repeated measures ANOVA results for the case narrative data. There is a statistically significant three-way interaction between the test time (i.e., pretest or posttest), case narrative (i.e., positive or negative), and group (treatment or control) variables (F=13.271, p-value<0.001). This interaction implies there is a difference in how students respond between the pretest and the posttest mean scores that depends on which case narrative they read and to which group they were assigned. Overall, for any difference in the group or case narrative variables, the model implies a difference in the pretest-to-posttest change. Given that the interaction between test time, case narrative, and group is significant, the lower level interaction effects and main effects for case narrative and group are not particularly meaningful.
Table 2. Repeated Measures ANOVA on Attitude Means

Model Effects                          df    SS        MS        F          P-value
Test Time                              1     4.23      4.23      8.042      0.005
Group                                  1     4.60      4.60      8.747      0.003
Case Narrative                         1     827.540   827.540   1573.741   <0.001
Group:Case Narrative                   1     2.90      2.90      5.517      0.019
Test Time:Group                        1     6.80      6.80      12.935     <0.001
Test Time:Case Narrative               1     14.32     14.32     27.237     <0.001
Test Time:Group:Case Narrative         1     6.98      6.98      13.271     <0.001
Error                                  478   251.35    0.53
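For readers who want to see how such a model might be specified, the sketch below shows one common way to fit a repeated measures ANOVA of this form in R. It is an illustration only, not the authors' exact code, and it assumes a hypothetical long-format data frame d with one row per response and columns score, time (pretest/posttest), narrative (PSTCN/NSTCN), group (treatment/control), and subject.

```r
# Illustrative repeated measures ANOVA (hypothetical long-format data frame 'd')
d$time      <- factor(d$time)
d$narrative <- factor(d$narrative)
d$group     <- factor(d$group)
d$subject   <- factor(d$subject)

# Within-subject factor: time; between-subject factors: group and case narrative
fit <- aov(score ~ time * group * narrative + Error(subject/time), data = d)
summary(fit)

# Tukey's HSD over the eight time-by-group-by-narrative cells, as used for the
# mean comparisons reported below (compare the HSD of 0.208 in the text)
d$cell <- interaction(d$time, d$group, d$narrative)
TukeyHSD(aov(score ~ cell, data = d))
```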



Figure 1 presents the pretest (i.e., 1) and posttest (i.e., 2) overall Likert scale means for the PSTCN Treatment Group (n=76), NSTCN Treatment Group (n=80), PSTCN Control Group (n=42) and NSTCN Control Group (n=44).

[Figure 1 appears here: a bar chart titled "Overall Likert Scale Means," plotting score (0 to 5) at pretest and posttest for the PSTCN Treatment, NSTCN Treatment, PSTCN Control, and NSTCN Control groups.]
Figure 1. Pretest and Posttest Overall Likert Scale Means for all Groups

Table 3 contains the estimated mean differences between the pretest and posttest times for each of the group by case narrative combinations. The means from the three-way interaction effects are compared while controlling for multiplicity using Tukey's honest significant difference (HSD). The HSD was computed to be 0.208. Any difference in the means greater than the magnitude of the HSD (i.e., 0.208) is deemed statistically different, and any difference less in magnitude than the HSD is not statistically different. The PSTCN treatment posttest-pretest mean difference is statistically significant according to this criterion.

Table 3. Posttest-Pretest Mean Differences for each Group by Science Teaching Case Narrative Treatment Combination

                PSTCN                              NSTCN
Treatment       Posttest - Pretest = 0.899         Posttest - Pretest = -0.110
Control         Posttest - Pretest = -0.150        Posttest - Pretest = -0.150



A Mixed Repeated Measures Model

The university divided the preservice elementary teachers into the treatment and control groups by the sections to which the students were assigned. Most likely the lack of true randomization did not bias the results of the repeated measures ANOVA. However, we analyzed the data with the Group and Group by Case Narrative interaction variables included as random effects rather than fixed effects in the model (i.e., a mixed repeated measures model). Analysis demonstrated that the model with just the Group variable included as a random effect provides the most parsimonious fit; thus, the model with Time, Case Narrative, and the Time by Case Narrative interaction included as fixed effects and Group included as a random effect, along with random error, is assumed for the mixed analysis. As a consequence of including Group as a random effect, the overall experimental variance is partitioned into Group and random error variability so that it does not bias the resulting F tests. When the mixed model is analyzed, the F test for the interaction between Time and Case Narrative is still significant (F=25.8605, p-value<0.001). The likelihood ratio test for evaluating whether the variability due to Group is different from 0 is marginally significant with a type I error rate equal to 0.05 (L=4.309, p-value=0.0379). The significant likelihood ratio test implies that there is substantial variability introduced into the study by the Group factor. However, whether Group is a fixed or random factor does not affect the F tests for significance or the conclusions made for this study. As a consequence, the previous repeated measures ANOVA yields similar results to the mixed analysis; namely, there is an observed interaction between the factors and this interaction depends on whether the subject belongs to the treatment (i.e., scientific inquiry) group or the control.
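As a sketch of how such a mixed specification might look in R (again assuming the hypothetical long-format data frame d from the earlier example, and not reproducing the authors' exact code), the lme4 package can fit the model with Group as a random effect, and a likelihood ratio statistic can compare it against a model without that random effect.

```r
# Illustrative mixed repeated measures model (hypothetical data frame 'd')
library(lme4)

m_mixed <- lmer(score ~ time * narrative + (1 | group), data = d, REML = FALSE)
m_fixed <- lm(score ~ time * narrative, data = d)

# Likelihood ratio statistic for the Group random effect (compare with L = 4.309 above)
lrt <- as.numeric(2 * (logLik(m_mixed) - logLik(m_fixed)))
pchisq(lrt, df = 1, lower.tail = FALSE)

summary(m_mixed)   # fixed-effect estimates, including the time-by-narrative interaction
```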



Discussion

This study was conducted with a specific population, USA Hispanic preservice elementary teachers. This is an important population for teacher education programs to consider, for three reasons: 1) Hispanic students are a growing USA college student demographic; 2) Hispanic students are a growing USA public school (i.e., K-12) student demographic; and 3) there is a serious shortage of Hispanic teachers in USA public education. Hispanics are the fastest growing demographic in the USA and are predicted to be a majority in the USA by the year 2042 (USA Census Bureau, 2008). In October 2007, 12% of USA college students were of Hispanic origin, an increase from 10% in 2005, and the proportion is projected to increase further (USA Census Bureau, 2007). It can be inferred that just as the number of Hispanic USA college students has increased (and will continue to increase), so has (and will) the number of USA Hispanic preservice elementary teachers. The population of Hispanic students is also increasing in USA public schools. The proportion of USA public school K-12 students who are English language learners (ELLs) has gone from 1 in 20 in 1990 to 1 in 9 in 2008 (Goldberg, 2008). Goldberg reports that ELLs in the United States come from 400 different language backgrounds, but 80% are Spanish speakers (Lesser & Windsor, 2009, p. 5). Researchers suggest a need for cultural compatibility between public school students and at least some of the teachers in the schools (Delpit, 1996; Haberman, 1988; Ladson-Billings, 1997; Nieto, 2000; Valenzuela, 1999) (Battle & Cuellar, 2006, p. 2). In many USA states, including the state of Texas, there is extreme disparity between the proportions of Hispanic students and educators (USA Census Bureau, 2007). The Texas Education Agency (2004) reported that of the 80,000 new students received annually in public schools, 57.6% are Hispanic, and of those, 75% demonstrate limited English proficiency. At the same time, only 12.9% of the teachers in the state are Hispanic (Battle & Cuellar, 2006). To date, this is the first research study conducted on this demographic of preservice teachers with regard to acceptance of scientific inquiry.

Based on the data of this study (i.e., a pre/post randomized design with a control group), there is a difference in how the preservice elementary teachers responded between the pretest and the posttest that was dependent on which case narrative they read and to which group they were assigned (see Table 2). Only the overall means for the PSTCN treatment group achieved a statistically significant difference between the pretest and posttest scores (see Figure 1). Since there is a significant positive upward trend from the pretest (i.e., 3.577, see Table 1) to the posttest (i.e., 4.476, see Table 1) for only the overall means for the PSTCN treatment group, we can conclude that the elementary science methods course caused an increase in the level of acceptance of scientific inquiry (NRC, 1996) teaching in the preservice elementary teachers. All of the other groups (i.e., PSTCN control, NSTCN treatment and NSTCN control) experienced no change, from pretest to posttest, in the level of acceptance of scientific inquiry (NRC, 1996) teaching in preservice elementary teachers.
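The pattern described above can be read directly from a cell-means summary of the ratings. The sketch below, again using the hypothetical ratings data frame and an assumed condition column ("treatment"/"control") rather than the authors' actual code, computes the pretest and posttest means for each group by case narrative cell and the pre-to-post change; for the PSTCN treatment cell the reported values give a change of 4.476 - 3.577 = 0.899 on the 5-point scale, while the other three cells showed no change.

    # Minimal sketch: mean rating per condition x narrative x time cell,
    # then the pre-to-post change for each cell.
    cell_means <- aggregate(score ~ condition + narrative + time,
                            data = ratings, FUN = mean)
    wide <- reshape(cell_means, idvar = c("condition", "narrative"),
                    timevar = "time", direction = "wide")
    wide$change <- wide$score.post - wide$score.pre
    wide  # e.g., PSTCN treatment: 4.476 - 3.577 = 0.899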

Implications

These findings have direct implications for the use of the PSTCN and NSTCN with preservice elementary teacher populations in elementary science methods courses that place special emphasis on scientific inquiry teaching (NRC, 1996). The results presented in this article demonstrate that the PSTCN and NSTCN may be modified for other preservice teacher populations (such as Hispanic populations) and remain valid and reliable instruments for assessing the level of acceptance of scientific inquiry.

PSTCN2

This study found that the PSTCN (Wagler, 2010) can be modified (see Appendix) for Hispanic populations of preservice elementary teachers and still remain a valid and reliable instrument for evaluating preservice elementary teachers' acceptance of scientific inquiry teaching as defined by the National Science Education Standards (NRC, 1996). This study also found that the elementary science methods course had a statistically significant positive effect on the PSTCN scores of the preservice elementary teachers.

NSTCN2

This study found that the NSTCN (Wagler, 2010) can also be modified (see Appendix) for Hispanic populations of preservice elementary teachers and still remain a valid and reliable instrument for evaluating preservice elementary teachers' acceptance of non-scientific inquiry teaching. Additionally, this article provides evidence that the elementary science methods course (i.e., the treatment) had no effect on the NSTCN scores of the preservice elementary teachers. The preservice elementary teachers entered the elementary science methods course able to recognize negative teaching and completed the elementary science methods course with the same level of recognition of negative teaching.



It is unclear, from this study, what prior factors influenced the preservice elementary teachers' recognition abilities associated with negative teaching. Further research is needed to identify which prior factors, and to what degree these prior factors, have influenced the preservice elementary teachers' recognition abilities associated with negative teaching. It can be speculated that the preservice elementary teachers' own past negative educational experiences and/or their current teacher education program equipped them with the abilities needed to recognize negative teaching.

Effective Overarching Characteristics of the Elementary Science Methods Course

Evidence from scientific inquiry research3 has presented a strong case for how scientific inquiry (NRC, 1996) teaching can construct a highly effective science learning environment with positive student learning outcomes. Efficacious training of preservice and inservice teachers in constructing these environments is essential to the implementation of highly effective scientific inquiry (NRC, 1996) environments in schools. This study found that the elementary science methods course, which placed special emphasis on scientific inquiry (NRC, 1996) teaching, increased the preservice elementary teachers' level of acceptance of scientific inquiry teaching by 0.899 on a 5-point Likert scale (see Appendix for the scale). This was an overall mean upward movement from the Good category to the Very Good category. Based on this upward trend from 3.577 to 4.476, the data provide evidence that the overarching characteristics of the elementary science methods course were effective in increasing the preservice elementary teachers' level of acceptance of scientific inquiry teaching (NRC, 1996). These four overarching course characteristics can guide instructors of elementary science methods courses with a scientific inquiry (NRC, 1996) teaching component.

One: A Holistic Approach

All of the components of the elementary science methods course (e.g., scientific inquiry teaching lessons, scientific inquiry activities, reflections, science journals, public school internship teaching experiences, class discussions) were influenced directly or indirectly by the National Science Education Standards (NRC, 1996) understanding of scientific inquiry. In a practical sense, this facilitated a classroom environment in which the science methods instructor guided the preservice elementary teachers in experientially understanding how scientific inquiry can become an essential component of all aspects of a highly effective learning environment. This continued repetition, throughout the semester, generated an environment in which the preservice elementary teachers began to comprehend the difficulties and complexities of constructing a highly effective scientific inquiry environment where every student was held to high expectations and achieved maximum learning. This knowledge and these experiences were then utilized and challenged when the preservice elementary teachers taught their scientific inquiry lessons in the public school.



Two: From Low to High Levels of Complexity

Most of the preservice elementary teachers self-reported that when they entered the course they believed they had low levels of scientific knowledge (i.e., content and process) and low levels of confidence associated with developing and teaching science lesson plans. Based on this information, the criteria and components of each lesson progressively increased in grade level, complexity, point value (i.e., Lesson 1: 5% of the total grade; Lesson 2: 15% of the total grade; and Lesson 3: 25% of the total grade) and instructor expectation. This progressive complexity was coupled with the repetitive nature of the lessons (i.e., Lessons 1-3); continued feedback from the instructor; participation in scientific inquiry activities that focused on science knowledge (i.e., content and process); and continued exposure to educational tools that could make their lessons and pedagogical techniques more effective. All of these components were put in place so the preservice elementary teachers could develop and teach their lessons in a positive, safe environment conducive to increasing their levels of scientific knowledge (i.e., content and process), scientific inquiry (NRC, 1996) and teaching self-confidence. After this, the preservice elementary teachers were removed from this safe environment and taught one of the lessons they had developed in the public school where they were interning. The preservice elementary teachers then brought this experience, in written form, back to the elementary science methods course for discussion.

Three: Instructor Modeling

Another characteristic of the elementary science methods course was that the instructor modeled scientific inquiry teaching throughout the semester instead of just lecturing about scientific inquiry teaching. This was considered an essential component for two main reasons. First, as scientific inquiry research3 shows, students tend to learn more when they are allowed to do something rather than only hearing about it. Second, when an individual sees another individual successfully model a given event (i.e., scientific inquiry teaching), the observer's efficacy beliefs are typically raised. This is especially true if the individual observed is deemed competent by the observer. Competence at a given task, activity or event has been shown to be more effective at increasing efficacy than the age of the model, sex of the model or other personal characteristics (Bandura, 1997). Model competence is an especially influential factor "when observers have a lot to learn and models have much they can teach them by instructive demonstration of skills and strategies" (Bandura, 1997, p. 101).

Four: Public School Teaching Experiences

The processes of scientific inquiry1, as they are presented, consist of five simple statements. The effective implementation of those processes in a real-world situation (e.g., a public school classroom) is complex and often difficult even for an experienced teacher who has received continuing professional development training in scientific inquiry. Educational researchers (Wagler & Moseley, 2005a) have suggested that public school teaching experiences (i.e., observations and teaching) should be a part of science methods courses. Public school teaching experiences were an ongoing, integrated component of the elementary science methods course. These experiences have an authenticity that hypothetical situations within the methods class (e.g., role playing and microteaching) lack.
These experiences, which were brought back to the class in written form, allowed for rich discussions on many pertinent topics such as scientific inquiry (NRC, 1996) teaching, classroom management, instructional strategies and student engagement.



Concluding Remarks

The results of this study show that the PSTCN can be modified for Hispanic populations of preservice elementary teachers and remain a valid instrument for evaluating preservice elementary teachers' level of acceptance of scientific inquiry teaching; that the elementary science methods course had no effect on the NSTCN scores of the preservice elementary teachers; that the elementary science methods course had a statistically significant positive effect on the PSTCN scores of the preservice elementary teachers; and that the preservice elementary teachers' level of acceptance of scientific inquiry teaching increased from the Likert scale category of Good to Very Good. The need for highly qualified Hispanic preservice elementary teachers is apparent. This preservice teacher population's acceptance of scientific inquiry teaching is important to evaluate because of the highly effective nature of scientific inquiry teaching3 (NRC, 1996). This study presents a simple, efficient, valid and reliable instrument (i.e., the PSTCN and NSTCN) for evaluating the effect of elementary science methods experiences in this growing population and presents evidence for expanded use of the instrument with preservice elementary teachers in general. Effective overarching science methods course characteristics have also been presented that can further guide instructors of elementary science methods courses in increasing the level of acceptance of scientific inquiry teaching in their own preservice elementary teachers. It is ultimately the desire of the authors that these resources positively impact preservice elementary teachers, who, in turn, will construct highly effective scientific inquiry (NRC, 1996) learning environments that will serve the specific needs of their students.

Appendix

Positive Science Teaching Case Narrative (PSTCN)

Please read the paragraph below two times and answer the questions that follow:

Maria Garcia is a twenty-six year old elementary school teacher who teaches third grade. It is the beginning of science class and today Miss Garcia has developed a scientific inquiry plant activity for the students to work on instead of the worksheets she used last year. She is a little nervous about trying this new teaching method but feels it is the best way to get the students doing science versus learning about science. Yesterday she put the students in groups and spent the hour letting them develop a plant question they could investigate. She now feels that today the students are ready to develop their own investigations. Miss Garcia begins class by placing the students back in their groups and showing the students the materials and equipment they will be able to use for the investigations they design. She then lets them begin working on developing their investigations. Many of the students in the class are visibly excited about the chance to develop an investigation and the room is instantly filled with lively discussion about what they should do for their group investigations. "This will be cool!" one of the students says. "My Mom told me plants can make their own food. We should investigate that!" another states. As the students work, Miss Garcia moves from group to group and assists the students with the investigations they are beginning to develop. "Miss Garcia, do you think we can develop an investigation using the pH meter you showed us?" Ana asks.



"Sure, why don't you see if you can first find out what pH is and then think about how that might affect your plant. Once you know that you can set up and perform your investigation. Please let me know if you have other questions." Miss Garcia moves on to another group. "Our group wants to try and find out what soil our plants grow best in. We were going to use the soil testing kit to analyze different soil types. Do you think we can do that?" Juan asks. Miss Garcia looks at their investigation question and their investigation. "This looks very good! Why don't you begin your investigation." Carlos raises his hand and Miss Garcia approaches his group. "We want to do something with plants and wind erosion. Can I bring a fan from home so that we can test that?" Carlos asks. "I have one in the closet that your group can use. Why don't you go get it. Can I look at your group's investigation?" Miss Garcia asks. "This looks great! See if you and your group can figure out a way to measure the wind speed from your fan. After you have figured out a way to measure wind speed, begin your investigation." As the hour progresses, the room is filled with active discussion about the data the students are collecting from their investigations. "Wow! I didn't know that would happen," Elizabeth declares. "What did you get when you changed the wind speed?" asks Monica. "It looks like we're getting more erosion," explains Carlos. Miss Garcia continues to move throughout the room, assisting the groups as they perform their investigations. "I want to congratulate all the groups for working so hard and staying on task. Science class is almost over but tomorrow we will continue to perform our investigations. Please find a good stopping place in your investigations and get ready for math class," Miss Garcia states.

Negative Science Teaching Case Narrative (NSTCN)

Please read the paragraph below two times and answer the questions that follow:

Maria Garcia is a twenty-six year old elementary school teacher who teaches third grade. It is the beginning of science class and today's lesson is on plants. Miss Garcia has decided to use worksheets for the lesson. Miss Garcia gets up from her desk and begins to hand out the worksheets for the students to complete. The worksheets consist of terms to define and multiple choice questions. She instructs the students that the answers to the plant worksheets are in their science textbook and they are to work alone to find these answers. The students begin looking through their science textbooks for the answers as Miss Garcia begins to grade papers. The room is silent except for the sound of pages being turned. After about fifteen minutes Catherine, a girl in the front row, raises her hand. Miss Garcia calls upon Catherine. "Miss Garcia, I can't find one of the definitions. What should I do?" Catherine asks. "Why don't you use the index in the back of your textbook to look up the term," Miss Garcia replies. Catherine begins to try and find the index in her textbook as Miss Garcia goes back to grading papers. Some of the students in the class begin to talk quietly and Miss Garcia asks them to stop talking. The room returns to silence and the students continue to work on their worksheets. Carlos, a boy in the second row, is having trouble answering question 4 on his worksheet. "What did you get for number 4?" he whispers to Rosa, the girl sitting next to him. "I got photosynthesis." "What is photosynthesis?" Carlos asks. "I don't know," explains Rosa.
Miss Garcia looks up and notices that two students in the third row are passing notes. She decides not to do anything because they are remaining quiet. The rest of the class continues to work on their worksheets.



The class period is now coming to an end and some of the students have finished their worksheets. They begin to quietly bring them to the front of the room, place them on Miss Garcia's desk and return to their seats. "Science class is almost over. Please put up your science materials and get out your math books," Miss Garcia states. "If you were not able to finish your science worksheets then this will be part of your homework for tonight. I will collect your worksheets at the beginning of the day tomorrow. Now let's turn to page thirty-four in your math books and begin working on problems one through ten."

Case Narrative Rating Questions

Please rate the paragraph you have just read by addressing the statements below. Indicate your response by circling the appropriate number to the right of the statement.

Rating scale: 1 = Poor, 2 = Fair, 3 = Good, 4 = Very Good, 5 = Excellent

1. Rate the quality of the lesson that Miss Garcia developed. 1 2 3 4 5
2. Rate how well Miss Garcia taught the lesson. 1 2 3 4 5
3. Rate how well the students learned the content of the lesson. 1 2 3 4 5
4. Rate how well Miss Garcia engaged the students in the lesson. 1 2 3 4 5
5. Rate how well Miss Garcia managed the classroom (students, materials and time). 1 2 3 4 5
6. Rate how well Miss Garcia created a positive learning environment. 1 2 3 4 5
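For instructors who adopt the instrument, scoring can be automated in a few lines. The sketch below is a minimal illustration in R, assuming, consistent with the overall means reported in the body of the article, that a respondent's score for a case narrative is summarized on the same 1-5 scale as the mean of the six ratings; the column names and sample responses are hypothetical.

    # Minimal sketch: ratings from two hypothetical respondents on the six items.
    responses <- data.frame(
      q1 = c(4, 5), q2 = c(4, 4), q3 = c(3, 5),
      q4 = c(4, 5), q5 = c(4, 4), q6 = c(5, 5)
    )
    # Each respondent's case-narrative score is the mean of the six 1-5 ratings.
    responses$case_score <- rowMeans(responses[, paste0("q", 1:6)])
    responses$case_score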

Notes

1. Modified version of the processes of scientific inquiry from the National Science Education Standards (NRC, 1996, pp. 122, 145) for grades K-8. Modified by Carin, Bass & Contant, 2005, p. 21.

2. For the specific details associated with administering and assessing the PSTCN and NSTCN in your course, see Wagler, 2010. The specific details associated with administering and assessing the PSTCN and NSTCN (Wagler, 2010) also apply to using the instruments presented in the Appendix.

3. For a more thorough review of this scientific inquiry research, see NRC, 2000, pp. 114-129 and NRC, 2007, pp. 251-295.


Amy Wagler is an assistant professor at the University of Texas at El Paso with a PhD in statistics from Oklahoma State University. Her areas of research are generalized linear models (GLM), generalized linear mixed models (GLMM) and simultaneous inference in these settings. Current research includes dose response models and simultaneous inferences on the mean or on any function of the model parameters. Application of GLM and GLMM models to educational data is also a research focus. She is currently investigating the performance of simultaneous tests of measurement invariance of educational instruments across populations when independence assumptions are violated. She is also developing interval-based methods of assessing heterogeneity in mixed models with hierarchical structures, such as classroom settings. Her focus in educational studies is the effective use of experimental designs when assessing interventions and learning outcomes.

Dr. Ron Wagler is an Assistant Professor of Science Education at the University of Texas at El Paso with a PhD in environmental science from Oklahoma State University. He teaches undergraduate and graduate science education and environmental education courses. His current research interests include human-arthropod relationships, teacher efficacy, environmental education, arthropod education, scientific inquiry, evolution education and Madagascar hissing cockroach curriculum development.

References
Bandura, A. (1997). Self-efficacy: The exercise of control. New York: W. H. Freeman.
Battle, J., & Cuellar, R. (2006). Obstacles to overcome: Mexican American pre-service teachers share their insights. National Forum of Multicultural Issues Journal - Electronic, 3(2), 1-14.
Carin, A. A., Bass, J. E., & Contant, T. L. (2005). Teaching science as inquiry. Upper Saddle River, NJ: Pearson Merrill Prentice Hall.
Delpit, L. (1996). Other people's children: Cultural conflict in the classroom. New York: The New Press.
Goldberg, C. (2008). Teaching English language learners: What the research does - and does not say. American Educator, 33(2), 8-19, 22-23, 42-44.
Haberman, M. (1988). Proposals for recruiting minority teachers: Promising practices and attractive detours. Journal of Teacher Education, 29(4), 38-44.
Howe, A. C., & Nichols, S. E. (2001). Case studies in elementary science: Learning from teachers. Upper Saddle River, NJ: Merrill Prentice Hall.
Kazdin, A. E. (1974). Covert modeling, model similarity, and reduction of avoidance behavior. Behavioral Therapy, 5, 325-340.
Koballa, T. R., & Tippins, D. J. (2004). Cases in middle and secondary science education: The promise and dilemmas. Upper Saddle River, NJ: Pearson Merrill Prentice Hall.
Ladson-Billings, G. (1997). The dreamkeepers: Successful teachers of African-American children. San Francisco, CA: Jossey-Bass.

Lesser, M. L., & Windsor, S. M. (2009). English language learners in introductory statistics: Lessons learned from an exploratory case study of two pre-service teachers. Statistics Education Research Journal, 8(2), 5-32.
Miller, B., & Kantrov, I. (1998). A guide to facilitating cases in education. Portsmouth, NH: Heinemann.
National Research Council (NRC). (1996). National Science Education Standards. Washington, DC: National Academies Press.
National Research Council (NRC). (2000). Inquiry and the national science education standards: A guide for teaching and learning. Washington, DC: National Academies Press.
National Research Council (NRC). (2007). Taking science to school: Learning and teaching science in grades K-8. Washington, DC: National Academies Press.
Nieto, S. (2000). Affirming diversity: The sociopolitical context of multicultural education. New York: Longman.
R Development Core Team. (2008). R: A language and environment for statistical computing. Vienna, Austria: R Foundation for Statistical Computing.
Suls, J. M., & Miller, R. L. (1977). Social comparison processes: Theoretical and empirical perspectives. Washington, DC: Hemisphere.
Uebersax, J. S. (2009). The tetrachoric and polychoric correlation coefficients. Retrieved June 16, 2010, from Statistical Methods for Rater and Diagnostic Agreement Web site: http://www.john-uebersax.com/stat/tetra.ht
USA Census Bureau. (2007). Retrieved October 14, 2010 from: http://factfinder.census.gov/servlet/DatasetMainPageServlet?_program=DEC&_submenuId=datasets_2&_lang=en&_ts
USA Census Bureau. (2008). An older and more diverse nation by mid-century. Retrieved October 15, 2010 from: http://www.census.gov/newsroom/releases/archives/population/cb08-123.html
USA Census Bureau. (2010). Retrieved May 17, 2010 from: http://factfinder.census.gov/servlet/DatasetMainPageServlet?_program=DEC&_submenuId=datasets_2&_lang=en&_ts
Valenzuela, A. (1999). Subtractive schooling: U.S. Mexican American youth and the politics of caring. New York: State University of New York Press.
Wagler, R. (2009). Chow down! Using Madagascar hissing cockroaches to explore basic nutrition concepts. Science Scope, 32(7), 12-18.
Wagler, R. (2010). Using science teaching case narratives to evaluate the level of acceptance of scientific inquiry teaching in preservice elementary teachers. The Journal of Science Teacher Education, 21(2), 215-226. doi:10.1007/s10972-009-9160-9
Wagler, R. (2010a). Home sweet home: How to build a Madagascar hissing cockroach habitat out of recycled materials. Science Scope, 33(8), 34-39.
Wagler, R., & Moseley, C. (2005). Cockroaches in the classroom: Incorporating the Madagascar hissing cockroach into your science curriculum. Science Scope, 28(6), 34-37.
Wagler, R., & Moseley, C. (2005a). Preservice teacher efficacy: Effect of a secondary education methods course and student teaching. Teacher Education and Practice, 18(4), 442-457.


The Publications of the Science Teachers Association of Texas Solicit Manuscripts

The Science Teachers Association of Texas (STAT) publishes two periodicals: The Statellite and The Texas Science Teacher. The Statellite is the association's newsletter. It contains innovative science activities, STAT leadership news, and current information on membership benefits. The Texas Science Teacher is a peer-reviewed journal that publishes papers pertinent to science education from all fields of science and science teaching. Contributions can be research articles, research notes, book reviews, and essays of general scientific interest.

For Both Publications: All submitted material must be a significant original contribution not being considered elsewhere for publication. Inform the editor if material included in the article is published on the web, as excessive duplication should be avoided and adequate links must be established. All manuscripts must be written in English. Send an electronic copy of your manuscript to the STAT Editor at stat@bizaustin.rr.com. Include in the e-mail the author name(s), current e-mail and physical address(es), and a contact phone number. Indicate the publication for which the manuscript is submitted. Two referees (reviewers) and the editor review all manuscripts. You will receive communication of original receipt and then of completed reviews. Submissions for both publications should follow the Publication Manual of the American Psychological Association, Fifth Edition.

Guidelines - The submission guidelines are online: http://www.statweb.org/texas-science-teacher/tst-guidelines

Upon Acceptance - Return the edited manuscript as soon as possible as an e-mail attachment to the editor. The manuscript must be returned in strict adherence to the instructions you receive with your manuscript.

Tables and Figures - All tables must be separate files in Microsoft Word format. All images must be separate files in .jpg, .psd, .ai, or other standard format. The file name of each table or figure must relate to its place in the document (e.g., Figure 1.jpg). If submitting pictures, they must be accompanied by a separate file including a caption and the source (i.e., the name of the photographer and/or the image copyright owner) for each image.
