The University of Edinburgh


Critical thinking

Advice and resources to help you develop your critical voice.

Developing critical thinking skills is essential to your success at University and beyond.  We all need to be critical thinkers to help us navigate our way through an information-rich world. 

Whatever your discipline, you will engage with a wide variety of sources of information and evidence.  You will develop the skills to make judgements about this evidence to form your own views and to present your views clearly.

One of the most common types of feedback received by students is that their work is ‘too descriptive’.  This usually means that they have just stated what others have said and have not reflected critically on the material.  They have not evaluated the evidence and constructed an argument.

What is critical thinking?

"Critical thinking is the art of making clear, reasoned judgements based on interpreting, understanding, applying and synthesising evidence gathered from observation, reading and experimentation." Burns, T., & Sinfield, S. (2016). Essential Study Skills: The Complete Guide to Success at University (4th ed.). London: SAGE, p. 94.

Being critical does not just mean finding fault.  It means assessing evidence from a variety of sources and making reasoned conclusions.  As a result of your analysis you may decide that a particular piece of evidence is not robust, or that you disagree with the conclusion, but you should be able to state why you have come to this view and incorporate this into a bigger picture of the literature.

Being critical goes beyond describing what you have heard in lectures or what you have read.  It involves synthesising, analysing and evaluating what you have learned to develop your own argument or position.

Critical thinking is important in all subjects and disciplines – in science and engineering, as well as the arts and humanities.  The types of evidence used to develop arguments may be very different but the processes and techniques are similar.  Critical thinking is required for both undergraduate and postgraduate levels of study.

What, where, when, who, why, how?

Purposeful reading can help with critical thinking because it encourages you to read actively rather than passively.  When you read, ask yourself questions about what you are reading and make notes to record your views.  Ask questions like:

  • What is the main point of this paper/ article/ paragraph/ report/ blog?
  • Who wrote it?
  • Why was it written?
  • When was it written?
  • Has the context changed since it was written?
  • Is the evidence presented robust?
  • How did the authors come to their conclusions?
  • Do you agree with the conclusions?
  • What does this add to our knowledge?
  • Why is it useful?

Our web page covering Reading at university includes a handout to help you develop your own critical reading form and a suggested reading notes record sheet.  These resources will help you record your thoughts after you read, which will help you to construct your argument. 

Reading at university

Developing an argument

Being a university student is about learning how to think, not what to think.  Critical thinking shapes your own values and attitudes through a process of deliberating, debating and persuasion.   Through developing your critical thinking you can move on from simply disagreeing to constructively assessing alternatives by building on doubts.

There are several key stages involved in developing your ideas and constructing an argument.  You might like to use a form to help you think about the features of critical thinking and to break down the stages of developing your argument.

Features of critical thinking (pdf)

Features of critical thinking (Word rtf)

Our webpage on Academic writing includes a useful handout ‘Building an argument as you go’.

Academic writing

You should also consider the language you will use to introduce a range of viewpoints and to evaluate the various sources of evidence.  This will help your reader to follow your argument.  To get you started, the University of Manchester's Academic Phrasebank has a useful section on Being Critical. 

Academic Phrasebank

Developing your critical thinking

Set yourself some tasks to help develop your critical thinking skills.  Discuss material presented in lectures or from resource lists with your peers.  Set up a critical reading group or use an online discussion forum.  Think about a point you would like to make during discussions in tutorials and be prepared to back up your argument with evidence.

For more suggestions:

Developing your critical thinking - ideas (pdf)

Developing your critical thinking - ideas (Word rtf)

Published guides

For further advice and more detailed resources please see the Critical Thinking section of our list of published Study skills guides.

Study skills guides  

Accelerate Learning


Critical Thinking in Science: Fostering Scientific Reasoning Skills in Students

Thinking like a scientist is a central goal of all science curricula.

As students learn facts, concepts, and methods, what matters most is that all their learning happens through the lens of scientific reasoning.

That way, when it comes time for them to take on a little science themselves, either in the lab or by theoretically thinking through a solution, they understand how to do it in the right context.

One component of this type of thinking is being critical. Because it is grounded in facts and evidence, critical thinking in science isn’t exactly the same as critical thinking in other subjects.

Students have to doubt the information they’re given until they can prove it’s right.

They have to truly understand what’s true and what’s hearsay. It’s complex, but with the right tools and plenty of practice, students can get it right.

What is critical thinking?

This particular style of thinking stands out because it requires reflection and analysis. Grounded in what's logical and rational, thinking critically is all about digging deep and going beyond the surface of a question to establish the quality of the question itself.

It ensures students put their brains to work when confronted with a question rather than taking every piece of information they’re given at face value.

It’s engaged, higher-level thinking that will serve them well in school and throughout their lives.

Why is critical thinking important?

Critical thinking is important when it comes to making good decisions.

It gives us the tools to think through a choice rather than quickly picking an option — and probably guessing wrong. Think of it as the all-important ‘why.’

Why is that true? Why is that right? Why is this the only option?

Finding answers to questions like these requires critical thinking: you have to really analyze both the question itself and the possible solutions to establish their validity.

Will that choice work for me? Does this feel right based on the evidence?

How does critical thinking in science impact students?

Critical thinking is essential in science.

It’s what naturally takes students in the direction of scientific reasoning since evidence is a key component of this style of thought.

It’s not just about whether evidence is available to support a particular answer but how valid that evidence is.

It’s about whether the information the student has fits together to create a strong argument and how to use verifiable facts to get a proper response.

Critical thinking in science helps students:

  • Actively evaluate information
  • Identify bias
  • Separate the logic within arguments
  • Analyze evidence

4 Ways to promote critical thinking

Figuring out how to develop critical thinking skills in science means looking at multiple strategies and deciding what will work best at your school and in your class.

Depending on your student population and their needs and abilities, not every option will be a home run.

These particular examples are all based on the idea that for students to really learn how to think critically, they have to practice doing it. 

Each focuses on engaging students with science in a way that will motivate them to work independently as they hone their scientific reasoning skills.

Project-Based Learning

Project-based learning centers on critical thinking.

Teachers can shape a project around the thinking style to give students practice with evaluating evidence or other critical thinking skills.

Critical thinking also happens during collaboration, evidence-based thought, and reflection.

For example, setting students up for a research project is not only a great way to get them to think critically, but it also helps motivate them to learn.

Allowing them to pick a topic (ideally one that isn’t easy to look up online), develop their own research questions, and establish a process to collect data to find an answer lets students personally connect to science while using critical thinking at each stage of the assignment.

They’ll have to evaluate the quality of the research they find and make evidence-based decisions.

Self-Reflection

Adding a question or two to any lab practicum or activity requiring students to pause and reflect on what they did or learned also helps them practice critical thinking.

At this point in an assignment, they’ll pause and assess independently. 

You can ask students to reflect on the conclusions they came up with for a completed activity, which really makes them think about whether there's any bias in their answer.

Addressing Assumptions

One way critical thinking aligns so perfectly with scientific reasoning is that it encourages students to challenge all assumptions. 

Evidence is king in the science classroom, but even when students work with hard facts, there comes the risk of a little assumptive thinking.

Working with students to identify assumptions in existing research, or asking them to address an issue where they suspend their own judgment and simply look at established facts, polishes that critical eye.

They’re getting practice at tossing out opinions, unproven hypotheses, and speculation in exchange for real data and real results, just like a scientist has to do.

Lab Activities With Trial-And-Error

Another component of critical thinking (as well as thinking like a scientist) is figuring out what to do when you get something wrong.

Backtracking can mean you have to rethink a process, redesign an experiment, or reevaluate data because the outcomes don’t make sense, and that’s okay.

The ability to get something wrong and recover is not only a valuable life skill, but it’s where most scientific breakthroughs start. Reminding students of this is always a valuable lesson.

Labs that include comparative activities are one way to increase critical thinking skills, especially when introducing new evidence that might cause students to change their conclusions once the lab has begun.

For example, you might provide students with two distinct data sets and ask them to compare them.

With only two choices, there are a finite number of conclusions to draw, but then what happens when you bring in a third data set? Will it void certain conclusions? Will it allow students to make new conclusions, ones even more deeply rooted in evidence?

Thinking like a scientist

When students get the opportunity to think critically, they’re learning to trust the data over their ‘gut,’ to approach problems systematically and make informed decisions using ‘good’ evidence.

When practiced enough, this ability will engage students in science in a whole new way, providing them with opportunities to dig deeper and learn more.

It can help enrich science and motivate students to approach the subject just like a professional would.


© 2024 Accelerate Learning  

National Academies Press: OpenBook

National Science Education Standards (1996)

Chapter 2: Principles and Definitions


The development of the National Science Education Standards was guided by certain principles. Those principles are

  • Science is for all students.
  • Learning science is an active process.
  • School science reflects the intellectual and cultural traditions that characterize the practice of contemporary science.
  • Improving science education is part of systemic education reform.

Tension inevitably accompanied the incorporation of these principles into standards. Tension also will arise as the principles are applied in school science programs and classrooms. The following discussion elaborates upon the principles and clarifies some of the associated difficulties.

[See Teaching Standard B, Assessment Standard D, Program Standard E, and System Standard E]

SCIENCE IS FOR ALL STUDENTS. This principle is one of equity and excellence. Science in our schools must be for all students: All students, regardless of age, sex, cultural or ethnic background, disabilities, aspirations, or interest and motivation in science, should have the opportunity to attain high levels of scientific literacy.

The Standards assume the inclusion of all students in challenging science learning opportunities and define levels of understanding and abilities that all should develop. They emphatically reject any situation in science education where some people—for example, members of certain populations—are discouraged from pursuing science and excluded from opportunities to learn science.

Excellence in science education embodies the ideal that all students can achieve understanding of science if they are given the opportunity. The content standards describe outcomes—what students should understand and be able to do, not the manner in which students will achieve those outcomes. Students will achieve understanding in different ways and at different depths as they answer questions about the natural world. And students will achieve the outcomes at different rates, some sooner than others. But all should have opportunities in the form of multiple experiences over several years to develop the understanding associated with the Standards .

[See Program Standard D and System Standard D]

The commitment to science for all students has implications for both program design and the education system. In particular, resources must be allocated to ensure that the Standards do not exacerbate the differences in opportunities to learn that currently exist between advantaged and disadvantaged students.

[See Teaching Standard B]

LEARNING SCIENCE IS AN ACTIVE PROCESS. Learning science is something students do, not something that is done to them. In learning science, students describe objects and events, ask questions, acquire knowledge, construct explanations of natural phenomena, test those explanations in many different ways, and communicate their ideas to others.

In the National Science Education Standards , the term "active process" implies physical and mental activity. Hands-on activities are not enough—students also must have "minds-on" experiences.

Science teaching must involve students in inquiry-oriented investigations in which they interact with their teachers and peers. Students establish connections between their current knowledge of science and the scientific knowledge found in many sources; they apply science content to new questions; they engage in problem solving, planning, decision making, and group discussions; and they experience assessments that are consistent with an active approach to learning.

Emphasizing active science learning means shifting emphasis away from teachers presenting information and covering science topics. The perceived need to include all the topics, vocabulary, and information in textbooks is in direct conflict with the central goal of having students learn scientific knowledge with understanding.

SCHOOL SCIENCE REFLECTS THE INTELLECTUAL AND CULTURAL TRADITIONS THAT CHARACTERIZE THE PRACTICE OF CONTEMPORARY SCIENCE. To develop a rich knowledge of science and the natural world, students must become familiar with modes of scientific inquiry, rules of evidence, ways of formulating questions, and ways of proposing explanations. The relation of science to mathematics and to technology and an understanding of the nature of science should also be part of their education.

[See definition of science literacy]

An explicit goal of the National Science Education Standards is to establish high levels of scientific literacy in the United States. An essential aspect of scientific literacy is greater knowledge and understanding of science subject matter, that is, the knowledge specifically associated with the physical, life, and earth sciences. Scientific literacy also includes understanding the nature of science, the scientific enterprise, and the role of science in society and personal life. The Standards recognize that many individuals have contributed to the traditions of science and that, in historical perspective, science has been practiced in many different cultures.

Science is a way of knowing that is characterized by empirical criteria, logical argument, and skeptical review. Students should develop an understanding of what science is, what science is not, what science can and cannot do, and how science contributes to culture.

IMPROVING SCIENCE EDUCATION IS PART OF SYSTEMIC EDUCATION REFORM. National goals and standards contribute to state and local systemic initiatives, and the national and local reform efforts complement each other. Within the larger education system, we can view science education as a subsystem with both shared and unique components. The components include students and teachers; schools with principals, superintendents, and school boards; teacher education programs in colleges and universities; textbooks and textbook publishers; communities of parents and of students; scientists and engineers; science museums; business and industry; and legislators. The National Science Education Standards provide the unity of purpose and vision required to focus all of those components effectively on the important task of improving science education for all students, supplying a consistency that is needed for the long-term changes required.

Perspectives and Terms in the National Science Education Standards

Although terms such as "scientific literacy" and "science content and curriculum" frequently appear in education discussions and in the popular press without definition, those terms have a specific meaning as used in the National Science Education Standards.

SCIENTIFIC LITERACY. Scientific literacy is the knowledge and understanding of scientific concepts and processes required for personal decision making, participation in civic and cultural affairs, and economic productivity. It also includes specific types of abilities. In the National Science Education Standards , the content standards define scientific literacy.

Scientific literacy means that a person can ask, find, or determine answers to questions derived from curiosity about everyday experiences. It means that a person has the ability to describe, explain, and predict natural phenomena. Scientific literacy entails being able to read with understanding articles about science in the popular press and to engage in social conversation about the validity of the conclusions. Scientific literacy implies that a person can identify scientific issues underlying national and local decisions and express positions that are scientifically and technologically informed. A literate citizen should be able to evaluate the quality of scientific information on the basis of its source and the methods used to generate it. Scientific literacy also implies the capacity to pose and evaluate arguments based on evidence and to apply conclusions from such arguments appropriately.

Individuals will display their scientific literacy in different ways, such as appropriately using technical terms, or applying scientific concepts and processes. And individuals often will have differences in literacy in different domains, such as more understanding of life-science concepts and words, and less understanding of physical-science concepts and words.

Scientific literacy has different degrees and forms; it expands and deepens over a lifetime, not just during the years in school. But the attitudes and values established toward science in the early years will shape a person's development of scientific literacy as an adult.

[See Program Standard B]

CONTENT AND CURRICULUM. The content of school science is broadly defined to include specific capacities, understandings, and abilities in science. The content standards are not a science curriculum. Curriculum is the way content is delivered: It includes the structure, organization, balance, and presentation of the content in the classroom.

The content standards are not science lessons, classes, courses of study, or school science programs. The components of the science content described can be organized with a variety of emphases and perspectives into many different curricula. The organizational schemes of the content standards are not intended to be used as curricula; instead, the scope, sequence, and coordination of concepts, processes, and topics are left to those who design and implement curricula in science programs.

Curricula often will integrate topics from different subject-matter areas—such as life and physical sciences—from different content standards—such as life sciences and science in personal and social perspectives—and from different school subjects—such as science and mathematics, science and language arts, or science and history.

KNOWLEDGE AND UNDERSTANDING. Implementing the National Science Education Standards implies the acquisition of scientific knowledge and the development of understanding. Scientific knowledge refers to facts, concepts, principles, laws, theories, and models and can be acquired in many ways. Understanding science requires that an individual integrate a complex structure of many types of knowledge, including the ideas of science, relationships between ideas, reasons for these relationships, ways to use the ideas to explain and predict other natural phenomena, and ways to apply them to many events. Understanding encompasses the ability to use knowledge, and it entails the ability to distinguish between what is and what is not a scientific idea. Developing understanding presupposes that students are actively engaged with the ideas of science and have many experiences with the natural world.

[See Content Standards A & G (all grade levels)]

INQUIRY. Scientific inquiry refers to the diverse ways in which scientists study the natural world and propose explanations based on the evidence derived from their work. Inquiry also refers to the activities of students in which they develop knowledge and understanding of scientific ideas, as well as an understanding of how scientists study the natural world.

Inquiry is a multifaceted activity that involves making observations; posing questions; examining books and other sources of information to see what is already known; planning investigations; reviewing what is already known in light of experimental evidence; using tools to gather, analyze, and interpret data; proposing answers, explanations, and predictions; and communicating the results. Inquiry requires identification of assumptions, use of critical and logical thinking, and consideration of alternative explanations. Students will engage in selected aspects of inquiry as they learn the scientific way of knowing the natural world, but they also should develop the capacity to conduct complete inquiries.

Although the Standards emphasize inquiry, this should not be interpreted as recommending a single approach to science teaching. Teachers should use different strategies to develop the knowledge, understandings, and abilities described in the content standards. Conducting hands-on science activities does not guarantee inquiry, nor is reading about science incompatible with inquiry. Attaining the understandings and abilities described in Chapter 6 cannot be achieved by any single teaching strategy or learning experience.

SCIENCE AND TECHNOLOGY. As used in the Standards , the central distinguishing characteristic between science and technology is a difference in goal: The goal of science is to understand the natural world, and the goal of technology is to make modifications in the world to meet human needs. Technology as design is included in the Standards as parallel to science as inquiry.

Technology and science are closely related. A single problem often has both scientific and technological aspects. The need to answer questions in the natural world drives the development of technological products; moreover, technological needs can drive scientific research. And technological products, from pencils to computers, provide tools that promote the understanding of natural phenomena.

[See Content Standard E (all grade levels)]

The use of "technology" in the Standards is not to be confused with "instructional technology," which provides students and teachers with exciting tools—such as computers—to conduct inquiry and to understand science.

Additional terms important to the National Science Education Standards, such as "teaching," "assessment," and "opportunity to learn," are defined in the chapters and sections where they are used. Throughout, we have tried to avoid using terms that have different meanings to the many different groups that will be involved in implementing the Standards .


scientific literacy and critical thinking skills answer key



Scientific Literacy and Critical Thinking Skills: Nurturing a Better Future


Scientific literacy and critical thinking are essential components of a well-rounded education, preparing students to understand the world around them and make informed decisions. As science and technology advance and shape ever more aspects of our lives, individuals increasingly need the ability to think critically about scientific information and to grasp the implications and consequences of those advances. By fostering scientific literacy, schools equip students with the knowledge and skills to engage with science-related issues in a responsible and informed manner.

The development of critical thinking skills is crucial not only within the realm of science but across all disciplines and aspects of life. These skills enable individuals to analyze, evaluate, and synthesize information, all essential for navigating the modern world. As scientific findings are communicated ever more widely, the ability to critically assess the validity, objectivity, and authority of a source is paramount to being a responsible and engaged citizen.

Focusing on scientific literacy and critical thinking in education prepares students for a world where science and technology play a pivotal role across numerous fields. By cultivating these capacities, students will be better prepared to face complex issues and tasks, contribute positively to society, and pave the way for continued advancements and innovations.

Key Concepts and Principles

Science Education Foundations

Scientific literacy and critical thinking are essential components of a well-rounded science education. These foundational skills equip students with the ability to understand key concepts, develop scientific reasoning, and utilize scientific knowledge for personal and social purposes as defined in Science for All Americans .

A strong science education involves:

  • Acquiring scientific knowledge and understanding the core concepts of various disciplines
  • Developing the ability to analyze and evaluate scientific claims and arguments
  • Enhancing writing and communication skills to effectively convey scientific ideas

By focusing on these elements, educators empower students to think and function as responsible citizens in an increasingly science-driven world.

Metacognition and Reflection

Metacognition, or the process of thinking about one’s own thinking, plays a crucial role in fostering critical thinking skills in science education. Cambridge highlights key steps in the critical thinking process, which include:

  • Identifying a problem and asking questions about that problem
  • Selecting information to respond to the problem and evaluating it
  • Drawing conclusions from the evidence

By incorporating metacognitive strategies and promoting reflection throughout the learning process, educators enable students to actively engage with scientific concepts, building a deeper understanding and fostering critical thinking abilities.

In summary, a well-rounded science education places emphasis on the development of scientific literacy and critical thinking skills, built on a strong foundation of core concepts and knowledge. Incorporating metacognitive strategies and promoting reflection throughout the learning process further enhances these skills, equipping students for success in their future scientific endeavors.

Curriculum and Pedagogy

Teaching and Learning Approaches

Teaching and learning approaches play a crucial role in promoting scientific literacy and critical thinking skills among students. One effective strategy is to create a thinking-based classroom, where the learning environment is shaped to support thinking and create opportunities for students to engage with scientific concepts 1 .

Educators can achieve this by incorporating a variety of pedagogical techniques, such as:

  • Scaffolded instruction : Gradually develop students’ understanding through modeling and guided instruction, eventually allowing students to take ownership of their learning.
  • Inquiry-based learning : Encourage exploration and questions to build understanding of scientific concepts.
  • Collaborative learning : Use group projects and discussions to inspire debate and foster interaction among students, allowing them to learn from one another’s perspectives.

Incorporating Argumentation and Experimentation

Argumentation and experimentation are key components of scientific inquiry that contribute to students’ scientific literacy and critical thinking skills:

  • Argumentation : Incorporating argumentation in the curriculum helps students learn how to construct, evaluate, and refine scientific claims based on evidence 2 . This can be done through structured debates, teaching students to craft written scientific arguments, and evaluating peer arguments in a constructive manner.
  • Experimentation : Encouraging students to engage in hands-on experimentation allows them to explore scientific concepts more deeply while fostering their critical thinking skills 3 . Providing opportunities for experimentation can include designing experiments, carrying them out, analyzing data, and drawing conclusions.

By incorporating these teaching and learning approaches, as well as focusing on argumentation and experimentation, educators can effectively promote scientific literacy and critical thinking skills in their curriculum and pedagogy.

Assessing Scientific Literacy and Critical Thinking Skills

Test Instruments and Procedures

There are various test instruments designed to assess students’ scientific literacy and critical thinking skills. One such instrument is the Test of Scientific Literacy Skills (TOSLS) , which focuses on measuring skills related to essential aspects of scientific literacy, such as:

  • Recognizing and analyzing the use of methods of inquiry that lead to scientific knowledge
  • Organizing, analyzing, and interpreting quantitative data and scientific information

The TOSLS is a multiple-choice test that allows educators to evaluate students’ understanding of scientific reasoning and their ability to apply scientific concepts in real-life situations.
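Scoring a fixed-key, multiple-choice instrument of this kind amounts to tallying responses against an answer key. The sketch below illustrates the idea; the item labels, key, and responses are invented for illustration and are not actual TOSLS content.

```python
# Hypothetical sketch of scoring a fixed-key, multiple-choice instrument.
# The items and answer key below are invented, not actual TOSLS content.

def score_responses(answer_key, responses):
    """Return the fraction of items answered correctly."""
    correct = sum(
        1 for item, key in answer_key.items() if responses.get(item) == key
    )
    return correct / len(answer_key)

answer_key = {"Q1": "B", "Q2": "D", "Q3": "A", "Q4": "C"}
student = {"Q1": "B", "Q2": "D", "Q3": "C", "Q4": "C"}  # Q3 answered incorrectly

print(score_responses(answer_key, student))  # 3 of 4 correct -> 0.75
```

Real instruments typically go further, reporting subscale scores per skill category rather than a single overall fraction.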

Apart from standardized tests, it is crucial to incorporate critical thinking into everyday learning activities. Educators may use various methods, such as discussing complex scientific problems within the context of current events and engaging students in collaborative problem-solving tasks.

International Comparisons

When evaluating scientific literacy and critical thinking skills, it is helpful to put the findings into a broader context by comparing them with international standards and benchmarks. One significant international study is the Programme for International Student Assessment (PISA) , which measures the knowledge and skills of 15-year-olds in reading, math, and science every three years. PISA assesses students based on their abilities to use their scientific knowledge for:

  • Identifying scientific issues
  • Explaining phenomena scientifically
  • Evaluating and designing scientific enquiries

By evaluating and comparing students’ performance across different countries, PISA contributes to a deeper understanding of different strategies and curricula used to foster scientific literacy and critical thinking skills in different educational contexts.

In conclusion, the assessment of scientific literacy and critical thinking skills is critical for evaluating the quality of science education. By using well-validated test instruments and comparing students’ performance internationally, educators can better understand the effectiveness of different teaching strategies and work to improve science literacy and critical thinking skills for all students.

Factors Influencing Performance and Motivation

Role of Gender in Physics Education

Research suggests that gender can influence students’ performance and motivation in physics education: male and female students often report different levels of interest and confidence in the subject, which can affect academic achievement. Notably, however, a correlational study found a positive relationship between critical thinking skills and scientific literacy in both genders, and no significant correlation between gender and these skills.

It is essential to recognize and address these gender differences when designing curriculum and learning environments to encourage equal participation and confidence in physics education for all students.

Decision Making and Problem-Solving

Strong decision-making and problem-solving skills are crucial components of scientific literacy. These skills enable students to apply scientific concepts and principles in real-world situations while reinforcing a more humanistic culture based on rational thinking, as highlighted in this article.

  • Motivation : A student’s motivation to learn and engage in scientific activities plays a vital role in the development of their decision-making and problem-solving skills. High motivation promotes curiosity, active knowledge-seeking, and persistence in solving complex problems.
  • Correlation analysis : Studies have shown a positive relationship between scientific literacy, critical thinking, and the ability to use scientific knowledge for personal and social purposes. This correlation underlines the importance of fostering these skills in the education system.

When incorporating decision-making and problem-solving skills into science education, focus should be placed on engaging students in critical thinking exercises and creating a conducive learning environment that encourages curiosity, exploration, and collaboration.
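The correlation analyses mentioned above typically report a Pearson coefficient between two sets of test scores. A minimal pure-Python sketch, using fabricated scores purely for illustration:

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length lists of scores."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    sd_x = math.sqrt(sum((x - mean_x) ** 2 for x in xs))
    sd_y = math.sqrt(sum((y - mean_y) ** 2 for y in ys))
    return cov / (sd_x * sd_y)

# Fabricated example scores (0-100) for six students.
literacy_scores = [55, 60, 68, 72, 80, 85]
thinking_scores = [50, 58, 63, 70, 78, 88]

print(round(pearson_r(literacy_scores, thinking_scores), 2))
```

A value of r near +1 indicates a strong positive linear relationship; published studies would also report sample size and a significance test alongside the coefficient.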

Scientific Literacy in Everyday Life

Interpreting News Reports

Scientific literacy plays a crucial role in interpreting news reports. A sound understanding of scientific principles and methods allows individuals to critically evaluate the claims made in news articles or television segments and to judge the validity of the information presented.

For example, when encountering a news report about a new health study, it is essential to consider sample size, research methodology, and potential conflicts of interest among the researchers. A clear understanding of these factors can help prevent the spread of misinformation and promote informed decision-making.

Moreover, separating scientific facts from theories enables individuals to better grasp the certainty and uncertainty surrounding the news report. This distinction is crucial for discerning the current state of scientific knowledge and identifying areas where more research is needed.

Understanding and Evaluating Scientific Facts

Evaluating scientific claims effectively requires understanding the difference between facts, which are verifiable pieces of information, and theories, which are well-substantiated explanations for observable phenomena.

For instance, the recognition that the Earth revolves around the Sun is a fact, while the theory of evolution provides a comprehensive explanation of the origin and development of species. Developing the ability to analyze and contextualize scientific information is crucial for forming well-grounded opinions and engaging in informed discussions.

Moreover, the promotion of scientific literacy allows for an appreciation of the interrelatedness of scientific disciplines. This comprehensive understanding can enhance the assessment of scientific facts and their implications in daily life, such as making informed choices about healthcare, technology, and environmental issues. With these considerations in mind, fostering scientific literacy and critical thinking skills is essential for responsible citizenship and decision-making in the modern world.

Future Research Agenda

Developing scientific literacy and critical thinking skills is crucial in today’s world, both for individual success and society as a whole. Consequently, a future research agenda exploring these areas is essential, particularly in relation to high school students as they prepare to become responsible citizens.

One of the key issues to address within this agenda is the relationship between science knowledge and attitudes toward science. This includes assessing whether a significant correlation exists between improved scientific understanding and more positive attitudes towards the scientific method and scientific discovery. Gaining insights into this aspect will help guide the development of educational resources and methodologies to foster a more science-minded society.

Another area of interest is the utility of scientific literacy in various career and life contexts. This would involve studying how scientific literacy can be applied to non-science fields, and how it influences individuals’ decision-making processes and problem-solving abilities.

Moreover, research should explore the relationship between science literacy and other literacy skills , such as mathematics, reading comprehension, and writing. This may help educators develop interdisciplinary curricula that promote the growth of critical thinking abilities and scientific understanding simultaneously.

Furthermore, emphasizing the role of scientific literacy for citizens as decision-makers is crucial. It is important to examine how improved scientific literacy influences students’ capacities to evaluate information, engage in public discourse, and make informed choices on matters that involve scientific data or principles.

Lastly, it might be beneficial to investigate the impact of innovative teaching methods, such as transformative science education and futures thinking, on developing students’ scientific literacy and critical thinking abilities. By shedding light on possible approaches that foster these essential skills, researchers can contribute to the continuous evolution of science education.

In summary, focusing on these key threads in a future research agenda will be invaluable in promoting a deeper understanding of scientific literacy and critical thinking skills. By doing so, we can work towards equipping high school students with the tools required to navigate an increasingly complex and science-driven world.

Frequently Asked Questions

What are the benefits of having scientific literacy and critical thinking skills?

Scientific literacy and critical thinking skills are essential for individuals to understand the world around them and make informed decisions. These skills enable people to differentiate science from pseudoscience and evaluate the credibility of information. Moreover, scientifically literate citizens are better equipped to participate in important societal discussions and contribute to policy-making processes.

How can educators effectively teach scientific literacy and critical thinking skills?

Educators can teach these skills by designing activities that promote critical thinking and scientific inquiry. For example, teachers can create learning experiences where students identify problems and ask questions about them, select relevant information, and draw conclusions based on evidence. Furthermore, incorporating case studies, group discussions, and scientific experiments into the curriculum can help students develop these skills.

What role does digital literacy play in promoting scientific literacy and critical thinking?

Digital literacy is an essential component in fostering scientific literacy and critical thinking. In today’s technology-driven world, individuals must be capable of navigating and evaluating online resources to access accurate information. Digital literacy skills, such as determining the credibility of websites and online articles, can help learners critically assess scientific information, weighing the evidence to form well-founded opinions.

How do life and career skills relate to scientific literacy and critical thinking?

Life and career skills, such as communication, problem solving, and adaptability, are intertwined with scientific literacy and critical thinking. These abilities are crucial in equipping individuals to face real-world challenges and make informed decisions in various fields, from science and technology to business and government. An understanding of scientific principles and the ability to think critically foster the development of crucial life and career skills that are increasingly sought-after in today’s world.

What’s the connection between problem-solving skills and scientific literacy?

Problem-solving skills are closely related to scientific literacy, as they empower individuals to analyze situations, identify problems, and devise appropriate solutions. Scientific literacy involves understanding scientific ways of knowing and thinking critically about the natural world. In essence, acquiring scientific literacy enables individuals to apply the principles and methods of science to problem-solving situations in various aspects of life.

How can reflective practice enhance critical thinking in science?

Reflective practice is a valuable tool in enhancing critical thinking skills in science. It involves examining one’s thoughts, actions, and experiences to learn and improve. By engaging in reflective practice, learners can identify personal biases, recognize gaps in their understanding, and determine ways to improve their scientific knowledge and thinking abilities. This process, in turn, promotes critical thinking and a deeper understanding of scientific concepts.

  • Eight Instructional Strategies for Promoting Critical Thinking ↩
  • Fostering Scientific Literacy and Critical Thinking in Elementary Science Education ↩
  • The Biochemical Literacy Framework: Inviting pedagogical innovation in bioscience education ↩



What influences students’ abilities to critically evaluate scientific investigations?

Ashley B. Heim, David Esparza, Michelle K. Smith, and N. G. Holmes

1 Department of Ecology and Evolutionary Biology, Cornell University, Ithaca, NY, United States of America

2 Laboratory of Atomic and Solid State Physics, Cornell University, Ithaca, NY, United States of America

Associated Data

All raw data files are available from the Cornell Institute for Social and Economic Research (CISER) data and reproduction archive ( https://archive.ciser.cornell.edu/studies/2881 ).

Critical thinking is the process by which people make decisions about what to trust and what to do. Many undergraduate courses, such as those in biology and physics, include critical thinking as an important learning goal. Assessing critical thinking, however, is non-trivial, with mixed recommendations for how to assess critical thinking as part of instruction. Here we evaluate the efficacy of assessment questions to probe students’ critical thinking skills in the context of biology and physics. We use two research-based standardized critical thinking instruments known as the Biology Lab Inventory of Critical Thinking in Ecology (Eco-BLIC) and Physics Lab Inventory of Critical Thinking (PLIC). These instruments provide experimental scenarios and pose questions asking students to evaluate what to trust and what to do regarding the quality of experimental designs and data. Using more than 3000 student responses from over 20 institutions, we sought to understand what features of the assessment questions elicit student critical thinking. Specifically, we investigated (a) how students critically evaluate aspects of research studies in biology and physics when they are individually evaluating one study at a time versus comparing and contrasting two and (b) whether individual evaluation questions are needed to encourage students to engage in critical thinking when comparing and contrasting. We found that students are more critical when making comparisons between two studies than when evaluating each study individually. Also, compare-and-contrast questions are sufficient for eliciting critical thinking, with students providing similar answers regardless of if the individual evaluation questions are included. 
This research offers new insight on the types of assessment questions that elicit critical thinking at the introductory undergraduate level; specifically, we recommend instructors incorporate more compare-and-contrast questions related to experimental design in their courses and assessments.

Introduction

Critical thinking and its importance

Critical thinking, defined here as “the ways in which one uses data and evidence to make decisions about what to trust and what to do” [ 1 ], is a foundational learning goal for almost any undergraduate course and can be integrated in many points in the undergraduate curriculum. Beyond the classroom, critical thinking skills are important so that students are able to effectively evaluate data presented to them in a society where information is so readily accessible [ 2 , 3 ]. Furthermore, critical thinking is consistently ranked as one of the most necessary outcomes of post-secondary education for career advancement by employers [ 4 ]. In the workplace, those with critical thinking skills are more competitive because employers assume they can make evidence-based decisions based on multiple perspectives, keep an open mind, and acknowledge personal limitations [ 5 , 6 ]. Despite the importance of critical thinking skills, there are mixed recommendations on how to elicit and assess critical thinking during and as a result of instruction. In response, here we evaluate the degree to which different critical thinking questions elicit students’ critical thinking skills.

Assessing critical thinking in STEM

Across STEM (i.e., science, technology, engineering, and mathematics) disciplines, several standardized assessments probe critical thinking skills. These assessments focus on aspects of critical thinking and ask students to evaluate experimental methods [ 7 – 11 ], form hypotheses and make predictions [ 12 , 13 ], evaluate data [ 2 , 12 – 14 ], or draw conclusions based on a scenario or figure [ 2 , 12 – 14 ]. Many of these assessments are open-response, so they can be difficult to score, and several are not freely available.

In addition, there is an ongoing debate regarding whether critical thinking is a domain-general or context-specific skill. That is, can someone transfer their critical thinking skills from one domain or context to another (domain-general) or do their critical thinking skills only apply in their domain or context of expertise (context-specific)? Research on the effectiveness of teaching critical thinking has found mixed results, primarily due to a lack of consensus definition of and assessment tools for critical thinking [ 15 , 16 ]. Some argue that critical thinking is domain-general—or what Ennis refers to as the “general approach”—because it is an overlapping skill that people use in various aspects of their lives [ 17 ]. In contrast, others argue that critical thinking must be elicited in a context-specific domain, as prior knowledge is needed to make informed decisions in one’s discipline [ 18 , 19 ]. Current assessments include domain-general components [ 2 , 7 , 8 , 14 , 20 , 21 ], asking students to evaluate, for instance, experiments on the effectiveness of dietary supplements in athletes [ 20 ] and context-specific components, such as to measure students’ abilities to think critically in domains such as neuroscience [ 9 ] and biology [ 10 ].

Others maintain the view that critical thinking is a context-specific skill for the purpose of undergraduate education, but argue that it should be content accessible [ 22 – 24 ], as “thought processes are intertwined with what is being thought about” [ 23 ]. From this viewpoint, the context of the assessment would need to be embedded in a relatively accessible context to assess critical thinking independent of students’ content knowledge. Thus, to effectively elicit critical thinking among students, instructors should use assessments that present students with accessible domain-specific information needed to think deeply about the questions being asked [ 24 , 25 ].

Within the context of STEM, current critical thinking assessments primarily ask students to evaluate a single experimental scenario (e.g., [ 10 , 20 ]), though compare-and-contrast questions about more than one scenario can be a powerful way to elicit critical thinking [ 26 , 27 ]. Generally included in the “Analysis” level of Bloom’s taxonomy [ 28 – 30 ], compare-and-contrast questions encourage students to recognize, distinguish between, and relate features between scenarios and discern relevant patterns or trends, rather than compile lists of important features [ 26 ]. For example, a compare-and-contrast assessment may ask students to compare the hypotheses and research methods used in two different experimental scenarios, instead of having them evaluate the research methods of a single experiment. Alternatively, students may inherently recall and use experimental scenarios based on their prior experiences and knowledge as they evaluate an individual scenario. In addition, evaluating a single experimental scenario individually may act as metacognitive scaffolding [ 31 , 32 ]—a process which “guides students by asking questions about the task or suggesting relevant domain-independent strategies” [ 32 ]—to support students in their compare-and-contrast thinking.

Purpose and research questions

The primary objective of this study was to better understand which features of assessment questions elicit student critical thinking, using two existing instruments in STEM: the Biology Lab Inventory of Critical Thinking in Ecology (Eco-BLIC) and Physics Lab Inventory of Critical Thinking (PLIC). We focused on biology and physics since critical thinking assessments were already available for these disciplines. Specifically, we investigated (a) how students critically evaluate aspects of research studies in biology and physics when they are individually evaluating one study at a time or comparing and contrasting two studies and (b) whether individual evaluation questions are needed to encourage students to engage in critical thinking when comparing and contrasting.

Providing undergraduates with ample opportunities to practice critical thinking skills in the classroom is necessary for evidence-based critical thinking in their future careers and everyday lives. While most critical thinking instruments in biology and physics contexts have undergone some form of validation to ensure they accurately measure the intended construct, to our knowledge none have explored how different question types influence students’ critical thinking. This research offers new insight into the types of questions that elicit critical thinking, which educators and researchers across disciplines can apply to measure cognitive student outcomes and to incorporate more effective critical thinking opportunities in the classroom.

Ethics statement

The procedures for this study were approved by the Institutional Review Board of Cornell University (Eco-BLIC: #1904008779; PLIC: #1608006532). Informed consent was obtained from all participating students via online consent forms at the beginning of the study, and students did not receive compensation for participating unless their instructor offered credit for completing the assessment.

Participants and assessment distribution

We administered the Eco-BLIC to undergraduate students across 26 courses at 11 institutions (six doctoral-granting, three Master’s-granting, and two Baccalaureate-granting) in Fall 2020 and Spring 2021 and received 1612 usable responses. Additionally, we administered the PLIC to undergraduate students across 21 courses at 11 institutions (six doctoral-granting, one Master’s-granting, three four-year colleges, and one two-year college) in Fall 2020 and Spring 2021 and received 1839 usable responses. We recruited participants via convenience sampling by emailing instructors of primarily introductory ecology-focused courses or introductory physics courses who expressed potential interest in implementing our instrument in their course(s). Both instruments were administered online via Qualtrics and students were allowed to complete the assessments outside of class. The demographic distribution of the responses, all self-reported by students, is presented in Table 1 ; the values in the table represent all responses we received.

Instrument description

Question types.

Though the content and concepts featured in the Eco-BLIC and PLIC are distinct, both instruments share a similar structure and set of question types. The Eco-BLIC—which was developed using a structure similar to that of the PLIC [ 1 ]—includes two predator-prey scenarios based on relationships between (a) smallmouth bass and mayflies and (b) great-horned owls and house mice. Within each scenario, students are presented with a field-based study and a laboratory-based study focused on a common research question about feeding behaviors of smallmouth bass or house mice, respectively. The prompts for these two Eco-BLIC scenarios are available in S1 and S2 Appendices. The PLIC focuses on two research groups conducting different experiments to test the relationship between oscillation periods of masses hanging on springs [ 1 ]; the prompts for this scenario can be found in S3 Appendix . The descriptive prompts in both the Eco-BLIC and PLIC also include a figure presenting data collected by each research group, from which students are expected to draw conclusions. The research scenarios (e.g., field-based group and lab-based group on the Eco-BLIC) are written so that each group has both strengths and weaknesses in their experimental designs.

After reading the prompt for the first experimental group (Group 1) in each instrument, students are asked to identify possible claims from Group 1’s data (data evaluation questions). Students next evaluate the strengths and weaknesses of various study features for Group 1 (individual evaluation questions). Examples of these individual evaluation questions are in Table 2 . They then suggest next steps the group should pursue (next steps items). Students are then asked to read about the prompt describing the second experimental group’s study (Group 2) and again answer questions about the possible claims, strengths and weaknesses, and next steps of Group 2’s study (data evaluation questions, individual evaluation questions, and next steps items). Once students have independently evaluated Groups 1 and 2, they answer a series of questions to compare the study approaches of Group 1 versus Group 2 (group comparison items). In this study, we focus our analysis on the individual evaluation questions and group comparison items.

The Eco-BLIC examples in Table 2 are derived from the owl-mouse scenario.

Instrument versions

To determine whether the individual evaluation questions impacted the assessment of students’ critical thinking, students were randomly assigned to take one of two versions of the assessment via Qualtrics branch logic: 1) a version that included the individual evaluation and group comparison items or 2) a version with only the group comparison items, with the individual evaluation questions removed. We calculated the median time it took students to answer each of these versions for both the Eco-BLIC and PLIC.

Think-aloud interviews

We also conducted one-on-one think-aloud interviews with students to elicit feedback on the assessment questions (Eco-BLIC n = 21; PLIC n = 4). Students were recruited via convenience sampling at our home institution and were primarily majoring in biology or physics. All interviews were audio-recorded and screen-captured via Zoom and lasted approximately 30–60 minutes. We asked participants to discuss their reasoning for answering each question as they progressed through the instrument. We did not analyze these interviews in detail, but rather used them to extract relevant examples of critical thinking that helped to explain our quantitative findings. Multiple think-aloud interviews were conducted with students using previous versions of the PLIC [ 1 ], though these data are not discussed here.

Data analyses

Our analyses focused on (1) investigating the alignment between students’ responses to the individual evaluation questions and the group comparison items and (2) comparing student responses between the two instrument versions. If individual evaluation and group comparison items elicit critical thinking in the same way, we would expect to see the same frequency of responses for each question type, as per Fig 1 . For example, if students evaluated one study feature of Group 1 as a strength and the same study feature for Group 2 as a strength, we would expect that students would respond that both groups were highly effective for this study feature on the group comparison item (i.e., data represented by the purple circle in the top right quadrant of Fig 1 ). Alternatively, if students evaluated one study feature of Group 1 as a strength and the same study feature for Group 2 as a weakness, we would expect that students would indicate that Group 1 was more effective than Group 2 on the group comparison item (i.e., data represented by the green circle in the lower right quadrant of Fig 1 ).
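If the two question types elicited critical thinking in the same way, each pair of individual ratings would map onto a single expected group comparison response, as described above for Fig 1. Purely as an illustration of that expectation (the 1–4 rating scale is the instruments’; the helper function, its names, and the 2.5 midpoint split are assumptions for this sketch):

```python
def expected_comparison(rating_g1: int, rating_g2: int, threshold: float = 2.5) -> str:
    """Map a pair of individual evaluation ratings (1 = weakness ... 4 = strength)
    to the group comparison answer we would expect if both question types
    elicited critical thinking identically. The 2.5 midpoint is an assumption."""
    s1, s2 = rating_g1 > threshold, rating_g2 > threshold
    if s1 and s2:
        return "both groups highly effective"   # top right quadrant of Fig 1
    if s1:
        return "Group 1 more effective"          # lower right quadrant
    if s2:
        return "Group 2 more effective"          # top left quadrant
    return "neither group effective"             # lower left quadrant

# Usage: a strength/strength pair should predict "both groups highly effective".
assert expected_comparison(4, 4) == "both groups highly effective"
assert expected_comparison(4, 1) == "Group 1 more effective"
```

Departures from this mapping in the observed data are what indicate that the two question types elicit different evaluations.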

Fig 1. [Image: pone.0273337.g001.jpg]

The x- and y-axes represent rankings on the individual evaluation questions for Groups 1 and 2 (or field and lab groups), respectively. The colors in the legend at the top of the figure denote responses to the group comparison items. In this idealized example, all pie charts are the same size to indicate that the student answers are equally proportioned across all answer combinations.

We ran descriptive statistics to summarize student responses and to examine distributions and frequencies of the data on the Eco-BLIC and PLIC. We also conducted chi-square goodness-of-fit tests to analyze differences in student responses between versions of the same instrument on the relevant questions. In all of these tests, we used a Bonferroni correction to reduce the chance of false positives and account for multiple comparisons. We generated figures—primarily multi-pie chart graphs and heat maps—to visualize differences between individual evaluation and group comparison items and between versions of each instrument with and without individual evaluation questions, respectively. All analyses were conducted, and figures generated, in the R statistical computing environment (v. 4.1.1) and Microsoft Excel.
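The analyses were run in R. Purely as an illustration of the test used (with hypothetical counts, not the study’s data), a chi-square goodness-of-fit test with a Bonferroni-adjusted threshold can be sketched in Python as follows; the closed-form survival function below is specific to three degrees of freedom, i.e., the four answer options of a group comparison item:

```python
import math

def chi2_gof_p(observed, expected):
    """Chi-square goodness-of-fit statistic and p-value for df = 3
    (four answer options on one group comparison item), using the
    closed-form chi-square survival function for three degrees of freedom."""
    chi2 = sum((o - e) ** 2 / e for o, e in zip(observed, expected))
    p = math.erfc(math.sqrt(chi2 / 2)) + math.sqrt(2 * chi2 / math.pi) * math.exp(-chi2 / 2)
    return chi2, p

# Hypothetical counts for one item: responses on the version WITH individual
# evaluation questions, tested against the proportions observed WITHOUT them.
observed = [120, 80, 60, 40]              # Group 1 / Group 2 / both / neither
baseline_props = [0.40, 0.27, 0.20, 0.13]
expected = [prop * sum(observed) for prop in baseline_props]

chi2, p = chi2_gof_p(observed, expected)

# Bonferroni correction: nominal alpha divided by the number of items tested,
# e.g. 0.05 / 9 is roughly 0.006, the order of the adjusted thresholds reported.
alpha_adj = 0.05 / 9
print(f"chi2 = {chi2:.3f}, p = {p:.3f}, significant: {p < alpha_adj}")
```

With counts this close to the baseline proportions, the test (correctly) finds no significant difference between versions.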

We asked students to evaluate different experimental set-ups on the Eco-BLIC and PLIC in two ways. Students first evaluated the strengths and weaknesses of study features for each scenario individually (individual evaluation questions, Table 2 ) and then answered a series of questions comparing and contrasting the study approaches of both research groups side-by-side (group comparison items, Table 2 ). Analyzing the individual evaluation questions, we found that students generally ranked experimental features (i.e., those related to study set-up, data collection and summary methods, and analysis and outcomes) of the independent research groups as strengths ( Fig 2 ), as evidenced by mean scores greater than 2 on a scale from 1 (weakness) to 4 (strength).

Fig 2. [Image: pone.0273337.g002.jpg]

Each box represents the interquartile range (IQR). Lines within each box represent the median. Circles represent outliers of mean scores for each question.

Individual evaluation versus compare-and-contrast evaluation

Our results indicate that when students consider Group 1 or Group 2 individually, they mark most study features as strengths (consistent with the means in Fig 2 ), shown by the large circles in the upper right quadrant across the three experimental scenarios ( Fig 3 ). However, the proportion of colors on each pie chart shows that students select a range of responses when comparing the two groups [e.g., Group 1 being more effective (green), Group 2 being more effective (blue), both groups being effective (purple), and neither group being effective (orange)]. We infer that students were more discerning (i.e., more selective) when they were asked to compare the two groups across the various study features ( Fig 3 ). In short, students think about the groups differently if they are rating either Group 1 or Group 2 in the individual evaluation questions versus directly comparing Group 1 to Group 2.

Fig 3. [Image: pone.0273337.g003.jpg]

The x- and y-axes represent students’ rankings on the individual evaluation questions for Groups 1 and 2 on each assessment, respectively, where 1 indicates weakness and 4 indicates strength. The overall size of each pie chart represents the proportion of students who responded with each pair of ratings. The colors in the pie charts denote the proportion of students’ responses who chose each option on the group comparison items. (A) Eco-BLIC bass-mayfly scenario (B) Eco-BLIC owl-mouse scenario (C) PLIC oscillation periods of masses hanging on springs scenario.

These results are further supported by student responses from the think-aloud interviews. For example, one interview participant responding to the bass-mayfly scenario of the Eco-BLIC explained that accounting for bias/error in both the field and lab groups in this scenario was a strength (i.e., 4). This participant mentioned that Group 1, who performed the experiment in the field, “[had] outliers, so they must have done pretty well,” and that Group 2, who collected organisms in the field but studied them in lab, “did a good job of accounting for bias.” However, when asked to compare between the groups, this student argued that Group 2 was more effective at accounting for bias/error, noting that “they controlled for more variables.”

Another individual who was evaluating “repeated trials for each mass” in the PLIC expressed a similar pattern. In response to ranking this feature of Group 1 as a strength, they explained: “Given their uncertainties and how small they are, [the group] seems like they’ve covered their bases pretty well.” Similarly, they evaluated this feature of Group 2 as a strength as well, simply noting: “Same as the last [group], I think it’s a strength.” However, when asked to compare between Groups 1 and 2, this individual argued that Group 1 was more effective because they conducted more trials.

Individual evaluation questions to support compare and contrast thinking

Given that students were more discerning when they directly compared two groups for both biology and physics experimental scenarios, we next sought to determine if the individual evaluation questions for Group 1 or Group 2 were necessary to elicit or helpful to support student critical thinking about the investigations. To test this, students were randomly assigned to one of two versions of the instrument. Students in one version saw individual evaluation questions about Group 1 and Group 2 and then saw group comparison items for Group 1 versus Group 2. Students in the second version only saw the group comparison items. We found that students assigned to both versions responded similarly to the group comparison questions, indicating that the individual evaluation questions did not promote additional critical thinking. We visually represent these similarities across versions with and without the individual evaluation questions in Fig 4 as heat maps.

Fig 4. [Image: pone.0273337.g004.jpg]

The x-axis denotes students’ responses on the group comparison items (i.e., whether they ranked Group 1 as more effective, Group 2 as more effective, both groups as highly effective, or neither group as effective/both groups were minimally effective). The y-axis lists each of the study features that students compared between the field and lab groups. White and lighter shades of red indicate a lower percentage of student responses, while brighter red indicates a higher percentage of student responses. (A) Eco-BLIC bass-mayfly scenario. (B) Eco-BLIC owl-mouse scenario. (C) PLIC oscillation periods of masses hanging on springs scenario.

We ran chi-square goodness-of-fit tests comparing student responses across the two instrument versions and found no significant differences on the Eco-BLIC bass-mayfly scenario ( Fig 4A ; based on an adjusted p -value of 0.006) or owl-mouse questions ( Fig 4B ; based on an adjusted p -value of 0.004). There were only three significant differences (out of 53 items) in how students responded to questions on the two versions of the PLIC ( Fig 4C ; based on an adjusted p -value of 0.0005). The items that students responded to differently ( p <0.0005) across the versions were items where the two groups were identical in their design; namely, the equipment used (i.e., stopwatches), the variables measured (i.e., time and mass), and the number of bounces of the spring per trial (i.e., five bounces). We calculated Cramer’s C (Vc; [ 33 ]), a measure commonly applied to chi-square goodness-of-fit models to understand the magnitude of significant results, and found that the effect sizes for these three items were small (Vc = 0.11, 0.10, and 0.06, respectively).
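As an illustration of the effect-size calculation (the formula below is one common formulation of Cramer’s V/Vc for goodness-of-fit tests; the chi-square value and sample size here are hypothetical, not the study’s data):

```python
import math

def cramers_vc(chi2: float, n: int, k: int) -> float:
    """Effect size for a chi-square goodness-of-fit test with k answer
    categories and n responses (one common formulation of Cramer's V / Vc)."""
    return math.sqrt(chi2 / (n * (k - 1)))

# Hypothetical numbers: a statistically significant item with chi2 = 20
# across n = 550 responses and k = 4 answer options.
vc = cramers_vc(chi2=20.0, n=550, k=4)
print(f"Vc = {vc:.2f}")
```

Even a comfortably significant chi-square statistic can correspond to a small Vc at these sample sizes, which is why a significant difference on an item need not indicate a practically meaningful one.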

The trend that students answer the Group 1 versus Group 2 comparison questions similarly, regardless of whether they responded to the individual evaluation questions, is further supported by student responses from the think-aloud interviews. For example, one participant who did not see the individual evaluation questions for the owl-mouse scenario of the Eco-BLIC independently explained that sampling mice from other fields was a strength for both the lab and field groups. They explained that for the lab group, “I think that [the mice] coming from multiple nearby fields is good…I was curious if [mouse] behavior was universal.” For the field group, they reasoned, “I also noticed it was just from a single nearby field…I thought that was good for control.” However, this individual ultimately reasoned that the field group was “more effective for sampling methods…it’s better to have them from a single field because you know they were exposed to similar environments.” Thus, even without individual evaluation questions available, students can still make individual evaluations when comparing and contrasting between groups.

We also determined that removing the individual evaluation questions decreased the duration of time students needed to complete the Eco-BLIC and PLIC. On the Eco-BLIC, the median time to completion for the version with individual evaluation and group comparison questions was approximately 30 minutes, while the version with only the group comparisons had a median time to completion of 18 minutes. On the PLIC, the median time to completion for the version with individual evaluation questions and group comparison questions was approximately 17 minutes, while the version with only the group comparisons had a median time to completion of 15 minutes.

To determine how to elicit critical thinking in a streamlined manner using introductory biology and physics material, we investigated (a) how students critically evaluate aspects of experimental investigations in biology and physics when they are individually evaluating one study at a time versus comparing and contrasting two and (b) whether individual evaluation questions are needed to encourage students to engage in critical thinking when comparing and contrasting.

Students are more discerning when making comparisons

We found that students were more discerning when comparing the two groups on the Eco-BLIC and PLIC than when evaluating each group individually. While students tended to evaluate study features of each group independently as strengths ( Fig 2 ), there was greater variation in their responses about which group was more effective when directly comparing the two ( Fig 3 ). Literature evaluating the role of contrasting cases provides plausible explanations for our results. In that work, contrasting two cases supports students in identifying deep features of the cases, compared with evaluating one case after the other [ 34 – 37 ]. When presented with a single example, students may deem certain study features unimportant or irrelevant, but comparing study features side-by-side allows students to recognize the distinct features of each case [ 38 ]. We infer, therefore, that students were better able to recognize the strengths and weaknesses of the two groups in each of the assessment scenarios when evaluating the groups side by side, rather than in isolation [ 39 , 40 ]. This result is somewhat surprising, however, as students could have used their knowledge of experimental designs as a contrasting case when evaluating each group. Future work, therefore, should evaluate whether experts use their vast knowledge base of experimental studies as discerning contrasts when evaluating each group individually. This work would help determine whether our results suggest that students lack a sufficient experiment-base to use as contrasts, or whether they simply do not draw on their experiment-base when evaluating the individual groups. Regardless, our study suggests that critical thinking assessments should ask students to compare and contrast experimental scenarios, rather than just evaluate individual cases.

Individual evaluation questions do not influence answers to compare and contrast questions

We found that individual evaluation questions were unnecessary for eliciting or supporting students’ critical thinking on the two assessments. Students responded to the group comparison items similarly whether or not they had received the individual evaluation questions. The exception to this pattern was that students responded differently to three group comparison items on the PLIC when individual evaluation questions were provided. These three questions constituted a small portion of the PLIC and showed a small effect size. Furthermore, removing the individual evaluation questions decreased the median time for students to complete the Eco-BLIC and PLIC. It is plausible that spending more time thinking about the experimental methods while responding to the individual evaluation questions would then prepare students to be better discerners on the group comparison questions. However, the overall trend is that individual evaluation questions do not have a strong impact on how students evaluate experimental scenarios, nor do they set students up to be better critical thinkers later. This finding aligns with prior research suggesting that students tend to disregard details when they evaluate a single case, rather than comparing and contrasting multiple cases [ 38 ], further supporting our findings about the effectiveness of the group comparison questions.

Practical implications

Individual evaluation questions were not effective for engaging students in critical thinking, nor for preparing them for subsequent questions that elicit their critical thinking. Thus, researchers and instructors could make critical thinking assessments more effective and less time-consuming by encouraging comparisons between cases. Additionally, this study raises the question of whether instructors should incorporate more experimental case studies throughout their courses and assessments, so that students have a richer experiment-base to use as contrasts when evaluating individual experimental scenarios. To help students discern information about experimental design, we suggest that instructors consider providing them with multiple experimental studies (i.e., cases) and asking them to compare and contrast between these studies.

Future directions and limitations

When designing critical thinking assessments, questions should ask students to make meaningful comparisons that require them to consider the important features of the scenarios. One challenge of relying on compare-and-contrast questions in the Eco-BLIC and PLIC to elicit students’ critical thinking is ensuring that students are comparing similar yet distinct study features across experimental scenarios, and that these comparisons are meaningful [ 38 ]. For example, though sample size is different between experimental scenarios in our instruments, it is a significant feature that has implications for other aspects of the research like statistical analyses and behaviors of the animals. Therefore, one limitation of our study could be that we exclusively focused on experimental method evaluation questions (i.e., what to trust), and we are unsure if the same principles hold for other dimensions of critical thinking (i.e., what to do). Future research should explore whether questions that are not in a compare-and-contrast format also effectively elicit critical thinking, and if so, to what degree.

As our question schema in the Eco-BLIC and PLIC were designed for introductory biology and physics content, it is unknown how effective this question schema would be for upper-division biology and physics undergraduates who we would expect to have more content knowledge and prior experiences for making comparisons in their respective disciplines [ 18 , 41 ]. For example, are compare-and-contrast questions still needed to elicit critical thinking among upper-division students, or would critical thinking in this population be more effectively assessed by incorporating more sophisticated data analyses in the research scenarios? Also, if students with more expert-like thinking have a richer set of experimental scenarios to inherently use as contrasts when comparing, we might expect their responses on the individual evaluation questions and group comparisons to better align. To further examine how accessible and context-specific the Eco-BLIC and PLIC are, novel scenarios could be developed that incorporate topics and concepts more commonly addressed in upper-division courses. Additionally, if instructors offer students more experience comparing and contrasting experimental scenarios in the classroom, would students be more discerning on the individual evaluation questions?

While a single consensus definition of critical thinking does not currently exist [ 15 ], continuing to explore critical thinking in other STEM disciplines beyond biology and physics may offer more insight into the context-specific nature of critical thinking [ 22 , 23 ]. Future studies should investigate critical thinking patterns in other STEM disciplines (e.g., mathematics, engineering, chemistry) through designing assessments that encourage students to evaluate aspects of at least two experimental studies. As undergraduates are often enrolled in multiple courses simultaneously and thus have domain-specific knowledge in STEM, would we observe similar patterns in critical thinking across additional STEM disciplines?

Lastly, we want to emphasize that we cannot infer every aspect of critical thinking from students’ responses on the Eco-BLIC and PLIC. However, we suggest that student responses on the think-aloud interviews provide additional qualitative insight into how and why students were making comparisons in each scenario and their overall critical thinking processes.

Conclusions

Overall, we found that comparing and contrasting two different experiments is an effective and efficient way to elicit context-specific critical thinking in introductory biology and physics undergraduates using the Eco-BLIC and the PLIC. Students are more discerning (i.e., critical) and engage more deeply with the scenarios when making comparisons between two groups. Further, students do not evaluate features of experimental studies differently when individual evaluation questions are provided or removed. These novel findings hold true across both introductory biology and physics, based on student responses on the Eco-BLIC and PLIC, respectively, though there is much more to explore regarding the critical thinking processes of students across other STEM disciplines and in more advanced stages of their education. Undergraduate students in STEM need to think critically for career advancement, and the Eco-BLIC and PLIC are two means of measuring students’ critical thinking in biology and physics experimental contexts via comparing and contrasting. This research offers new insight into the types of questions that elicit critical thinking, which educators and researchers across disciplines can apply to teach and measure cognitive student outcomes. Specifically, we recommend that instructors incorporate more compare-and-contrast questions related to experimental design in their courses to efficiently elicit undergraduates’ critical thinking.

Supporting information

S1 Appendix

S2 Appendix

S3 Appendix

Acknowledgments

We thank the members of the Cornell Discipline-based Education Research group for their feedback on this article, as well as our advisory board (Jenny Knight, Meghan Duffy, Luanna Prevost, and James Hewlett) and the AAALab for their ideas and suggestions. We also greatly appreciate the instructors who shared the Eco-BLIC and PLIC in their classes and the students who participated in this study.

Funding Statement

This work was supported by the National Science Foundation under grants DUE-1909602 (MS & NH) and DUE-1611482 (NH). NSF: nsf.gov . The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.

Data Availability

Fostering students’ scientific literacy by reflective questioning: An identification, summarization, self-reflective questioning, and application (ISSA)-based flipped learning approach

  • Published: 11 August 2023

  • Shu-Chen Cheng,
  • Gwo-Jen Hwang (ORCID: orcid.org/0000-0001-5155-276X) &
  • Chih-Hung Chen


Developing students’ scientific literacy is the most important educational goal and challenge of the 21st century. Many studies have confirmed that flipped learning has significantly impacted learning science. Researchers indicate that the lack of an appropriate learning guidance strategy in the pre-class stage for flipped learning will influence students’ understanding of learning content and affect in-class learning activities. In order to tackle this problem, the present study proposed a flipped learning approach based on identification, summarization, self-reflective questioning, and application (ISSA), further exploring the influences on students’ scientific literacy, communication tendency, problem-solving tendency, learning motivation, and cognitive load. In addition, the study used a true experimental design to assess the effectiveness of the proposed learning method, and 58 university students were recruited to participate in a natural science course. The experimental group ( N  = 29) adopted the proposed learning approach, while the control group adopted the conventional flipped learning approach. The results showed that the experimental group had higher scientific literacy, communication tendency, problem-solving tendency, and extrinsic motivation than the control group. The interviews showed that the ISSA flipped learning method could improve students’ understanding of the learning content. In particular, the process of peer interaction promoted their self-reflection and scientific literacy skills.

Data availability

The data and materials are available upon request to the corresponding author.

Code Availability

Not applicable.


This study is supported in part by the Ministry of Science and Technology of Taiwan under contract numbers MOST 111-2410-H-011-007-MY3 and MOST 111-2410-H-142-013.

Author information

Authors and affiliations

Center for General Education, Chung Yuan Christian University, Taoyuan, Taiwan

Shu-Chen Cheng

Bachelor Degree Program of Digital Marketing, National Taipei University, New Taipei, Taiwan

Graduate Institute of Educational Information and Measurement, National Taichung University of Education, Taichung, Taiwan

Gwo-Jen Hwang

Graduate Institute of Digital Learning and Education, National Taiwan University of Science and Technology, Taipei, Taiwan

Yuan Ze University, Taoyuan, Taiwan

Master Program of Professional Teacher, National Taichung University of Education, Taichung, Taiwan

Chih-Hung Chen

Contributions

All authors contributed to the study conception and design. Material preparation, data collection and analysis were performed by Shu-Chen Cheng and Chih-Hung Chen. Project administration was performed by Shu-Chen Cheng and Chih-Hung Chen. Methodology and supervision were performed by Gwo-Jen Hwang and Chih-Hung Chen. The first draft of the manuscript was written by Shu-Chen Cheng and Chih-Hung Chen. All authors commented on previous versions of the manuscript. All authors read and approved the final manuscript.

Corresponding author

Correspondence to Gwo-Jen Hwang .

Ethics declarations

Conflicts of interest/competing interests.

The authors declare no potential conflict of interest in this study.

Ethics approval

The ethical requirements for research in this selected university were followed.

Consent to participate

The participants all agreed to take part in this study.

Consent for publication

The publication of this study has been approved by all authors.

About this article

Cheng, SC., Hwang, GJ. & Chen, CH. Fostering students’ scientific literacy by reflective questioning: An identification, summarization, self-reflective questioning, and application (ISSA)-based flipped learning approach. Educ Inf Technol (2023). https://doi.org/10.1007/s10639-023-12121-9

Received: 13 January 2023

Accepted: 02 August 2023

Published: 11 August 2023

DOI: https://doi.org/10.1007/s10639-023-12121-9


  • Flipped classroom
  • Flipped learning
  • Scientific literacy
  • Self-reflective questioning
  • Problem-solving
  • Open access
  • Published: 12 March 2024

Interactive learning environment as a source of critical thinking skills for college students

  • Hao Song &
  • Lianghui Cai

BMC Medical Education, volume 24, Article number: 270 (2024)

The cognitive skills underlying critical thinking include analysis, interpretation, evaluation, explanation, inference, and self-regulation. The study aims to examine the feasibility and effectiveness of introducing the mobile game Lumosity: Brain Training into the learning process of first-year Philology students at Qiqihar University.

The sample included 30 volunteers: 15 girls and 15 boys, with an average age of 18.4 years. Before the start of the experiment, the respondents took a pre-test based on the Critical Thinking Skills Success methodology developed by the American scientist Starkey. The hypothesis was that intensive one-month training with the Lumosity premium application in the classroom would improve critical thinking skills.

The pre-test results showed that many respondents already had quite good critical thinking skills before the experiment, as the average score was 22.13 out of 30 points. Effectiveness was evaluated using Student’s t-test for paired samples, which revealed a significant difference between the pre-test and post-test results (p = 0.012).

Conclusions

The research can be of interest to those who study the integration of an interactive learning environment into university and student programs, as well as those who consider critical thinking a field of scientific knowledge and seek to develop critical thinking skills. The novelty of the study lies in the fact that students were allowed to use the app only during classes, yet the research hypothesis was confirmed. This indicates that an interactive learning environment can serve as a tool for developing students’ critical thinking skills even in the context of limited screen time.

Introduction

Critical thinking is one of the thinking skills that relates to the cognitive abilities of a person. From a pedagogical perspective, there are two main approaches. The first approach considers critical thinking as a complex combination of knowledge, skills, and predispositions that can be intentionally developed using proper stimuli. According to the second interpretation, critical thinking is a general skill that is more or less impervious to pedagogical impact [ 1 ]. Critical thinking skills are among the priorities in education. They contribute to the state’s global economic competitiveness [ 2 ]. Even before the pandemic outbreak, critical thinking was studied in the context of creating an interactive learning environment as tablets and smartphones had spread so much that teachers began to actively use ICTs for better and individualized learning [ 3 ]. Critical thinking as a mental process is characterized by objectivity, perseverance, and involvement when a person has to face a problem, an opposing opinion, or disagreement. Carefully designed interactive learning settings create favorable conditions for reflection and critical thinking, promote the development of higher education through better student involvement, popularize online learning, and contribute to effective social interaction in the conditions of reduced face-to-face contacts [ 4 ]. The relevance of the research is due to the fact that mobile applications integrated into university programs can improve students’ critical thinking skills and create prerequisites for media literacy formation. The gadget’s screen can provide space and time to improve thinking skills by engaging users in a systematic and sustained critical discourse based on interactive tasks [ 5 ]. Moreover, information technologies enrich the mental experience of recipients and contribute to the development of reflexive skills needed for critical thinking.

Literature review

Critical thinking was analyzed within the framework of studying chemistry and physics using mobile learning tools [ 6 , 7 ]. The benefits of integrating gadgets were noted, as both students’ critical thinking skills and academic performance improved. Critical thinking improvement is often classified as the most important goal of formal education, as the ability to think critically ensures success, while teaching critical thinking skills is sometimes only part of a higher education curriculum [ 8 ]. Blended learning is often used for the development of critical thinking in the student environment. This approach is mainly based on a combination of hands-on classes, classroom discussions, online discussions, interactive simulators, and various assessment forms that involve and instruct students, as well as ensure productive asynchronous interaction with the teacher [ 9 ]. It has been shown that critical thinking, as well as academic performance, can be improved by using educational games such as role-play games [ 10 ]. Currently, mobile software is considered progressive in terms of developing critical thinking skills. This is because students adapt faster to any learning environment; they get the opportunity to actively participate in the learning process and solve interesting theoretical and practical problems based on critical thinking [ 11 ]. The analysis of pedagogical impact characteristics shows that effective critical thinking development is determined by many factors: personal (the student’s learning style and motivation), methodological (methods, pedagogical techniques, duration of classes, feedback), and contextual (classroom atmosphere, incentive system) [ 12 ]. It was concluded that gamification should be considered a valuable pedagogical tool that encourages users to master educational systems and demonstrate a higher engagement rate [ 13 ].
Critical thinking as an individual’s intellectual ability to analyze a problem using raw data and various solution strategies is a skill that can be improved relatively easily by appropriate cognitive stimulation [ 14 ]. Since the Enlightenment, critical thinking has been seen as the core of scientific innovation. However, after the Bologna Process, this consensus disappeared as other goals such as efficiency, professional relevance, mobility, result orientation, and competence came to the fore [ 15 ]. The promotion of critical thinking in higher education has been increasingly requested by society. Thus, for example, in 2017, the German authorities decided to introduce critical thinking into university curricula [ 16 ]. Moreover, in order to determine the optimal concept for a particular educational institution, it is necessary to analyze the framework conditions of pedagogical practice; set learning goals to develop thinking; promote critical thinking in university education; create a proper learning atmosphere; and implement and adapt a program for critical thinking development, which can also rely on an interactive platform. Critical thinking as a form of reasoning can represent “good thinking” and superior “higher-order” thinking. However, there is a need for a transcultural approach to critical thinking, as culturally specific ways of thinking and an assessment system that attaches particular importance to certain forms of reasoning can be derived only from the synthesis of social imagination and thinking, which are monolingual. For example, a focus on tolerance, common in Western Europe, may not dovetail with the worldview of a student who came from Asia [ 17 ]. In the context of an interactive learning environment, computational and computer skills that have common features with critical thinking are also studied.
These include the ability to solve a problem in an innovative way; the ability to determine exactly what should be derived from a problem in order to find an optimal solution; and the ability to reflect effectively on the most important elements of the problem and exclude irrelevant factors from the mental canvas [ 18 ]. Mobile learning has been identified as a strategy that can strengthen students’ higher-order thinking skills [ 19 ]. However, the effectiveness of mobile technology is also determined by the learning process that accompanies the use of an interactive learning environment [ 3 ]. At the same time, a strong foundation in the form of critical thinking means that students will be able to solve higher-order problems that require analysis, comparison, and evaluation earlier and with greater readiness. Moreover, students’ ability to assess the reliability of information while it is being taught, rather than after the fact, can make academic classes more productive [ 20 ]. Effective knowledge can be constructed by involving recipients in meaningful contexts of social interaction and life experience. Unfortunately, traditional pedagogy has failed to ensure this in order to develop the active role of students in education and improve their thinking skills and cognitive abilities, including critical thinking [ 21 ]. Creativity as a thinking skill is most often compared to critical thinking, as it is a higher-order mental skill that includes the ability to rely on lower-order thinking skills, generate innovative ideas, combine parts to create a whole, and plan the creation and development of new products in different spheres of life. Meanwhile, critical thinking has many special characteristics, as it is a specific process of clear thinking and rational analysis of facts and events [ 22 ].

The critical analysis of the literature reveals a consistent theme: the integration of mobile learning tools in education significantly enhances critical thinking skills. Studies by Astuti et al. [ 6 ] and Cahyana et al. [ 7 ] in the fields of chemistry and physics demonstrate the positive impact of mobile learning tools on both critical thinking and academic performance. This synergy of technology and education is further echoed in the findings of Leal et al. [ 11 ]. The authors note the rapid adaptation and active engagement of students in learning environments augmented by mobile software. This adaptation is not only confined to the absorption of knowledge but extends to the application of critical thinking in solving complex theoretical and practical problems. Similarly, the work of Lorencová et al. [ 12 ] highlights the multifaceted nature of critical thinking development. In this context, the researchers emphasize the importance of personal, methodological, and contextual factors in pedagogical practices. The above studies collectively suggest that the integration of technology in education, especially mobile learning tools, is a critical factor in enhancing critical thinking skills.

Furthermore, the synthesis of the results underscores the transformative role of innovative teaching methods like gamification and blended learning in fostering critical thinking. Borglum [ 9 ] and Rasyid et al. [ 10 ] illustrate how educational games and blended learning approaches, respectively, significantly facilitate the development of critical thinking skills. These methods, which combine traditional and digital pedagogical techniques, engage students more effectively, as well as provide diverse platforms for them to apply critical thinking.

Problem statement

Mobile games are popular leisure activities that can be used to improve mental functions. This suggests that users from all over the world can develop their problem-solving skills while enjoying the game. Numerous mobile games, including adventures, puzzles, car racing, shooters, and sports, can contribute to the cognitive development of a person, preparing them for faster decision-making, stimulating the brain, and rewarding successful task completion [ 18 ]. Critical thinking can be indirectly developed by video games offering veiled exercises in strategy, tactics, and problem solving, as seen in games like Plague Inc. or Minecraft. Furthermore, there are interactive mobile applications specifically aimed at developing critical thinking skills, such as Logic Master, Can You Escape, Skillz, Brain games, Brainiton, Unblock me, and others [ 19 ]. The analysis of the existing literature [ 23 , 24 , 25 , 26 , 27 ] shows that the discussion mainly focuses on assessing the overall impact of mobile games on cognitive development rather than on critical thinking skills. However, the latter are no less important, and this study aims to address this gap. Therefore, it explores how an interactive learning environment created with the mobile game Lumosity: Brain Training can develop critical thinking skills in first-year college students.

The research aims to empirically study the effectiveness of digital software introduction to improve critical thinking skills in the student environment. The following tasks have been set: formation of an experimental group to test the hypothesis; selection and execution of optimal software with a focus on age relevance; preliminary assessment of respondents’ critical thinking skills; and comparison of initial skills with those formed after repeated interaction with an application that ensures an interactive learning environment.

Lumosity: Brain Training presents a unique approach to cognitive development. This application focuses on enhancing various cognitive domains, such as memory, attention, flexibility, speed, and problem-solving. Based on neuropsychological tasks, the app provides a range of games that adapt to the user’s performance, ensuring a challenging and personalized experience. It tracks progress over time, offering users valuable feedback on their cognitive abilities and areas for improvement. The gamified nature of Lumosity increases engagement and motivation, which are crucial for consistent use and cognitive improvement.
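Lumosity’s actual adaptation algorithm is proprietary and not described in the study, but the performance-based adaptation mentioned above can be illustrated with a minimal, purely hypothetical sketch: a one-up/one-down staircase rule that raises difficulty after a correct response and lowers it after an incorrect one.

```python
# Illustrative sketch of performance-adaptive difficulty, as described for
# Lumosity-style games. The actual Lumosity algorithm is proprietary; this
# staircase rule (step up on success, down on failure) is a hypothetical
# stand-in, not the real implementation.

def adjust_difficulty(level: int, correct: bool,
                      min_level: int = 1, max_level: int = 10) -> int:
    """One-up/one-down staircase: step up on a correct answer, down otherwise."""
    step = 1 if correct else -1
    return max(min_level, min(max_level, level + step))

def run_session(responses: list[bool], start_level: int = 5) -> list[int]:
    """Track the difficulty trajectory over a session of responses."""
    levels = [start_level]
    for correct in responses:
        levels.append(adjust_difficulty(levels[-1], correct))
    return levels

trajectory = run_session([True, True, False, True, False, False])
# difficulty path: 5 -> 6 -> 7 -> 6 -> 7 -> 6 -> 5
```

A staircase like this converges on a difficulty level where the user succeeds roughly half the time, which keeps the task “challenging and personalized” in the sense described above.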

Methods and materials

The study primarily utilizes the Critical Thinking Skills Success Technique developed by American researcher Starkey [ 28 ]. This method was chosen due to its structure, which consists of a Pretest and a Post-test. The Pretest is designed to determine the initial level of the respondent’s critical thinking skills. It comprises 30 questions, each with four answer options, where only one is correct. Notably, there is no time limit for completing the test, and respondents are awarded 1 point for each correct answer. The Post-test is structured similarly to the Pretest, facilitating a comparison of results after the experimental part of the study. This method is intended for individuals over 16 years of age [ 29 ]. The questions from the test evaluate various aspects of critical thinking (problem solving, decision making, logical reasoning, analysis and evaluation of statements). Tasks range from evaluating arguments and conclusions to identifying logical errors and making evidence-based judgments [ 29 ].
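The scoring rule just described (30 four-option items, 1 point per correct answer, maximum 30 points) is simple enough to sketch in code; the answer key and responses below are hypothetical placeholders, not Starkey’s actual key.

```python
# Sketch of the Pretest scoring rule described above: 30 four-option items,
# 1 point per correct answer, 30 points maximum. The answer key and the
# response script here are hypothetical placeholders.

def score_pretest(responses: dict[int, str], answer_key: dict[int, str]) -> int:
    """Award 1 point for each item whose response matches the key."""
    return sum(1 for item, answer in answer_key.items()
               if responses.get(item) == answer)

answer_key = {i: "ABCD"[i % 4] for i in range(1, 31)}  # hypothetical key
responses = dict(answer_key)                           # a perfect script...
responses[7] = "A"                                     # ...with one item changed

print(score_pretest(responses, answer_key))  # prints 29
```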

The interactivity of the learning environment was ensured through the use of the Lumosity: Brain Training mobile application. This software primarily includes games that train memory, speed, flexibility of thinking, and problem-solving skills, which are crucial components of critical thinking. According to the developers, the game is based on cognitive, neuropsychological, and experimental tasks that are transformed into interactive games and puzzles. The study employs the Lumosity Premium subscription, which offers an individualized training program, unlimited access to games, detailed progress information, and recommendations for improving game accuracy, speed, and strategy. The application’s interface is illustrated in Fig. 1.

Fig. 1. Lumosity: Brain Training interface

It is important to note that the application under study does not specialize in critical thinking per se but provides comprehensive cognitive stimulation. This choice was made due to the limited availability of options on the market. Apps for critical thinking development on Google Play generally have low interactivity and tend to focus on disseminating information about critical thinking and methods of improving it, such as Critical Thinking, Critical Thinking Insight, Learn Critical Thinking Offline Guide, and others. On the App Store platform, there is only one app specifically aimed at developing critical thinking, the Critical Thinking Activities game by Ventura Educational Systems, and it targets elementary schoolchildren.

LumosLabs provided no support for this research and was not involved in the study. The monthly subscription for each member of the experimental group was funded by Qiqihar University. No material or non-material benefits were involved, and the premium version was chosen purely to evaluate the effectiveness of short, intensive use of this type of mobile application.

After one month of training, the experimental group was administered a 30-question post-test to measure progress in critical thinking. A dependent samples t-test was used to compare the means of the two related measurements.
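For illustration, the paired comparison described above can be sketched as follows. The scores are hypothetical and not the study’s data; the t-statistic is computed directly from the per-participant score differences:

```python
import math
from statistics import mean, stdev

# Hypothetical pre-/post-test scores (0-30 scale), not the study's data
pre  = [22, 18, 25, 20, 24, 21, 23, 19, 26, 22]
post = [24, 21, 27, 22, 25, 23, 26, 20, 28, 24]

def paired_t(before, after):
    """t-statistic of a dependent (paired) samples t-test: the mean of the
    per-participant differences divided by its standard error."""
    diffs = [b - a for a, b in zip(before, after)]
    return mean(diffs) / (stdev(diffs) / math.sqrt(len(diffs)))

t = paired_t(pre, post)
```

In practice, the p-value is then obtained from a t distribution with n − 1 degrees of freedom, for example via SPSS (as in this study) or a statistics library.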

This research also included a pilot study, whose objective was to test the feasibility of the methodologies and the effectiveness of the mobile applications in a smaller, controlled setting before full-scale implementation. This preliminary phase involved a select group of participants exposed to the same mobile applications and testing procedures as planned for the main study. The pilot study allowed an initial assessment of the research design, the suitability of the selected mobile applications for enhancing critical thinking, and the adequacy of the testing instruments, and provided an opportunity to refine the research methodology based on the feedback and observations gathered. The insights gained were instrumental in making the adjustments needed to ensure the validity and reliability of the main study. In particular, the pilot study helped determine the optimal frequency and duration of Lumosity: Brain Training sessions, making it possible to maximize cognitive benefits without causing fatigue or disinterest among participants, and confirmed that the psychometric tool was not misunderstood by participants.

Participants

The study involved 60 first-year student volunteers from the Faculty of Philology of Qiqihar University (mean age 18.4 years), divided into an experimental and a control group of equal size and composition. The sample was balanced by biological sex (15 males and 15 females in each group).

Taking the number of students in the faculty as the general population, the sampling error does not exceed 4.82%, which allows us to state that the sample is representative of the given educational institution and the age of the participants. First-year students were chosen because the critical thinking skills instilled over the course of university education are still minimally developed in them, allowing a more accurate judgment of the application’s impact. In addition, the intervention being assessed is intended primarily for high school students and first-year students.
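The representativeness claim rests on the standard finite-population margin-of-error calculation, sketched below. The faculty size N = 400 used in the example is a hypothetical value, not the actual enrolment; with the true faculty size and confidence level substituted, this is the quantity the text bounds at 4.82%:

```python
import math

def margin_of_error(n, N, z=1.96, p=0.5):
    """Margin of error (%) for a sample of n drawn from a finite population
    of size N, at confidence level z, with the conservative proportion p = 0.5."""
    fpc = math.sqrt((N - n) / (N - 1))  # finite population correction
    return 100 * z * math.sqrt(p * (1 - p) / n) * fpc

# Hypothetical faculty size of 400; the study's actual N is not reported here
e = margin_of_error(60, 400)
```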

Participation was open to anyone in the cohort. After taking the pre-test, the experimental group received a monthly premium subscription to the Lumosity: Brain Training app and used it exclusively in the classroom. The principle of voluntariness ensured relatively high motivation and academic discipline. The respondents were interested in the topic and gave written consent at the outset not to play the game outside the classroom and not to use other cognitive stimulation aimed at improving critical thinking skills, such as similar applications, video games, online courses, training sessions, or books. They could also leave the group at any stage without prior approval or sanctions.

Research design

This study involved 60 first-year philology students from Qiqihar University, randomly assigned to either the control or the experimental group. This randomization was a crucial step in the research design: it minimized systematic differences between the groups. By allocating students randomly, the study ensured that the groups were comparable in terms of potential confounding variables such as prior knowledge, cognitive abilities, and learning styles.

Both classes involved in the study were taught by the same teacher. This arrangement was critical for consistency in the delivery of course content and teaching methodology. With a single educator responsible for both the control and the experimental groups, it was possible to control for variables related to teaching style, instructor expertise, and interaction dynamics.

In the first stage of the study, respondents in both the experimental and control groups completed the Critical Thinking Skills Success pre-test to assess their critical thinking skills. Participants in the control group received no intervention and did not work with the described software. Both groups followed the same university curriculum; no differences in teaching methods or means, and no special means of developing critical thinking other than those described here, were applied to either group.

Participants in the experimental group were divided into micro-groups of 10 people and used the application for 30 min, 5 days a week; a total of 20 classes were conducted. The teacher was asked to encourage students at the beginning of each lesson and to provide feedback helping them summarize which games they had played and what results they had achieved. In the third week of the experiment, respondents were given the opportunity to play in pairs, which introduced an element of competition that energized the sessions. At the same time, the micro-group leaders emphasized introspection and cooperation rather than reinforcing competitive strategies. The participants took advantage of the premium subscription to choose the games they found most interesting, shared problem-solving strategies, and tracked their Lumosity Performance Index (LPI) scores for speed of thinking, memory, attention, flexibility of thinking, and problem solving. After one month, the philology students completed a post-test duplicating the one taken before the experiment, which made it possible to assess critical thinking before and after the mobile game’s introduction.

Data analysis

To study the impact of the intervention, the post-test results were compared with the pre-test results within the control and experimental groups, and the post-test results of the two groups were compared with each other. The Student’s t-test was used, with effect sizes estimated by Cohen’s d. The Student’s t-test was chosen because the analysis focuses on comparing the mean values of two groups, and this tool can show whether there is a statistically significant difference between them.
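The between-group effect size can be sketched with the standard pooled-standard-deviation formula for Cohen’s d; `cohens_d` is a hypothetical helper, and the scores are illustrative rather than the study’s data:

```python
import math
from statistics import mean, stdev

def cohens_d(a, b):
    """Cohen's d for two independent groups, using the pooled standard deviation."""
    na, nb = len(a), len(b)
    pooled_var = ((na - 1) * stdev(a) ** 2 +
                  (nb - 1) * stdev(b) ** 2) / (na + nb - 2)
    return (mean(a) - mean(b)) / math.sqrt(pooled_var)

# Illustrative post-test scores for two groups (not the study's data)
experimental = [24, 26, 25, 23, 27]
control      = [21, 22, 20, 23, 19]
d = cohens_d(experimental, control)
```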

To verify that the sample and pre-experimental design met the requirements of parametric methods, the pre-test results in both groups were checked with the Shapiro-Wilk test. The results (W = 0.894 for the control group and W = 0.901 for the experimental group) indicate that the samples come from distributions close to normal.

Statistical processing

Statistical data were processed in SPSS Statistics 23. The study relies on the Student’s t-test to assess the software’s impact on young people’s cognitive abilities in the field of critical thinking.

Research limitations

The Critical Thinking Skills Success methodology was initially developed as part of Starkey’s course (Critical Thinking Skills Success in 20 Minutes a Day), consisting of 20 lessons devoted to particular aspects of critical thinking, for example, problem recognition, finding resources, and inductive reasoning. Today the test is used in the scientific community without reference to Starkey’s textbook [ 30 ], which can itself be considered a limitation. The relatively short duration (1 month) and high frequency of software use (20 classes over 4 weeks) might also have affected the results. From a neuropsychological perspective, the brains of first-year students are quite plastic, but consolidating positive changes takes time. The respondents’ age also limits how far the experimental trends can legitimately be generalized: for example, the brains of master’s students function differently due to age-related changes and greater rigidity, a natural part of ontogenesis, even though master’s students, like first-year students, belong to the youth demographic. Finally, when testing the hypothesis that critical thinking can be purposefully developed in the classroom, software use was restricted by the university, so the experimental group could not use the application in their free time to further develop basic thinking functions and critical thinking skills.

Ethical issues

This study complies with basic ethical standards. The principle of research benefit was fully implemented: participants not only had the opportunity to test their critical thinking skills free of charge but could also develop these skills using the premium subscription sponsored by Qiqihar University. The principle of fair sample selection was partially implemented, as first-year students of the Faculty of Philology were enrolled as volunteers willing to join the experiment. However, the sample was limited to one faculty and one year of study, and each group included exactly 15 young men and 15 young women, meaning that some volunteers may have been excluded on the basis of biological sex. From the very beginning, respondents knew they were involved in a study evaluating the effectiveness of mobile applications in developing critical thinking skills. The principle of respect for respondents’ personality and autonomy was ensured at the design stage, as the consent document included the option to refuse participation at any stage. The group involved only adult volunteers interested in the assessment and development of critical thinking, and there was no risk to the physical or mental health of the first-year students. Prior to the empirical experiment, participants signed informed consent forms covering research participation, research duration, and confidentiality.

Results

The primary data analysis shows that before the start of the study, first-year students were relatively competent in critical thinking. The pre-test mean was 22.13 out of 30 points, with wide variability: the minimum score was 14 points and the maximum 28 points. The primary distribution data are presented in Table  1 .

The interactive learning environment contributed to the development of critical thinking in the experimental group: the lower post-test limit was 19 points, the upper limit was 30, and the mean increased to 24.50 points, indicating that the app use genuinely stimulated brain activity. However, in some cases the post-test result was lower. This may be due to weakening interest in the study, situational factors, fatigue (the intensive classes being a limiting parameter), overconfidence, and the fact that the post-test contains vocabulary from the Starkey methodology, for example, the names of logical errors, that did not appear in the pre-test [ 28 ]. The results were compared for statistical significance using the Student’s t-test for related samples. The analysis showed significant differences between the pre- and post-test values ( p  = 0.012) at the accepted level of p  = 0.05. The data are shown in Table  2 .

Comparison of the pre-test and post-test results for the control group shows that the changes in the level of critical thinking in this group are not statistically significant ( p  = 0.891), with a negligible effect size (0.29). In the experimental group, by contrast, the increase in mean scores (Table  1 ) from 22.13 to 24.5 is statistically significant ( p  = 0.012), with a noticeable effect size (0.88). The comparison of post-test results between the experimental and control groups is also statistically significant ( p  = 0.028), with a large effect size (1.22), indicating a strong influence of the studied variable on critical thinking in the experimental group.

As p < 0.05, it can be concluded that there are statistically significant differences in critical thinking levels before and after the use of the premium app subscription. Moreover, given that the t-statistic is negative (-2.679), a statistically significant increase in critical thinking after intensive one-month software use can be noted. This indicates an effective impact of the cognitive stimulation on the recipients’ brains. The research hypothesis was confirmed, which suggests that constructing an interactive environment in the classroom has a positive effect on the development of mental skills, including critical thinking.

The reported effect sizes clarify the practical implications of the research findings, especially regarding the impact of interactive learning environments on the critical thinking skills of first-year students. The large effect size of 0.88 in the experimental group indicates a substantial improvement in critical thinking skills due to the use of the interactive mobile application: the average performance of students in the experimental group was markedly better than in the control group, which suggests the intervention was effective.

Furthermore, an even more significant effect size of 1.22 in the comparison of post-test results between the experimental and control groups underscores the efficacy of the intervention. Such a large effect size implies that students using the interactive learning tool significantly improved their critical thinking skills compared to their initial abilities. Moreover, they considerably outperformed their peers in the control group. This finding demonstrates that the interactive learning environment was effective and superior to traditional teaching methods or non-interactive learning environments in enhancing critical thinking skills.

On the other hand, in the control group, there was no significant change in critical thinking levels. The small effect size observed in this group emphasizes the limited development of these skills without interactive intervention. This contrast further validates the value of the interactive tools used in the experimental group.

In essence, the effect sizes from the study provide compelling evidence of the practical effectiveness of interactive learning environments in developing critical thinking skills. The results suggest that within educational settings, interactive mobile applications can substantially enhance critical thinking abilities among students. Thus, the findings offer important implications for educational practices and the potential of technology-enhanced learning in fostering essential cognitive skills.

This indicates the statistical significance of the results: a p-value below 0.05 supports the hypothesis that the intervention (an interactive mobile application) developed critical thinking skills. The effect sizes of 0.88 and 1.22 are considered large by Cohen’s conventions, where 0.2 represents a small effect, 0.5 a medium effect, and 0.8 a large effect. The use of interactive mobile applications is therefore reasonable for developing critical thinking skills among college students. This conclusion has implications for the educational environment: integrating technological learning tools into curricula can be a way to develop critical thinking skills in different parts of the world, and interactive learning has the potential to significantly affect mental abilities.
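Cohen’s conventional thresholds cited above can be expressed as a small lookup; `interpret_d` is a hypothetical helper name used for illustration:

```python
def interpret_d(d):
    """Label an effect size |d| by Cohen's conventions:
    0.2 small, 0.5 medium, 0.8 large."""
    d = abs(d)
    if d >= 0.8:
        return "large"
    if d >= 0.5:
        return "medium"
    if d >= 0.2:
        return "small"
    return "negligible"

# Both effects reported in the study fall in the "large" band
interpret_d(0.88)  # "large"
interpret_d(1.22)  # "large"
```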

The results of the empirical study confirm the earlier conclusions [ 31 , 32 , 33 ] that indicate a positive correlation between the use of technology, especially interactive learning environments, and the level of critical thinking skills. The improvement observed in the experimental group compared to the control group highlights the effectiveness of educational technologies in the development of cognitive skills. This finding is consistent with the conclusions of other authors [ 8 , 10 , 34 , 35 ]. Previous studies have also substantiated the positive role of digital tools in learning outcomes. The article, among other things, expands the dialogue on the need to adapt educational methods to the digital landscape [ 36 , 37 ].

The results continue the debate about the longevity and sustainability of cognitive improvements after mobile gaming. Therefore, the study prompts further research into the long-term effects of constant use of brain training applications. This aspect echoes the manuscripts that have studied the dynamics of interactive intervention on cognitive functions [ 33 , 38 , 39 ].

The study also opens up opportunities for future research, including comparative analysis of different demographic groups and educational contexts. Additional studies can ensure the universality of the observed benefits. This approach calls for a more detailed understanding of how different populations interact with and benefit from educational technologies. At the same time, the potential of applications such as Lumosity: Brain Training reveals new prospects for personalized education, as was also discussed earlier [ 6 , 12 , 14 , 40 ].

It is also necessary to address the study’s limitations to make the perception of the results more holistic. The study focuses on a specific demographic group from Qiqihar University, which limits generalizability, since the results may not apply to a wider and more diverse population. In addition, the gender distribution was not proportional, which may increase the study’s bias. The duration of the study and the intensity of software use are factors that could also have affected the results, and the limited sample of 30 volunteers in the experimental group may not reflect the broader population of first-year philology students. The design implied limited use of the application in the classroom, whereas its use in real life may differ. The measurement of critical thinking skills may also depend on situational factors. These limitations define the range of future research needed to fully assess the educational potential of mobile applications such as Lumosity: Brain Training for developing critical thinking skills in college students.

Conclusion

In the course of the theoretical and empirical research, it was found that mobile applications aimed at developing brain activity can improve students’ critical thinking skills. An interactive learning environment stimulates brain activity and helps learners develop memory, speed, flexibility of thinking, and problem-solving skills. The research hypothesis has been confirmed: one month of training based on the Lumosity premium subscription improved the critical thinking skills of first-year students of the Faculty of Philology. The sample consisted of 30 volunteers who, prior to the experiment, took the Critical Thinking Skills Success pre-test to assess their initial level of critical thinking. They then participated in a four-week training course of 20 classes of 30 min each, during which they could play the games that attracted them most; when choosing a game, learners could rely on the individual plan available within the monthly subscription or ignore the program’s suggestions. Under the informed consent agreement, the experimental group did not use the application outside the classroom and did not try to improve critical thinking skills through online courses, similar applications, or books; this was done to assess the potential of app use in the classroom. Later in the experiment, students were allowed to team up and play together to increase engagement and maintain interest. At the end of the premium subscription period, the experimental group took the Critical Thinking Skills Success post-test. It was empirically demonstrated that the Lumosity premium subscription contributed to an increase in the students’ critical thinking skills, indicating that critical thinking can be developed directly within the academic program, without relying on extracurricular use of brain-training games, in which students may quickly lose interest.

Critical thinking will be developed more effectively when it is taught within media literacy, sociology, and psychology disciplines and has a relatively long and purposeful impact on the student’s brain activity. The practical value of the research lies in the fact that brain-training applications can now be integrated into the learning process not only in the classroom but also in extracurricular settings; for example, a campus competition could allow students to demonstrate progress in a brain-training game and have fun. The scientific value of the research lies in the empirical two-stage study comparing the development of critical thinking before and after cognitive stimulation implemented with the help of the application. This promotes the discussion about the benefits of mobile software in higher education and motivates app developers to expand the range of offerings aimed at developing critical thinking in adolescents and young people. The research results can be used to develop mathematics, journalism, sociology, linguistics, and psychology curricula and to form the basis of a media literacy course. The study may interest those who work with interactive learning environments or seek to develop critical thinking skills. A future research direction is to evaluate the effectiveness of the Lumosity: Brain Training app on other samples of students who are not volunteers; it would also be useful to study the program with technical students and to compare the effectiveness of one-month and three-month use of the game.

Data availability

The datasets used and/or analysed during the current study are available from the corresponding author (Lianghui Cai, [email protected]) on reasonable request.

Ellerton P. On critical thinking and content knowledge: a critique of the assumptions of cognitive load theory. Think Skills Creat. 2022;43:100975. https://doi.org/10.1016/j.tsc.2021.100975 .

Giacomazzi M, Fontana M, Trujillo CC. Contextualization of critical thinking in sub-saharan Africa: a systematic integrative review. Think Skills Creat. 2022;43:100978. https://doi.org/10.1016/j.tsc.2021.100978 .

Ismail NS, Harun J, Salleh S, Zakaria MAZM. Supporting students critical thinking with a mobile learning environment: A meta-analysis. In: Gómez Chova L, López Martínez A, Candel Torres, I, editors. 10th International Technology, Education and Development Conference. Valence, Spain: INTED2016 Proceedings, 2016, pp. 3746–3755.

Yafie E. Collaborative Mobile Seamless Learning (CMSL) based on android apps to improving critical thinking in higher education in the post-covid-19 era. JARDCS. 2020;12:428–41. https://doi.org/10.5373/JARDCS/V12SP7/20202125 .

McCann S. Higher order mLearning: critical thinking in mobile learning. MODSIM World. 2015;208:1–11.

Astuti IAD, Dasmo D, Nurullaeli N, Rangka IB. The impact of pocket mobile learning to improve critical thinking skills in physics learning. J Phys Conf Ser. 2018;1114(1):012030. https://doi.org/10.1088/1742-6596/1114/1/012030 .

Cahyana U, Fitriani E, Rianti R, Fauziyah S. Analysis of critical thinking skills in chemistry learning by using mobile learning for level x. IOP Conf Ser Mater Sci Eng. 2018;434(1):012086. https://doi.org/10.1088/1757-899X/434/1/012086 .

Norouzi M, Samet A, Sharifuddin RSB. Investigate the effect of mobile learning over the critical thinking in higher education. Adv Nat Appl Sci. 2012;6(6):909–16.

Borglum RN. The effects of blended learning on critical thinking in a high school Earth Science class. Dissertations and Theses. Northern Iowa: University of Northern Iowa; 2016.

Rasyid A, Iswari RI, Marwoto P. The effectiveness of mobile learning role play game (rpg) maker mv in improving students’ critical thinking ability. J Phys Conf Ser. 2020;1567(4):042088. https://doi.org/10.1088/1742-6596/1567/4/042088 .

Leal LR, Ortega MV, Montes LP. Mobile devices for the development of critical thinking in the learning of differential equations. J Phys Conf Ser. 2019;1408(1):012015. https://doi.org/10.1088/1742-6596/1408/1/012015 .

Lorencová H, Jarošová E, Avgitidou S, Dimitriadou C. Critical thinking practices in teacher education programmes: a systematic review. Stud High Educ. 2019;44(5):844–59. https://doi.org/10.1080/03075079.2019.1586331 .

Bouchrika I, Harrati N, Wanick V, Wills G. Exploring the impact of gamification on student engagement and involvement with e-learning systems. Interact Learn Environ. 2021;29(8):1244–57. https://doi.org/10.1080/10494820.2019.1623267 .

Lestari SW, Agung L, Musadad AA. Android based adventure games to enhance vocational high school students’ critical thinking skills. In: Saddhono K, Ardianto DT, Hidayatullah MF, Cahyani VR, editors. Seword Fressh 2019: Proceedings of the 1st Seminar and Workshop on Research Design, for Education, Social Science, Arts, and Humanities (Vol. 115). Surakarta, Central Java, Indonesia: SEWORD FRESSH; 2019, pp. 115–120. https://doi.org/10.4108/eai.27-4-2019.2286917 .

Kruse O. Kritisches Denken als Leitziel Der Lehre. Auswege Aus Der Verschulungsmisere. J für Wissenschaft Und Bildung. 2010;19(1):77–86. https://doi.org/10.25656/01:16350 .

Jahn D, Kenner A, Kergel D, Heidkamp-Kergel B. Kritische Hochschullehre. Fachmedien Wiesbaden: Springer; 2019.

Song X. Critical thinking and pedagogical implications for higher education. East Asia. 2015;33(1):25–40. https://doi.org/10.1007/s12140-015-9250-6 .

Fernandez JM, Palaoag TD, Dela Cruz J. An assessment of the mobile games utilization and ıt’s effect to one’s computational thinking skills. IJITEE. 2019;8(9S2):548–52. https://doi.org/10.35940/ijitee.I1115.0789S219 .

Ryua HB, Parsonsb D, Leea H. Using game-based collaborative learning to enhance critical thinking skills. Adv Affect Pleasurable Des. 2014;19(5):461–75.

Kadhom Faroun I. What is critical thinking? Samawah: Al Muthanna University College of Basic Education, 2021. https://doi.org/10.13140/RG.2.2.36795.95525 .

Rogti M. Critical thinking as a social practice: the interrelationship between critical thinking engagement, social interaction, and cognitive maturity. Lang Lit J. 2021;21:180–90.

Tso A, Ho W. Creativity and critical thinking in practice. J Commun Educ. 2021;5(1):1–2.

Paul R, Elder L. The miniature guide to critical thinking concepts and tools. Rowman & Littlefield; 2019.

Ennis RH. Critical thinking across the curriculum: a vision. Topoi. 2018;37:165–84.

Shiraev EB, Levy DA. Cross-cultural psychology: critical thinking and contemporary applications. Routledge; 2020.

Black M. Critical thinking: an introduction to logic and scientific method. Pickle Partners Publishing; 2018.

Malik RS. Educational challenges in 21st century and sustainable development. JSDER. 2018;2(1):9–20.

Starkey LB. Critical thinking skills success in 20 minutes a day. LearningExpress; 2010.

Lutsenko EL. Adaptation of L. Starkey’s critical thinking skills success test. Bull Kharkiv Natl Univ Named after V N Karazin Series: Psychol. 2014;55:65–70.

Talov DP, Orlova AV. Problems of psychological diagnostics of critical thinking in adolescents. Herzen Readings: Psychol Res Educ. 2020;3:691–6.

Noone C, Hogan MJ. A randomised active-controlled trial to examine the effects of an online mindfulness intervention on executive control, critical thinking and key thinking dispositions in a university student sample. BMC Psychol. 2018;6(1):13. https://doi.org/10.1186/s40359-018-0226-3 .

Palavan Ã. The effect of critical thinking education on the critical thinking skills and the critical thinking dispositions of preservice teachers. Educ Res Rev. 2020;15(10):606–27. https://doi.org/10.5897/ERR2020.4035 .

Khoiriyah U, Isnaini UP, Utami RF, Djunet NA, Wijayanti PM, Saputra FA. Stimulating critical thinking skills through Critical Thinking Question List (CTQL). In Khan A, Rashid A, Uddin J, Wahid HS, editors. International Conference on Medical Education (ICME 2021). Yogyakarta Indonesia: Atlantis Press, 2021; pp. 24–28. https://doi.org/10.2991/assehr.k.210930.005 .

Kusmaryani W, Musthafa B, Purnawarman P. The influence of mobile applications on students’ speaking skill and critical thinking in English language learning. J Phys Conf Ser. 2019;1193(1):012008. https://doi.org/10.1088/1742-6596/1193/1/012008 .

Srilaphat E, Jantakoon T. Ubiquitous flipped classroom instructional model with learning process of scientific to enhance problem-solving skills for higher education (UFC-PS model). High Educ Stud. 2019;9(1):76–85. https://doi.org/10.5539/hes.v9n1p76 .

Mansbach J. Using technology to develop students’ critical thinking skills. Northwestern University School of Professional Studies. Distance Learning; 2015.

Chang CY, Kao CH, Hwang GJ, Lin FH. From experiencing to critical thinking: a contextual game-based learning approach to improving nursing students’ performance in Electrocardiogram training. ETRD. 2020;68(3):1225–45. https://doi.org/10.1007/s11423-019-09723-x .

Jantakoon T, Piriyasurawong P. Flipped classroom instructional model with mobile learning based on constructivist learning theory to enhance critical thinking (FCMOC model). J Theor Appl Inf Technol. 2018;96(16):5607–14.

Wang Q, Woo HL, Zhao J. Investigating critical thinking and knowledge construction in an interactive learning environment. Interact Learn Environ. 2009;17(1):95–104. https://doi.org/10.1080/10494820701706320 .

Kenyon T. Critical thinking for engineers and engineering critical thinking. In: Silva F. MFD, editor. 2016 2nd International Conference of the Portuguese Society for Engineering Education (CISPEE), Vila Real, Portugal: IEEE, 2016; pp. 1–4. https://doi.org/10.1109/CISPEE.2016.7777736 .

Acknowledgements

Not applicable.

Funding

This research did not receive any specific grant from funding agencies in the public, commercial, or not-for-profit sectors.

Author information

Authors and affiliations

School of Civil Commercial and Economic Law, Henan University of Economics and Law, Zhengzhou, China

School of Communication Arts, Wuhan Qingchuan University, WuHan, China

Lianghui Cai

Contributions

HS and LC contributed equally to the experimentation. HS and LC wrote and edited the article, equally designed and conducted the experiment, and studied the scientific literature on the topic. All authors read and approved the final manuscript.

Corresponding author

Correspondence to Lianghui Cai.

Ethics declarations

Ethics approval and consent to participate

The authors declare that the work was carried out with due consideration of ethical standards. The study was conducted in accordance with the ethical principles approved by the Ethics Committee of Qiqihar University (Protocol No. 6 of 13.06.2023). All participants provided written informed consent.

Consent for publication

Competing interests

The authors declare no competing interests.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.

Reprints and permissions

About this article

Cite this article

Song, H., Cai, L. Interactive learning environment as a source of critical thinking skills for college students. BMC Med Educ 24, 270 (2024). https://doi.org/10.1186/s12909-024-05247-y

Received: 30 September 2023

Accepted: 01 March 2024

Published: 12 March 2024

DOI: https://doi.org/10.1186/s12909-024-05247-y

Keywords

  • Critical thinking
  • Interactive learning environment
  • Mental efficiency
  • Mobile learning
  • Soft skills

BMC Medical Education

ISSN: 1472-6920
