David,
I fully agree; you make many excellent points. I will forward your message
to one of our colleagues who is making an effort to change things at our
local Science Fairs.
I apologize to others in this discussion group for not relaying comments I
received sooner, but I have been swamped with teaching and Grant Proposals.
Here are some excerpts of useful comments:
...I agree with you wholeheartedly that there is a large public
misperception about how science works, but I'm not sure that what you are
writing will alleviate that problem for the people who read it. The major
thing I think it does get across is the tentativeness of all scientific
conclusions; my concern is that there is a great deal more to how science
works that is still unaddressed.
I consider the following books to be good examples that provoke ideas and
discussions about how science truly works. They are quite contrary to the
simplistic and inaccurate hypothetico-inductive model that we teach as "The
Scientific Method."
Chalmers, A.F. What is this Thing Called Science?
Kuhn, Thomas. The Structure of Scientific Revolutions.
Hull, David. Science as a Process.
Doug Jensen
...I think it helps to put emphasis on quantitative predictions that follow
from a null hypothesis and from an alternative hypothesis. In many cases a
problem comes from a failure to get specific predictions. If you cannot
justifiably predict a specific quantitative difference, you can turn the
question around and predict no difference; by failing to support this
latter prediction, you lend support to the former. The use of the null
hypothesis in this way should be emphasized...
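
A minimal sketch of this idea in Python (the seed-germination counts and
the 70% alternative prediction below are invented for illustration), using
SciPy's chi-square goodness-of-fit test to compare one observed data set
against the quantitative predictions of both hypotheses:

    # Invented example: 100 seeds, comparing observed germination counts
    # against the quantitative predictions of a null and an alternative
    # hypothesis.
    from scipy.stats import chisquare

    observed      = [62, 38]   # germinated, not germinated
    null_expected = [50, 50]   # H0: treatment has no effect (50% germinate)
    alt_expected  = [70, 30]   # H1: treatment raises germination to 70%

    # A small p-value means the observed counts are unlikely under that
    # prediction; with these made-up numbers the null prediction is the
    # one that fails.
    print(chisquare(observed, f_exp=null_expected))  # tests H0's prediction
    print(chisquare(observed, f_exp=alt_expected))   # tests H1's prediction
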
...There is nothing "wrong" with getting results that are unlikely. That
just means that you have obtained results that are less common, but it does
not mean that the results are incorrect or poorly observed or in any way
flawed. Variation is something many biology students have a hard time with,
and they need to accept that variation is a good thing. Before you get into
the concept of confidence levels with your students, I would strongly
advise that you assign some sort of worksheet in which they are required to
work through data sets and calculate means and standard deviations. You can
then ask them what, if anything, is "wrong" or "right" with the individuals
that differ from the population mean. This might lead students to consider
variation, fitness, and other concepts...
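
A minimal sketch of such a worksheet calculation in Python, using the
standard statistics module and an invented set of leaf-length measurements:

    # Invented data set: leaf lengths in cm.
    import statistics

    leaf_lengths = [4.2, 5.1, 4.8, 6.0, 5.5, 4.9, 5.3, 4.4, 5.0, 5.8]

    mean = statistics.mean(leaf_lengths)
    sd = statistics.stdev(leaf_lengths)  # sample standard deviation

    print(f"mean = {mean:.2f} cm, standard deviation = {sd:.2f} cm")

    # Individuals far from the mean are not "wrong"; they simply show the
    # natural variation in the sample.
    for x in leaf_lengths:
        print(f"{x:.1f} cm deviates from the mean by {x - mean:+.2f} cm")
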
... Another reason for never being 100% sure is that chance can give you
ten heads in a row, even with an honest coin. And this can occur ten times
in a row, again with an honest coin. Thus it IS possible to conclude,
falsely, that your coin is biased when in truth it is not.
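
To put rough numbers on the coin example: the chance of ten heads in a row
from a fair coin is (1/2)^10, about 1 in 1024, so with enough sets of ten
flips such a run will eventually appear. A short Python sketch (simulated
flips, not real data):

    # Probability of ten heads in a row with a fair coin, plus a quick
    # simulation to show that such runs do occur by chance.
    import random

    p_ten_heads = 0.5 ** 10
    print(f"P(10 heads in a row) = {p_ten_heads:.5f}"
          f" (about 1 in {1 / p_ten_heads:.0f})")

    trials = 100_000
    all_heads = sum(
        all(random.random() < 0.5 for _ in range(10)) for _ in range(trials)
    )
    print(f"{all_heads} of {trials} simulated sets of 10 flips were all heads")
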
This leads to the notion that even after a hypothesis is supported by a
set of data, it still is not considered sufficient to PROVE that the
hypothesis is the "truth". This is another thing that I have seen trip up
lots of students. Many of them feel that if their data do not prove
something, they have somehow failed or done something wrong.
Unfortunately, instructors tend to reinforce this by giving students
exercises in which known differences are expected to be found.
Give students two sets of data that support the null hypothesis (i.e., no
difference between the two data sets) and there will always be some
students who conclude that the data sets differ simply because the means
are not identical. I think that they do this because they have been
conditioned to look for differences. You might want to give students
exercises with data sets to examine whether the results are the same...
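
One possible way to build such an exercise (a sketch with invented
population parameters, not taken from the comment above): draw two samples
from the same simulated population, then let students check whether a
t-test actually finds a significant difference (usually it does not):

    # Two samples drawn from the SAME simulated population: their means
    # will rarely be identical, yet the null hypothesis of "no difference"
    # is usually not rejected.
    import numpy as np
    from scipy.stats import ttest_ind

    rng = np.random.default_rng(1)
    sample_a = rng.normal(loc=10.0, scale=2.0, size=20)
    sample_b = rng.normal(loc=10.0, scale=2.0, size=20)

    print(f"mean A = {sample_a.mean():.2f}, mean B = {sample_b.mean():.2f}")
    t_stat, p_value = ttest_ind(sample_a, sample_b)
    print(f"t = {t_stat:.2f}, p = {p_value:.3f}")  # typically p > 0.05 here
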
Scott Meissner
I have my students read this paper:
Platt, J. R. (1964). "Strong Inference." Science 146(3642): 347-353.
It is old, and it may not be the best paper in the world on the scientific
method (I am anything but an expert in the philosophy of science), but it
does give students a framework for approaching primary research articles
for the rest of the semester (and, one would hope, the rest of their
careers), one that focuses on identifying disprovable hypotheses and valid
tests of them.
-Tom Jacobs
on 4/12/01 8:19 AM, "David W. Kramer" at kramer.8 at osu.edu wrote:
> I've read all the excellent messages in this thread and can't resist
> jumping in!!
>
> I agree on the importance of students learning and following the scientific
> method (even though scientists still debate what the exact steps are! or
> what they should be called!) but we should be willing to rise above purely
> semantic issues, accepting the use of several terms to describe the various
> steps as long as the basic observation/problem/hypothesis/data gathering
> and analysis/conclusion/communication rubric is there.
>
> Most of us at one time or another have criticized 1) science fair projects
> and 2) the teachers who make these assignments (seemingly with limited
> understanding of the process of science). We volunteer as judges and are
> too often dismayed by what usually amounts to a lack of instruction or
> faulty instruction of the students. So we criticize. After all, we're
> trained to be critical of what we see!! Some of us decide that it isn't
> worth our time to continue to be involved in such a poorly conceived and
> executed endeavor.
>
> But: If we are not part of the solution, we are part of the problem!
>
> Let us rejoice that at the very least the science fair introduces thousands
> of students to the joys and frustrations of hands-on science. At least
> they are not just reading about science or passively watching a video.
> They are DOING science. But we need not ignore the fact that they could be
> doing it better.
>
> Rather than simply criticizing how it is being done poorly in too many
> cases, let's try to think of ways of improving the process. Here are just
> two ideas:
>
> 1) Volunteer to visit at least a few schools in your area that have science
> fair projects to talk to the students (and their teachers... insist that
> the teachers stay in the room so they can give this lecture next year)
> about the scientific method, i.e., the process. Don't just name the steps
> but explain the logic, the role of each step. Use this as an OPPORTUNITY
> TO TEACH THAT in everyday speech they use "theory" the way scientists use
> "hypothesis"... a predicted explanation that needs to be tested. AND in
> science we use the word "theory" also but to mean something very different:
> A principle that has a very high degree of predictability, that has been
> supported with data under a wide variety of circumstances (e.g., over a
> long time, in many places, under many conditions, with many different
> organisms, etc.). WE NEED TO EXPLAIN THAT the Cell Theory, the Theory of
> Evolution, etc. are NOT HYPOTHESES but are more akin to the LAWS of the
> physical sciences.
>
> Next year go to several new schools. After several years you will have
> educated lots of students and their teachers.
>
> 2) We also need to provide students and their teachers with a one-page
> (well, maybe two!) explanation of some very basic (developmentally
> appropriate) statistical analyses. What is a mean? a median? a standard
> deviation? a percentage? tests of statistical significance? When should
> these various analytical techniques be used? What do they show us? I
> encounter lots of students (even in my college classroom!) who can calculate
> percentages, for example, but have no clue as to why you would want to do
> so!
>
> Students also need to know the value of making data sets visual through
> graphing techniques. Can you graph a hypothesis even before you have
> gathered data? (What should the graph look like if the hypothesis is
> supported by the data?) When is it appropriate (and inappropriate) to use
> a line graph? bar graph? pie chart? I encounter two problems with
> statistical analysis in science fair projects: 1) too little data, 2)
> incorrect choice of analytical tool. The students have access to some
> fairly sophisticated software tools and many students know how to use them
> to construct beautiful graphs but they don't understand which kind of graph
> is used for which purpose.
>
> I think we can do a better job of supporting science fair and similar
> programs. Our professional societies can publish a brochure on the
> scientific method and another one on statistical analysis and make these
> available to our members to distribute when we 1) train teachers (WE are
> the ones who prepare the teachers!!), 2) talk to students about science
> fair projects, and 3) judge science fair projects. The Botanical Society
> of America would be pleased to do this. Send me your suggestions.
>
> Dave
>
> *********************
> David W. Kramer, Chair
> Education Committee
> Botanical Society of America
> http://www.botany.org
>
> Asst. Prof. of Evolution, Ecology, and Organismal Biology
> Ohio State University at Mansfield
> 1680 University Drive
> Mansfield, OH 44906-1547
> Phone: (419) 755-4344 FAX: (419) 755-4367
> e-mail: kramer.8 at osu.edu
--
Grant R. Cramer
Associate Professor
Mail Stop 200
Department of Biochemistry
University of Nevada
Reno, NV 89557
phone: (775) 784-4204
fax: (775) 784-1650
email: cramer at unr.edu
web page: http://gcramer-mac.ag.unr.edu/index.html
---