Supercomputer access was originally reserved for graduate students and research scientists. In 1988, however, this power was made available to high school students through a national contest called SuperQuest, sponsored by ETA Systems, a division of Control Data Corporation. After Control Data closed down ETA Systems and terminated SuperQuest in 1989, this exciting educational initiative was picked up by several of the national supercomputing centers, including the Cornell Theory Center, the National Center for Supercomputing Applications, and the University of Alabama/Alabama Supercomputing Network, with joint sponsorship from the National Science Foundation and various private corporations. To be ready for the technical expertise that will be required in the next century, both students and teachers must learn to use high performance computing and computational science as powerful investigative tools.
As the teacher whose students won the ETA10-P supercomputer, the million-dollar grand prize in that first SuperQuest, I have felt an obligation to develop curriculum appropriate for the secondary school level that would utilize our supercomputer. Although the ETA10-P at Thomas Jefferson High School for Science and Technology is no longer working due to an unfortunate leak in the roof of our computer lab, some of the materials we have developed can be used on even modest computers. At Jefferson, we are using modest Pentium systems in a networked UNIX (Linux) environment. By utilizing PVM (Parallel Virtual Machine), a parallel programming environment developed at Oak Ridge National Laboratory, we are able to attain computational power comparable to that of our former supercomputer.
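The way PVM recovers supercomputer-class power from a room of networked PCs is the master/worker pattern: one program splits a job into pieces, farms the pieces out to processes on different machines, and combines the partial results. PVM itself is a C/Fortran message-passing library (with calls such as pvm_spawn, pvm_send, and pvm_recv); the sketch below is only an analogy in Python's standard multiprocessing module, with an invented example task, not actual PVM or course code.

```python
# Master/worker parallelism in the spirit of PVM's message-passing model.
# This is a Python multiprocessing analogy, not PVM code: the "workers"
# are local processes rather than daemons on networked Linux boxes.
from multiprocessing import Pool

def worker(chunk):
    # Each worker handles its own slice of the data independently.
    # The task here (sum of squares) is purely illustrative.
    return sum(x * x for x in chunk)

def parallel_sum_of_squares(data, nworkers=4):
    # Master: split the data into one chunk per worker, farm the
    # chunks out, and combine the partial results.
    chunks = [data[i::nworkers] for i in range(nworkers)]
    with Pool(nworkers) as pool:
        return sum(pool.map(worker, chunks))

if __name__ == "__main__":
    print(parallel_sum_of_squares(list(range(1000))))
```

The same decompose/distribute/combine structure carries over directly when the workers are PVM tasks spawned across a classroom network instead of local processes.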
Until recently, only three basic methods had been used for most scientific investigations. In the first, observational science, researchers study some situation or phenomenon and then carefully document their discoveries. Examples of observational science include studying the social behavior of gorillas in various habitats, or mapping the geological formations of the Grand Canyon.
The second method is referred to as experimental science. With this technique, an experiment is designed that will provide some insight into a basic scientific principle. It is important in experimental science to have control groups for comparison, and to try to hold as many factors as possible constant in order to isolate cause and effect. Examples of experimental science include tests to determine the appropriate concentrations for a new medication, or comparative tests of airplane wing designs in a wind tunnel.
In the third method, theoretical science, a law or theory is hypothesized and then substantiated by additional research and rigorous mathematics. Examples of theoretical science include the complex equations describing fluid flow, and the familiar formula E = mc² from Einstein's theory of relativity.
A very practical application of supercomputer power is in the area of computer modeling and simulation. One distinct advantage of certain computer models is that they can be used to speed up extremely slow processes in order to predict potential outcomes in the future. Long range weather forecasting and the problems of global warming are environmental studies which are being investigated by supercomputer models. Scientists have even been able to study the effects of seemingly harmless forest management techniques that have now contributed to increased forest fire potential.
Computer simulation models are also excellent choices to study processes that happen too quickly to observe by a direct experiment. They are very useful to investigate the behavior of things that might be too small to examine by any physical means. Some examples include molecular dynamics simulations and three dimensional modeling of chemical compounds that can help scientists understand the physical properties of these structures.
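At bottom, a molecular dynamics simulation like those mentioned above repeatedly computes the forces between particles and advances their positions through tiny time steps. The toy sketch below, in Python rather than a production Fortran or C code, follows two particles on a line interacting through a Lennard-Jones potential, integrated with the velocity Verlet method; all constants are illustrative reduced units, not taken from any actual course material.

```python
# Toy molecular dynamics sketch: two unit-mass particles on a line
# interacting through a Lennard-Jones potential U(r) = 4(r^-12 - r^-6),
# advanced with the velocity Verlet integrator (reduced units).

def lj_force(r):
    """Force on the farther particle at separation r: F = -dU/dr."""
    return 24.0 * (2.0 / r**13 - 1.0 / r**7)

def simulate(steps=1000, dt=0.001):
    x1, x2 = 0.0, 1.5        # initial positions
    v1, v2 = 0.0, 0.0        # initial velocities
    f = lj_force(x2 - x1)    # force on particle 2 (particle 1 feels -f)
    for _ in range(steps):
        # velocity Verlet: half-kick, drift, recompute force, half-kick
        v1 -= 0.5 * dt * f
        v2 += 0.5 * dt * f
        x1 += dt * v1
        x2 += dt * v2
        f = lj_force(x2 - x1)
        v1 -= 0.5 * dt * f
        v2 += 0.5 * dt * f
    return x2 - x1           # final separation

if __name__ == "__main__":
    print(simulate())
```

Starting from rest at a separation of 1.5, the pair oscillates about the potential minimum near r ≈ 1.12; a real simulation scales the same loop to thousands of particles in three dimensions, which is exactly where supercomputer power is spent.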
Computer models can be used to explore situations that researchers cannot experience directly. For instance, scientists can investigate the nature of black holes using supercomputer simulations. Others have developed computer models of developing thunderstorms in order to understand the internal conditions that bring about severe weather phenomena such as tornados.
Some graphics visualization approaches may use just simple two-dimensional graphs of lines or dots on the screen, while others can display three-dimensional renderings of objects, contours, and surfaces. Advanced visualization techniques may include computer animations with realistic colors, reflections, and shading [4]. At Jefferson, we are using the OpenGL graphics library to provide graphics visualization access for our students.
Color is a very valuable tool in graphics visualization. It can be used to enhance images or provide emphasis to details that might not be readily apparent. Engineers can better view air currents and turbulence around new car designs with color enhancement. Architects can look for regions of material stress in structures that might fatigue under heavy use.
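In practice, this kind of color enhancement usually amounts to passing each data value through a colormap. A minimal sketch, assuming data normalized to [0, 1] and an invented blue-to-red gradient (not any real library's colormap):

```python
# Minimal scalar-to-color mapping of the kind used to enhance
# simulation output: low values render blue, high values red.
# The gradient itself is illustrative, not from any graphics library.

def to_rgb(value):
    """Map a normalized scalar in [0, 1] to an (r, g, b) byte triple."""
    v = min(max(value, 0.0), 1.0)              # clamp out-of-range data
    return (int(255 * v), 0, int(255 * (1.0 - v)))
```

Applied pixel by pixel to a field of stress or velocity values, a mapping like this is what makes regions of turbulence or material fatigue stand out at a glance.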
Doctors are using three-dimensional imaging to view cancerous tumors in the body which might be very difficult to discern by conventional x-rays. Using similar techniques, anthropologists can even investigate the interior of ancient mummies without ever unwrapping them.
We have been pleased to see our students use many of the tools they have learned in our elective courses as they pursue other projects and activities at Jefferson. We feel that the diversity of senior projects in the CS Lab involving computational science is a direct result of the dynamically evolving curriculum we try to offer. We are especially gratified by the large number of former students who have gone on to advanced degrees at prestigious universities, and are kind enough to keep us informed of their accomplishments. After ten years of supercomputing at TJHSST, we feel very confident that the experiment started by ETA Systems was an unqualified success.