(71e) Use of the Fundamentals of Engineering Exam As An Engineering Education Assessment Tool
The Chemical Engineering program at New Mexico State University began using the Fundamentals of Engineering (FE) Exam as an assessment tool in 2000. Over the ensuing years it was found that students in the program saw little value in taking the exam, so the sample size was always too small to provide useful assessment information. The NCEES report "Using the Fundamentals of Engineering (FE) Examination to Assess Academic Programs" by LeFevre et al. suggested a "scaled score" method for treating FE data for assessment purposes. The collected data were treated by this method, but again provided little value, as the error bars on the scaled scores were extremely large relative to the scores themselves. The Ch E faculty noted that analysis by this method assumes the results describe a population rather than a sample, and does not address whether the students taking the exam reflect that population.
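The small-sample problem described above can be illustrated with a back-of-the-envelope calculation. The sketch below is not the LeFevre et al. scaled-score procedure itself; it simply treats a topic's percent-correct as a binomial proportion (an assumption for illustration, with a hypothetical ten questions per topic) to show how the uncertainty shrinks as more students sit for the exam.

```python
import math

def standard_error(pct_correct, n_examinees, n_questions_per_topic=10):
    """Approximate standard error of a topic's mean percent-correct,
    treating each answered item as an independent binomial trial.
    (Illustrative assumption, not the NCEES/LeFevre et al. method.)"""
    n_items = n_examinees * n_questions_per_topic
    return math.sqrt(pct_correct * (1 - pct_correct) / n_items)

# With only a handful of volunteer examinees, the uncertainty in a
# topic average dwarfs the differences an assessment hopes to detect.
for n in (3, 10, 40):
    half_width = 1.96 * standard_error(0.60, n)  # ~95% CI half-width
    print(f"n={n:2d} examinees: 60% +/- {half_width:.1%}")
```

Under these assumptions, three examinees yield roughly a ±18 percentage-point confidence band around a 60% topic average, while forty examinees narrow it to about ±5 points, which is why requiring the whole senior class to sit for the exam makes the data usable.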
It is assumed that the FE exam represents a national norm capable of providing a useful assessment of the effectiveness of a program. It was thus suggested that the curriculum be modified so that taking the FE exam is a requirement for graduation. In this manner, the data would reflect the true population, and a method of treating the data to perform an assessment could be developed. Because passing the exam is not a degree requirement, concern was expressed that students would not put forth a valid effort, and a protocol was developed to address this concern. Beginning in 2007/08, students in the program were required to sit for the exam during the fall semester of the senior year, with the registration fee paid by the department. Students who do not pass must retake the FE in the spring semester at their own expense, which helps ensure that students put forth their best effort the first time. In this manner, a delay in graduation is avoided, yet the department collects more useful data.
Data have been collected by this method for several years, and analysis of the results has permitted identification of topic-specific strengths and weaknesses in the program. Results and the ensuing student support network will be discussed.