Assessment and Continuous Improvement
Student success is at the heart of SUNY Oneonta’s assessment and continuous improvement process. Collaborating with academic, student experience, and operational units across SUNY Oneonta, Institutional Assessment serves as a resource for developing and implementing assessment techniques while ensuring consistent processes campus-wide. Additionally, we provide ongoing support for accreditation and for state and federal compliance requirements.
SUNY Oneonta is accredited by the Middle States Commission on Higher Education. “Assessment of student learning and achievement demonstrates that the institution’s students have accomplished educational goals consistent with their program of study, degree level, the institution’s mission, and appropriate expectations for institutions of higher education.”* In 2020, SUNY Oneonta adopted a new Continuous Improvement Plan. This plan outlines the process by which faculty and staff serve the critical function of synthesizing assessment reports, making connections to strategic initiatives, and providing recommendations based on the assessment findings.
Our philosophy on assessment is guided by our commitment to our students and the questions we ask to ensure quality. What are our students learning? Are our services the best they can be? The questions we ask about ourselves will vary from program to program and department to department, whether they concern students learning specific content, skills, or attitudes, or issues of student motivation and students’ ability to monitor their own learning. Our assumption is that the key assessment questions are best known by the faculty and staff themselves. Finding ways to answer these questions is key to our students’ success.
Academic assessment seeks to answer the broad question, "What do our students learn, and how well do they learn what we are attempting to teach them?" As such, academic assessment is not designed to evaluate individual faculty or even individual courses. It is designed to evaluate programs as a whole and to determine where they might be strengthened in order to improve our students' learning. The primary audience for academic assessments is not administrators or accrediting agencies but the program faculty themselves.
Similarly, operational efficiency and effectiveness refer to SUNY Oneonta’s capability to deliver services to students in the most cost-effective manner possible while still ensuring high-quality products and support. This is achieved by streamlining core processes to respond more effectively to student needs. By focusing on continuous progression toward a meaningful but ambitious target, assessment methods provide feedback and guide SUNY Oneonta’s planning efforts.
Economics Program (BS), 2020/21 Cycle
Summary of Findings: The performance of students enrolled in the Senior Seminar on the National Economic Literacy Survey shows that the 2020 group of 15 students answered 87 percent of the questions correctly, with the median student answering two questions incorrectly. In 2020, our students scored perfectly on questions 1, 2, 6, and 7. The group scored noticeably lower on question 3, but similarly to the students in ECON 110. The group scored better than the national average. The 2019 group of 18 students answered 82 percent of the questions correctly, with the median student recording one question wrong. On average, our students consistently perform better than the national average.
To generate another comparison group for the Senior Seminar students, we administered the examination to the students enrolled in Principles of Economics (ECON 110) in the spring of 2010. ECON 110 is a one-semester course that combines the principles of microeconomics and the principles of macroeconomics. The course is designed for non-majors and is commonly completed by students during their freshman or sophomore year. The students in the course can be viewed as a sample of undergraduate students at the College. The institutional student profile was at a peak during this period. As the evidence shows, the students in ECON 110 performed much better on the literacy survey than the national average but considerably worse than the students in the Senior Seminar.
We did not collect data for the National Economic Literacy Survey in the spring of 2021. In the fall of 2020, the faculty unanimously decided that, starting in the spring of 2021, we would switch to the Test of Understanding College Economics (TUCE) to gauge students’ understanding of basic economic concepts (PLO1).
Our External Evaluator’s Report (from the 2017 program review) made it clear that the National Economic Literacy Survey may not be the best instrument for assessing our students’ understanding of basic economic concepts. The evaluators suggested that we consider using either the Test of Understanding College Economics (TUCE) from the National Council on Economic Education or the field examination from the Educational Testing Service. Unfortunately, both of these examinations are costly to administer, and the discretionary budget for the Department of Economics, Finance and Accounting has been insufficient to cover either. Alternatively, the external evaluators suggested that we develop our own test. Finding a cost-effective replacement for the National Economic Literacy Survey was a goal of the faculty. In the spring of 2018, we used 20 questions from the TUCE for the first time. The test consists of ten macroeconomics questions and ten microeconomics questions. We have been collecting data on the selected TUCE questions since 2018. Overall, our students in the Senior Seminar scored higher than the national average. In 2020, our students scored about nine percent higher than the national average on the macroeconomics questions and eleven percent higher on the microeconomics questions.
Since 2018, we have used both the National Economic Literacy Survey and the 20 questions from the TUCE to assess our students' understanding of economic concepts in the Senior Seminar class. After three years of collecting data from both instruments, the faculty discussed the results in the fall of 2020. The faculty unanimously recommended discontinuing the National Economic Literacy Survey for measuring PLO1 (Understanding of Basic Economic Concepts) effective spring 2021, for the following reasons. The TUCE allows the faculty to better understand our students' learning outcomes in basic economic concepts. Its questions enable us to compare our students' performance with the national average for a similar cohort: college students who have completed a course in economics. In contrast, the National Economic Literacy Survey's national benchmark was drawn from the general population in 1998, not from college students or college graduates. The TUCE questions serve as a better assessment instrument, containing content at a level similar to that of the ETS Major Field Test in Business.
Results: PLO Assessment Results: Meeting/Exceeding
Improvement/Growth Opportunities: In the fall of 2020, the faculty discussed the lower outcome for question 12 of the TUCE economics questions. After carefully reexamining the question, the faculty concluded that the topic had been extensively covered in the microeconomics courses. The faculty decided to include a short review after students complete the test to understand their thought process in answering the question and to identify possible remedies. Results for the spring of 2021 are shown in this report and will be discussed at the start of the fall 2021 semester.