Test Item Analysis of Engineering Economy Subjected to Table of Specification: Input for Digital Test Bank and E-Class record for the College of Engineering
DOI: https://doi.org/10.64807/natf6e49
Abstract
This study aimed to analyze a 50-item multiple-choice test in Engineering Economy by subjecting it to test item analysis against a Table of Specifications (TOS). The framework of the paper is derived from Bloom's taxonomy of learning, which comprises Knowledge, Comprehension, Application, Analysis, Synthesis, and Evaluation. The analysis of the 50-item test was anchored on the TOS, specifically: (1) the list of course objectives; (2) the topics covered in class; (3) the amount of time spent on those topics; (4) the textbook chapter topics; and (5) the emphasis and space given to each topic. The statistical tools used were measures of central tendency and variability, namely the mean and standard deviation of students' scores in the final examination; these were complemented by a graphical analysis of the scores and by hypothesis testing to determine whether a significant difference exists between students' academic performance in the final examination and their final grades. The hypothesis testing was done at a 5% level of significance with the corresponding degrees of freedom. The study also employed a review of related studies and literature to categorize, describe, analyze, and interpret materials borrowed from local and international authors, following the CDAI (Categorizing, Describing, Analyzing, and Interpreting) method originally developed by Johnny Saldana (2019). The end goal of the study was to develop a relevant and effective mechanism for a digital LMS intended as a test banking system and as a metric for the grading system. The study was conducted at Quezon City University's main campus during school year 2023-2024.
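To make the computations concrete, the following is a minimal sketch in Python (using NumPy and SciPy) of the statistics the abstract describes: the mean and standard deviation of final-examination scores, classical item difficulty and discrimination indices, and a paired t-test at the 5% significance level comparing examination performance with final grades. The data, sample size, and normalization below are hypothetical placeholders, not values or procedures taken from the study.

    # Sketch of the abstract's statistics on hypothetical data; all numbers
    # and variable names are illustrative assumptions, not study data.
    import numpy as np
    from scipy.stats import ttest_rel

    # Hypothetical data: each row is a student, each column one of the 50
    # items (1 = correct, 0 = incorrect), plus stand-in final grades.
    rng = np.random.default_rng(0)
    responses = rng.integers(0, 2, size=(40, 50))          # 40 students x 50 items
    exam_scores = responses.sum(axis=1)                     # final-exam raw scores
    final_grades = exam_scores * 2 + rng.normal(0, 5, 40)   # placeholder final grades

    # Measures of central tendency and variability for the final examination.
    mean_score = exam_scores.mean()
    std_score = exam_scores.std(ddof=1)

    # Classical item analysis: difficulty index (proportion correct) and a
    # discrimination index from the upper and lower 27% of examinees.
    order = exam_scores.argsort()
    k = max(1, int(round(0.27 * len(exam_scores))))
    lower, upper = order[:k], order[-k:]
    difficulty = responses.mean(axis=0)
    discrimination = responses[upper].mean(axis=0) - responses[lower].mean(axis=0)

    # Hypothesis test at the 5% level: is there a significant difference
    # between final-exam performance and final grades (both as percentages)?
    exam_pct = exam_scores / 50 * 100
    grade_pct = final_grades / final_grades.max() * 100     # crude normalization
    t_stat, p_value = ttest_rel(exam_pct, grade_pct)

    print(f"mean = {mean_score:.2f}, sd = {std_score:.2f}")
    print(f"item 1: difficulty = {difficulty[0]:.2f}, "
          f"discrimination = {discrimination[0]:.2f}")
    print(f"paired t-test: t = {t_stat:.2f}, p = {p_value:.4f}, "
          f"{'significant' if p_value < 0.05 else 'not significant'} at alpha = 0.05")

In practice, the response matrix would come from the scored answer sheets mapped to the TOS cells and the final grades from the class record; the snippet only illustrates the form of the calculations.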
Keywords:
TOS, Test Item, Grading System, Class Record, Bloom's Taxonomy
References
Bichi. (2015). Item Analysis Using Derived Science Achievement Test Data. International Journal of Science and Research (IJSR), 4(5).
Camba, et al. (2021). Does this OBE Count: Test Development and Item Analysis in High School Dressmaking Course.
Cooper, et al. (n.d.). Using Reliability, Validity, and Item Analysis to Evaluate a Teacher-Developed Test in International Business.
Dichoso, et al. (2020). Test Item Analyzer Using Point-Biserial Correlation and P-Values in Philippine Schools. International Journal of Scientific & Technology Research, 9(04).
Gnaldi. (2013). Methods of Item Analysis in Standardized Student Assessment: an Application to an Italian Case Study. ResearchGate.
Ignacio. (2008). An Analysis of Test Item Pool in Selected Subjects in the College of Business Education. TIP Research Journal Quezon City, 5(1).
Jugar. (2013). An Inquiry on the Roles of Personal Test Item Banking (PTIB) and Table of Specifications (TOS) in the Construction and Utilization of Classroom Tests. International Journal of Education and Research, 1(12).
Lahza, et al. (2022). Beyond item analysis: Connecting student behavior and performance using e-assessment logs. British Journal of Educational Technology.
Lloyd, et al. (2021). Electricity Concepts' Test Construction, Validation, and Item Analysis for Senior High School General Physics 2 in STEM Students in a Public National High School in Pangasinan, Philippines.
Mamolo. (2021). Development of an Achievement Test to Measure Students’ Competency in General Mathematics. Anatolian Journal of Education, 6(1).
Marie, et al. (2015). Relevance of Item Analysis in Standardizing an Achievement Test in Teaching of Physical Science in B.Ed Syllabus. i-manager's Journal of Educational Technology, 12(3).
McCowan, et al. (1999). Item Analysis for Criterion Referenced Tests. Research Foundation of SUNY/Center for Development of Human Services.
Olaso. (2019). Test Construction and Item Analysis including Rubrics Development at G10-Antipolo’s room on July 1-5, 2019.
Ole, et al. (2021). Development and Validation of a Physics Concept Test in Kinematics for Senior High School Students. IOER International Multidisciplinary Research Journal (IIMRJ).
Orongan. (2020). Reliability Analysis on Teachers’ Quarterly Classroom Assessment in Basic Education. LDCU-REPI Asian Scientific Journals, 16(1).
Patrimonio. (2017). The ETest Builder: A Renaissance in Educational Assessment Practice.
Perkins, et al. (2018). An Item Analysis and a Reliability Estimate of a Classroom Kinesiology Achievement Test. Research Foundation of SUNY/Center for Development of Human Services.
Sabri, et al. (2013). Item Analysis of Student Comprehensive Test for Research in Teaching Beginner String Ensemble Using Model-Based Teaching Among Music Students in Public Universities. Semantic Scholar.
Sorby, et al. (2013). The Development and Assessment of a Course for Enhancing the 3-D Spatial Visualization Skills of First-Year Engineering Students. Research Journal of Engineering Education.
Tan, et al. (2019). Development of Valid and Reliable Teacher-made Tests for Grade 10 Mathematics. Academia.edu.
Alade, et al. (2014). Table of Specification and Its Relevance in Educational Development Assessment. European Centre for Research Training and Development UK, 2(1), 1-17.
Anunaobi, et al. (2022). Relevance of Table of Specification in Educational Assessment. International Journal of Innovative Social Sciences & Humanities Research, 10(1), 73-80.
Ballado. (2014). Development and Validation of a Teacher Education Aptitude Test. International Journal of Interdisciplinary Research and Innovations, 2(4), 129-133.
Bautista, et al. (2019). Construct Validity and Difficulty Index of Departmentalized Reading Comprehension Test for Grade 11 Students. Asian EFL Journal Research Articles, 23(3.3).
Cagape. (2009). Ethical Standards in Tests: Test Preparation and Administration in Philippine Education, A Study in Zamboanga Del Sur. Scribd.
Dela Rama. (2011). The Conformity of Test Construction of the Achievement Test Papers of College Teachers: A Case Study. Silliman Journal, 52(2).
Doctor. (2017). Integrated Educational Management Tool for Adamson University. International Journal of Computing Sciences Research, 1(1), 52-71.
Doctor, et al. (2019). Development and Acceptability of an Integrated Item Analysis Application: An Enhancement to Adamson University Integrated Educational Management Tool. Journal of the World Federation of Associations of Teacher Education.
Fives, et al. (2013). Classroom Test Construction: The Power of a Table of Specifications. Montclair State University.
Gochyyev, et al. (2023). Item Analysis. Sage Research Methods.
Lasaten. (2016). Assessment Methods, Problems and Training Needs of Public High School Teachers in English. International Journal of Languages, Literature and Linguistics, 2(2).
Odukoya, et al. (2017). Item analysis of university-wide multiple choice objective examinations: the experience of a Nigerian private university. National Library of Medicine, National Center for Biotechnology Information.
Quaigrain, et al. (2017). Using reliability and item analysis to evaluate a teacher-developed test in educational measurement and evaluation. Taylor and Francis Online Cogent Education, 4(1).
Rezigalla. (2022). Item Analysis: Concept and Application. In M.S. Firstenberg & S.P. Stawicki (Eds.), Medical Education for the 21st Century.
Roleda, et al. (2018). The Effect of Language in Students’ Performance in FCI. Presented at the DLSU Research Congress 2018.
Siena College. (2023). Importance of Table of Specification in Constructing Test Items.
Silao, et al. (2021). Development of an Automated Test Item Analysis System with Optical Mark Recognition (OMR). International Journal of Electrical Engineering and Technology (IJEET), 12(1), 67-79.
Siri, et al. (2011). The Use of Item Analysis for the Improvement of Objective Examinations. ResearchGate.
Tamayao, et al. (2020). Design and Validation of the College Readiness Test (CRT) for Filipino K to 12 Graduates. International Journal of Higher Education, 9(2).
Tardeo, et al. (2015). Assessment of Physics Qualifying Examination for EE and ECE Courses: An Item and Option Analysis. LPU-Laguna Journal of Multidisciplinary Research, 4(3).
Thirakunkovit. (2016). An Evaluation of a Post-Entry Test: An Item Analysis Using Classical Test Theory (CTT). Purdue University Purdue e-Pubs.
Yahia. (2021). Post-validation item analysis to assess the validity and reliability of multiple-choice questions at a medical college with an innovative curriculum. Natl Med J India.
Snyder. (2019). Literature review as a research methodology: An overview and guidelines. Journal of Business Research, 104, 333-339.
Copyright (c) 2025 QCU The Star

This work is licensed under a Creative Commons Attribution 4.0 International License.
All articles published in QCU Journals are made available under the Creative Commons Attribution 4.0 International License (CC BY 4.0).
This license allows for:
- Sharing – copying and redistributing the material in any medium or format.
- Adapting – remixing, transforming, and building upon the material for any purpose, including commercial use.


