
QS Global News Just Published Our Recent IEEE Publication

Saudi Arabia – Wajid Hussain, Director of the Office of Quality and Accreditation at the Faculty of Engineering, Islamic University; William Spady, CEO of IN4OBE; and Lindsey Conner, Professor in the College of Education, Psychology and Social Work at Flinders University, have published a research article on guidelines for remote accreditation post-COVID-19 in IEEE Access.

The research article provides authentic theoretical, conceptual, and practical frameworks that offer accreditation agencies, educational institutions, and engineering programs detailed guidelines for conducting credible remote program evaluations during and after the COVID-19 pandemic.

A novel meta-framework is used to qualify digital Integrated Quality Management Systems for three engineering programs seeking accreditation. These digital quality systems utilize authentic OBE frameworks and assessment methodology to automate the collection, evaluation, and reporting of precision CQI data.

A novel Remote Evaluator Module that enables successful virtual ABET accreditation audits is presented in detail.

A theory-based mixed-methods approach is applied for the evaluations. Detailed results and discussions show how the various phases of the meta-framework help to qualify the context, construct, causal links, processes, technology, data collection, and outcomes of comprehensive CQI efforts.


Industrial Training Courses: A Challenge during the COVID19 Pandemic

Abstract:

Industrial training courses require students to gain sufficient practical engineering experience that confirms theoretical knowledge by application to field work. The courses expose students to real-life engineering activity involving problem solving, design, experimentation and manufacturing. Students are introduced to entrepreneurship, diverse collaborative work environments and quality systems that instill world-class safety standards and professional ethics. Preventive measures and lockdowns during prolonged pandemic conditions have severely limited students’ capability for in-person participation in onsite industrial training programs, thereby adversely affecting the scope of training courses. This paper presents some plausible solutions to challenges faced by both instructors and students in fulfilling essential outcomes for remote offerings of industrial training courses during the COVID19 pandemic. Essential aspects of an outcome-based digital platform used for remote management, assessment and evaluation of industrial training courses are presented. A course template that facilitates virtual engineering roles as a viable alternative to students’ in-person participation in industry settings is explained. This study compares two course models offered prior to and during pandemic conditions for fulfillment of course outcomes, makes observations about required skills, knowledge and related deficiencies, and offers recommendations to help engineering programs enhance student learning in remotely offered industrial training courses.
Published in: 2020 IEEE International Conference on Teaching, Assessment, and Learning for Engineering (TALE)
Date of Conference: 8-11 Dec. 2020
DOI: 10.1109/TALE48869.2020.9368455
Date Added to IEEE Xplore: 08 March 2021
Conference Location: Takamatsu, Japan
Publisher: IEEE
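
As a rough illustration of the course-model comparison described in the abstract above, the sketch below contrasts per-outcome attainment between an in-person offering and a remote offering. The outcome labels, attainment percentages and the 70% threshold are hypothetical placeholders, not values reported in the paper.

```python
# Hypothetical comparison of course-outcome (CO) attainment between the
# in-person (pre-pandemic) and remote (pandemic) course models discussed in
# the abstract. All labels, percentages and the threshold are illustrative
# assumptions only.

in_person = {"CO1": 82.0, "CO2": 76.5, "CO3": 71.0, "CO4": 68.0}  # % attainment
remote    = {"CO1": 79.0, "CO2": 74.0, "CO3": 61.5, "CO4": 55.0}  # % attainment

ATTAINMENT_THRESHOLD = 70.0  # assumed minimum acceptable attainment

def compare_models(model_a, model_b, threshold):
    """Print the per-outcome change and flag outcomes falling below the threshold."""
    for outcome in sorted(model_a):
        a, b = model_a[outcome], model_b[outcome]
        status = "deficient" if b < threshold else "ok"
        print(f"{outcome}: in-person {a:5.1f}%  remote {b:5.1f}%  "
              f"change {b - a:+5.1f} pts  [{status}]")

compare_models(in_person, remote, ATTAINMENT_THRESHOLD)
```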

Impact Evaluations of Engineering Programs Using ABET Student Outcomes

Abstract:

Engineering programs worldwide collect and report student learning outcomes data to conduct program evaluations for quality assurance and accreditation purposes. Accreditation agencies such as ABET typically mandate that at least two years of program evaluation data be provided and that institutions show how this data has been used for continuous quality improvement. Engineering programs rarely evaluate interventions using multi-term student outcomes information over several years, since this quantitative data generally lacks accuracy and statistical power. The quality of outcomes data is affected by obsolete assessment methods and a lack of digital access and technical analysis. In this study, we present essential elements of an authentic outcome-based assessment model that used web-based software and embedded assessment technology to collect and report accurate cohort outcomes for credible multi-term evaluations. A non-experimental approach employing regression analyses was used to identify trends in student outcomes and evaluate impact for three engineering programs. Detailed rubrics provide criteria to accurately classify multi-year student outcomes. The findings of this study present practical steps for engineering programs to effectively collect and report accurate cohort outcomes data and perform credible evaluations of program interventions based on multi-year outcomes data.
Published in: IEEE Access (Volume: 9)
Page(s): 46166 – 46190
DOI: 10.1109/ACCESS.2021.3066921
Date of Publication: 17 March 2021 
Electronic ISSN: 2169-3536
Publisher: IEEE
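
To make the regression step concrete, here is a minimal sketch, assuming fabricated multi-term attainment values for a single ABET Student Outcome; it fits a simple linear trend of the kind a non-experimental evaluation might report. None of the numbers come from the study.

```python
# Illustrative trend analysis of multi-term cohort attainment for one ABET
# Student Outcome. The term indices and attainment values are fabricated
# placeholders, not data from the study.
import numpy as np

terms = np.arange(1, 9)  # eight consecutive academic terms
so_attainment = np.array([62.0, 64.5, 63.8, 66.2, 68.0, 67.4, 70.1, 71.3])  # % attainment

# Ordinary least-squares fit: attainment = slope * term + intercept
slope, intercept = np.polyfit(terms, so_attainment, deg=1)
predicted = slope * terms + intercept

# R^2 as a rough indicator of how much term-to-term variation the trend explains
ss_res = np.sum((so_attainment - predicted) ** 2)
ss_tot = np.sum((so_attainment - so_attainment.mean()) ** 2)
r_squared = 1.0 - ss_res / ss_tot

print(f"Trend: {slope:+.2f} percentage points per term, R^2 = {r_squared:.2f}")
```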

Quality Improvement With Automated Engineering Program Evaluations

Abstract:

In this paper, we present examples of quality improvement efforts to enhance student learning in engineering education by employing a novel program evaluation methodology that automates ABET Student Outcomes (SOs) data measurement and analysis based on the classification of specific performance indicators per Bloom’s 3 domains and their learning levels. The learning levels are further categorized based on a 3-Level Skills Grouping Methodology that groups together learning levels of related proficiency. Program evaluations use aggregate values of ABET SOs as an overall performance index. These values are calculated by assigning weights to measured specific performance indicators according to the Frequency-Hierarchy Weighting-Factors Scheme, which incorporates a hierarchy of measured skills, the course levels in which they are measured, and counts of the assessments implemented for their measurement. The number of assessments processed for measurement of performance indicators associated with the 3 categories of skills in multiple course levels is counted to calculate the percentage learning distribution across the elementary, intermediate and advanced levels for the 3 learning domains. Learning distributions obtained for measured ABET SOs are compared to ideal models to verify standards of achievement for required types of skills and proficiency levels, and to align engineering curriculum delivery to attain the highest levels of holistic learning.
Published in: 2016 IEEE Frontiers in Education Conference (FIE)
INSPEC Accession Number: 16505014
Date of Conference: 12-15 Oct. 2016
Date Added to IEEE Xplore: 01 December 2016
DOI: 10.1109/FIE.2016.7757418
Conference Location: Erie, PA, USA
Publisher: IEEE
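
The aggregation described in the abstract above can be pictured with a small sketch. The weighting factors, skills-group labels, course levels and indicator scores below are illustrative assumptions only; the paper's Frequency-Hierarchy Weighting-Factors Scheme defines its own values.

```python
# Minimal sketch in the spirit of the abstract's weighted aggregation of
# performance indicators into an ABET SO value, plus a learning-distribution
# count. All weights and scores are assumed for illustration.
from collections import defaultdict

# Each measured performance indicator: score (%), skills group, course level,
# and the number of assessments processed for its measurement.
indicators = [
    {"score": 78.0, "group": "elementary",   "course_level": 200, "assessments": 6},
    {"score": 71.0, "group": "intermediate", "course_level": 300, "assessments": 4},
    {"score": 66.0, "group": "advanced",     "course_level": 400, "assessments": 3},
    {"score": 83.0, "group": "elementary",   "course_level": 100, "assessments": 5},
]

# Assumed hierarchy weights: higher skills groups and course levels count more.
GROUP_WEIGHT = {"elementary": 1.0, "intermediate": 2.0, "advanced": 3.0}
LEVEL_WEIGHT = {100: 1.0, 200: 1.5, 300: 2.0, 400: 2.5}

def aggregate_so(items):
    """Weighted mean of indicator scores; weight = group x course level x assessment count."""
    weights = [GROUP_WEIGHT[i["group"]] * LEVEL_WEIGHT[i["course_level"]] * i["assessments"]
               for i in items]
    return sum(w * i["score"] for w, i in zip(weights, items)) / sum(weights)

def learning_distribution(items):
    """Percentage of processed assessments falling in each skills group."""
    counts = defaultdict(int)
    for i in items:
        counts[i["group"]] += i["assessments"]
    total = sum(counts.values())
    return {group: 100.0 * count / total for group, count in counts.items()}

print(f"Aggregate SO value: {aggregate_so(indicators):.1f}%")
print("Learning distribution:", learning_distribution(indicators))
```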

ABET Accreditation During and After COVID19 – Navigating the Digital Age

Abstract:

Engineering accreditation agencies and governmental educational bodies worldwide require programs to evaluate specific learning outcomes information for attainment of student learning and to establish accountability. Ranking and accreditation have resulted in programs adopting shortcut approaches to collate cohort information with minimally acceptable rigor for Continuous Quality Improvement (CQI). With tens of thousands of engineering programs seeking accreditation, qualifying program evaluations that are based on reliable and accurate cohort outcomes is becoming increasingly complex and high stakes. Manual data collection processes and vague performance criteria assimilate inaccurate or insufficient learning outcomes information that cannot be used for effective CQI. Additionally, due to the COVID19 global pandemic, many accreditation bodies have cancelled onsite visits and either deferred or announced virtual audit visits for upcoming accreditation cycles. In this study, we examine a novel meta-framework to qualify state-of-the-art digital Integrated Quality Management Systems for three engineering programs seeking accreditation. The digital quality systems utilize authentic OBE frameworks and assessment methodology to automate collection, evaluation and reporting of precision CQI data. A novel Remote Evaluator Module that enables successful virtual ABET accreditation audits is presented. A theory-based mixed-methods approach is applied for evaluations. Detailed results and discussions show how various phases of the meta-framework help to qualify the context, construct, causal links, processes, technology, data collection and outcomes of comprehensive CQI efforts. Key stakeholders such as accreditation agencies and universities can adopt this multi-dimensional approach for employing a holistic meta-framework to achieve accurate and credible remote accreditation of engineering programs.
 
Published in: IEEE Access (Volume: 8)
Page(s): 218997 – 219046
Date of Publication: 01 December 2020
Electronic ISSN: 2169-3536
INSPEC Accession Number: 20190964
DOI: 10.1109/ACCESS.2020.3041736
Publisher: IEEE
 
Theoretical, conceptual and practical frameworks based on social science, evaluation and program theory for Mixed Methods Theory-Based Impact Evaluations (MMTBIEs) to achieve credible remote ABET accreditation of digital Integrated Quality Management Systems (IQMS) consisting of 6 Plan-Do-Check-Act (PDCA) quality cycles.
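
As a loose illustration of the structure named in the caption, the sketch below models an IQMS as six PDCA quality cycles. The cycle names and fields are hypothetical placeholders rather than the specific cycles described in the article.

```python
# Hypothetical representation of an Integrated Quality Management System (IQMS)
# as six Plan-Do-Check-Act (PDCA) quality cycles, as mentioned in the caption.
# Cycle names and fields are placeholders, not the article's actual design.
from dataclasses import dataclass, field

PDCA_PHASES = ("Plan", "Do", "Check", "Act")

@dataclass
class QualityCycle:
    name: str
    current_phase: str = "Plan"
    evidence: list = field(default_factory=list)  # e.g., links to CQI reports

    def advance(self):
        """Move to the next PDCA phase, wrapping from Act back to Plan."""
        idx = PDCA_PHASES.index(self.current_phase)
        self.current_phase = PDCA_PHASES[(idx + 1) % len(PDCA_PHASES)]

# Six cycles, per the caption; the numbering here is purely illustrative.
iqms_cycles = [QualityCycle(f"Quality cycle {n}") for n in range(1, 7)]
iqms_cycles[0].advance()
print([(c.name, c.current_phase) for c in iqms_cycles])
```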