International Journal of Educational Methodology

IJEM is a leading, peer-reviewed, open-access research journal that provides an online forum for studies in education, by and for scholars and practitioners, worldwide.


Publisher (HQ)

RHAPSODE
Eurasian Society of Educational Research
College House, 2nd Floor, 17 King Edwards Road, Ruislip, London, HA4 7AE, UK
Research Article

The Ordinal Priority Approach for Supporting Teacher Collaboration in Assessment Decisions

Tomomi Kubota



  • Pub. date: October 15, 2025
  • Online Pub. date: October 14, 2025
  • Pages: 513-525

Abstract:


These days, many schools are reviewing their curricula, and Science, Technology, Engineering, Arts, and Mathematics (STEAM) education is one area where these changes are being applied. Because STEAM education integrates five subjects, it requires an approach in which teachers from these subjects work collaboratively. However, applying traditional assessment methods in STEAM education is challenging, as it requires teachers to jointly decide on appropriate assessment strategies. At present, no clear framework exists to support this process. This study examined the potential of the ordinal priority approach (OPA), a recently introduced method for multi-criteria decision-making, to facilitate teachers’ collaborative selection of assessment methods for STEAM education. It further explored the extent to which subject differences affect collaboration by comparing the decision-making of two groups: a homogeneous group (teachers of the same subject) and a heterogeneous group (teachers of different subjects). Pre- and post-questionnaires were administered to both groups to determine how the OPA can assist teachers in jointly developing a STEAM assessment method. Analyses of the responses identified differences in each group’s consensus-building process. The study revealed three key contributions of OPA to teacher collaboration in STEAM education: 1) it ensures that teachers from diverse subjects have their opinions considered; 2) its transparent decision-making process helps mitigate distrust during discussions; and 3) it promotes fair decision-making, unaffected by social power differences within the group. Based on these findings, OPA appears effective in converging diverse expert opinions through a clear decision-making process.

Keywords: Assessment methods, group decision-making, ordinal priority approach, STEAM education, teacher collaboration.


Introduction

Prompted by recent advances in science and technology, as well as societal changes brought about by globalization, educators worldwide are discussing new perspectives on education. As a result, many schools have begun reviewing their curricula to better support students in this evolving environment. One significant outcome of these changes is the adoption of the Science, Technology, Engineering, Arts, and Mathematics (STEAM) education framework (Quigley et al., 2020), which integrates the foregoing into a unified curriculum (Hom & Dobrijevic, 2022). According to Boice et al. (2021), STEAM “involves utilizing student-centered instructional pedagogies, including [problem-based inquiry learning], group learning, and real-world application, to increase cross-disciplinary content knowledge through learning goals for students in both Science, Technology, Engineering, and Mathematics (STEM) and arts disciplines” (p. 5). Through STEAM education, students can acquire essential 21st-century knowledge and skills, such as critical thinking, creativity, problem-solving, innovation, communication, cooperation, and entrepreneurship (Jolly, 2014). Alongside changes to traditional curricula, these developments have necessitated professional development to prepare teachers for integrating lesson plans (Belbase et al., 2022). Because STEAM education encompasses five distinct subjects, it necessitates a collaborative team approach in which teachers from various disciplines work together (Asghar et al., 2012).

The Ordinal Priority Approach to Group Decision-Making

When integrating opinions within a team, common methods include majority voting or the Borda count. Yet, decisions involving multiple criteria should be shaped directly by these criteria. Various methods have been developed to address the challenges of multi-criteria decision-making (MCDM), which involves selecting the best alternative from a set of options based on criteria provided by multiple experts (Kumar et al., 2017). Among these, the newest is the ordinal priority approach (OPA), developed by Ataei et al. (2020). OPA addresses MCDM problems through a relationship-based approach, whereby ordinal numbers are assigned to criteria, alternatives, and experts’ opinions according to their relative importance (Ataei et al., 2020). It requires only simple input data and is especially suitable for ranking items by importance, as it does not require quantitative numerical data. Moreover, it avoids the need for scaling data, averaging, or pairwise comparison matrices (Penadés-Plà et al., 2016) because it relies solely on ordinal data (Mahmoudi et al., 2021). Additionally, OPA allows for a clearer determination of the dominance among items than using quantitative preferences or exact ratios (Sadeghi et al., 2023). Its flexible information processing also makes it suitable for implementation under dynamic and ambiguous conditions (Ond et al., 2023).
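For reference, the core of OPA is a single linear program (as published by Ataei et al., 2020; notation here may differ slightly from the formulae given in Appendix B). With experts, criteria, and alternatives indexed by their ranks $i$, $j$, and $k$, and $m$ alternatives in total, the model is:

```latex
\begin{align}
\max\quad & Z \\
\text{s.t.}\quad
& Z \le i \, j \, k \left( W_{ijk} - W_{i,j,k+1} \right)
    && \forall i, j;\; k = 1, \dots, m-1 \\
& Z \le i \, j \, m \, W_{ijm}
    && \forall i, j \\
& \sum_{i} \sum_{j} \sum_{k} W_{ijk} = 1,
    \qquad W_{ijk} \ge 0
\end{align}
```

After solving, the weight of alternative $k$ is $\sum_i \sum_j W_{ijk}$; expert and criterion weights are obtained analogously by summing over the other two indices. Because every coefficient is just a product of ranks, only ordinal input is ever required.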

Teacher Collaboration for STEAM Education

Many educators believe that teacher collaboration is fundamental to delivering effective interdisciplinary education (Herro & Quigley, 2016; Margot & Kettler, 2019). Indeed, Reeves et al. (2017) analyzed data from the Trends in International Mathematics and Science Study and identified teacher collaboration as a significant predictor of student achievement. Teachers working within STEAM education require time to coordinate and integrate their respective subjects (Margot & Kettler, 2019). However, Park et al. (2016) found that most teachers reported insufficient time and concerns about increased workload when implementing STEAM education. Given its simplicity, OPA requires no specialized training, making it easy for teachers to adopt and potentially minimizing redundant discussions.

Although teachers acknowledge the need for cross-disciplinary education in STEAM (Herro & Quigley, 2016), collaboration frequently falls short of expectations (Vangrieken et al., 2015). Limitations in communication between teachers from different subjects have also been noted (Al Salami et al., 2017). Differences in teachers’ perceptions of subject integration can result in varied approaches (Wang et al., 2011) and concerns about collaborative lesson planning (Asghar et al., 2012) or following others’ lesson plans (Bagiati & Evangelou, 2015). Misunderstandings among teachers from different subjects can jeopardize the success of an interdisciplinary curriculum, and conflicting subject knowledge may lead teachers to challenge one another’s approaches, potentially making collaboration more difficult or even impossible (Costantino, 2017). One function of OPA in group decision-making is identifying unreliable experts and inappropriate criteria, which helps minimize sources of distrust (Mahmoudi & Javed, 2022).

STEAM Assessment Challenges

Additionally, when implementing STEAM education, teachers must consider factors such as lesson time, student level, task consistency, progress monitoring, technology use, and assessment methods (Herro et al., 2019). The choice of assessment methods varies based on the task, the competencies and skills to be evaluated, and the projects students undertake (Belbase et al., 2022). While traditional assessment methods are inadequate due to STEAM’s interdisciplinary nature (Dubek et al., 2021), the literature lacks clarity on how students’ learning across STEAM subjects at various levels should be assessed, from basic understanding to advanced application (Belbase et al., 2022). There is also limited guidance for teachers on integrating and assessing these diverse competencies and skills (Glisic & Favaro, 2019). Effective assessment must capture the unique aspects of STEAM education. In this regard, OPA offers a promising solution, as it can rank options by accommodating diverse criteria (Kadaei et al., 2023) and can also be applied in fuzzy or ambiguous environments to address real-world uncertainty (Kadaei et al., 2023; Mahmoudi & Javed, 2022).

Group Decision-Making for STEAM Assessment Using OPA

Given OPA’s utility, as explored above, it has the potential to address challenges in teacher collaboration and STEAM assessment. However, the use of structured decision-making tools in education is largely limited to evaluating programs or selecting teachers in higher education. While research on teacher collaboration in STEAM exists, such as studies of teacher training programs (Boice et al., 2021) and knowledge management approaches (Wu, 2022), none provides a specific group decision-making methodology. This study, therefore, proposes the use of OPA to support teacher collaboration in determining assessment methods in STEAM education. Two research questions (RQs) guide this study:

RQ1: How does the degree of inter-subject differences affect teacher collaboration in STEAM education?

RQ2: How can OPA help eliminate barriers teachers face when collaborating to formulate STEAM assessments?

Methodology

To rank assessment methods in STEAM education, this study recruited five teachers (hereafter referred to as “experts”) from a single high school renowned locally for its extensive STEAM program. These teachers were selected because they are the primary instructors for STEAM education at the school. As this is an exploratory study, the limited generalizability of the findings is acknowledged.

At this school, students work in small groups to explore topics of interest, with teachers assisting with experiments and analyses. Students are given opportunities to present their results at a learning presentation event attended by other grades and invited teachers from other schools. The experts recruited for this study were core members of the school’s STEAM program, with lesson planning typically carried out by two or three teachers working together. The experts completed a preliminary questionnaire, participated in the OPA task, and subsequently completed a post-questionnaire.

Pre-questionnaire

The five experts completed the pre-questionnaire the day before the OPA task. This questionnaire investigated how teachers collaborate with colleagues in planning STEAM assessments to understand the current state of such collaboration. Responses were collected using various formats, including a four-point Likert-type scale, binary yes/no items, and open-ended questions for aspects requiring more detailed exploration (see Appendix A).

Ordinal Priority Approach Task

The OPA task was conducted in November 2024. The subjects taught by the five experts are listed in Table 1. The experts were divided into two groups to match the typical number of teachers involved in lesson planning and to facilitate comparison between decision-making in a homogeneous group (same subject; HoGr) and a heterogeneous group (different subjects; HeGr). The time taken to reach decisions was recorded to provide insight into the widely cited challenge of time constraints in implementing STEAM education. In addition, the decision-making process was recorded, and the participants’ comments were transcribed.

Table 1. Expert Information

Group Expert Subject Experience (years)
HoGr 1 Science (geology) 15
  2 Science (chemistry) 10
HeGr 3 Life science 29
  4 Informatics 20
  5 Mathematics 24

The OPA task followed the eight steps of the OPA process (Table 2). The components used in the OPA ranking calculation (Table 2, Step 7) are presented with formulae (1)–(4) in Table 3 (Appendix B).

Table 2. The Eight-Step OPA Process

Step Process Content
1 Identify a goal. The teachers aim to determine an assessment method.
2 Decide on the experts to participate in the decision-making and rank them according to their experience and knowledge. Identify the teachers and rank them based on their position (previously decided during discussion).
3 Identify alternatives. The alternatives are various assessment methods. Experts discuss these and write down their preferred alternatives on a blank sheet of paper.
4 Identify the important criteria relevant to the goal. The teachers determine the competencies and skills they wish to assess.
5 Rank the criteria according to their perceived importance. Each teacher writes their rankings down on paper.
6 Rank the alternatives according to their relevance to the criteria.  
7 Create a linear programming model using the components and execute it. Teachers’ ranking data are entered into OPA Solver v1.
8 Check the results. Share the results with the five teachers.
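To make Step 7 concrete, the following is a minimal sketch of the OPA linear program using `scipy.optimize.linprog` rather than OPA Solver v1 (which the study actually used). The function name `opa_weights`, its input layout, and the toy rankings below are illustrative assumptions, not taken from the paper:

```python
import numpy as np
from scipy.optimize import linprog

def opa_weights(expert_ranks, criterion_ranks, alt_ranks):
    """Solve the OPA linear program (Ataei et al., 2020) and return
    alternative weights (illustrative sketch, not OPA Solver v1).

    expert_ranks[i]       : rank of expert i (1 = highest priority)
    criterion_ranks[i][j] : rank expert i assigns to criterion j
    alt_ranks[i][j][k]    : rank expert i assigns to alternative k
                            under criterion j
    """
    p = len(expert_ranks)               # number of experts
    n = len(criterion_ranks[0])         # number of criteria
    m = len(alt_ranks[0][0])            # number of alternatives
    nvar = p * n * m + 1                # all W_ijk plus Z (last variable)
    idx = lambda i, j, k: (i * n + j) * m + k

    A_ub, b_ub = [], []
    for i in range(p):
        for j in range(n):
            ri, rj = expert_ranks[i], criterion_ranks[i][j]
            # alternatives sorted by the rank this expert assigned them
            order = sorted(range(m), key=lambda k: alt_ranks[i][j][k])
            for r in range(m - 1):
                # Z <= ri*rj*(r+1) * (W_at_rank_r - W_at_rank_r+1)
                row = np.zeros(nvar)
                row[-1] = 1.0
                c = ri * rj * (r + 1)
                row[idx(i, j, order[r])] -= c
                row[idx(i, j, order[r + 1])] += c
                A_ub.append(row); b_ub.append(0.0)
            # Z <= ri*rj*m * W_at_last_rank
            row = np.zeros(nvar)
            row[-1] = 1.0
            row[idx(i, j, order[-1])] -= ri * rj * m
            A_ub.append(row); b_ub.append(0.0)

    A_eq = [np.ones(nvar)]              # sum of all W_ijk = 1
    A_eq[0][-1] = 0.0                   # Z is not part of the sum
    b_eq = [1.0]
    c_obj = np.zeros(nvar)
    c_obj[-1] = -1.0                    # maximize Z via minimizing -Z
    bounds = [(0, None)] * (nvar - 1) + [(None, None)]  # W >= 0, Z free
    res = linprog(c_obj, A_ub=np.array(A_ub), b_ub=b_ub,
                  A_eq=np.array(A_eq), b_eq=b_eq, bounds=bounds)
    W = res.x[:-1].reshape(p, n, m)
    return W.sum(axis=(0, 1))           # weight per alternative
```

With a single expert, a single criterion, and three alternatives ranked 1, 2, 3, the model yields the well-known OPA weights (11/18, 5/18, 2/18): the top-ranked alternative dominates, and the weights sum to one. Summing `W` over the other axes instead gives expert or criterion weights, as reported in Tables 6 and 9.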

Post-questionnaire

Immediately after the OPA task, the five experts completed the post-questionnaire. This questionnaire examined how applying the OPA during teacher collaboration influenced their selection of an assessment method. Responses were collected using a four-point Likert-type scale and open-ended questions for aspects requiring further detail (see Appendix C).

Results

Difficulties of STEAM Assessment

During the OPA task, two experts commented on challenges in STEAM assessment. Expert 1 stated, “Some competencies cannot be assessed objectively. For example, the ‘ability to work in teams’ is an indispensable competency in STEAM education, and I would like to assess it. However, it is difficult because there is no way to assess it objectively.” Expert 2 noted, “I’m not very keen on ‘applying the learning to real-world situations,’ but it is said to be necessary as an element of STEAM education,” and “I would like to assess by watching student discussions, but it is difficult because this assessment method takes a lot of time and effort.”

According to the pre-questionnaire, teachers needed an average of 57.5 minutes to reconcile their various opinions (item 1) and an average of 88 minutes to reach a decision (item 2). When asked what takes the most time when preparing for STEAM assessment with other teachers (item 3), experts mentioned “deciding what to follow from the previous year and what to improve,” “choosing teaching materials and coming up with ideas for assessment questions,” “sharing objectives and content and making lesson plans,” and “understanding the learning content and syllabus of other subjects.”

The homogeneous group (HoGr) took 17 minutes to determine the alternatives and criteria, 9 minutes to rank them independently, and 2 minutes to input the data into the OPA Solver. The heterogeneous group (HeGr) took 30 minutes to determine the alternatives and criteria, 14 minutes to rank them independently, and 4 minutes to input data.

From the post-questionnaire, regarding the length of time (item 1), three experts answered that it was the “same as usual,” and two answered that it was “longer.” Regarding workload (item 2), three experts answered that it was the “same as usual,” and two answered that it was “greater.”

Process of Homogeneous Group Task

When determining the alternatives and criteria, HoGr reached decisions through dialogue. They recorded five criteria and five alternatives on the sheet. Table 4 (Appendix D) shows the experts’ rankings of criteria, and Table 5 (Appendix D) their rankings of alternatives for each criterion. The data from each expert were then input into OPA Solver. Using the rankings in Tables 4 and 5, the weights of the experts, criteria, and alternatives were calculated (Table 6, Appendix D).

During the OPA task, Expert 1 remarked, “The alternatives did not include a method for assessing the competency of collaborating within a team” and “We eliminated the trivial criteria and reached a consensus on which criteria were most important.” Expert 2 said, “To assess this competency, this alternative was necessary, yet it was not included,” “Through sufficient discussion and consensus-building, the result gradually moved closer to one that was convincing,” and “Through the discussion, it became clear that some areas were lacking, and I felt that further discussion was necessary.”

Process of Heterogeneous Group Task

In contrast to HoGr, HeGr first considered the alternatives and criteria individually, then shared their selections, ensuring duplicates were included in the overall list. They recorded eight criteria and five alternatives. Table 7 (Appendix E) shows the experts’ rankings of criteria, and Table 8 (Appendix E) shows the rankings of alternatives for each criterion. The data from each expert were then input into OPA Solver. Based on the prioritization results in Tables 7 and 8, the weights of the experts, criteria, and alternatives were calculated (Table 9, Appendix E).

During the OPA task, Expert 4 commented, “The problem-solving [criteria], which was supposed to be ranked first when the alternatives and criteria were identified in the group, ended up with a lower ranking when prioritizing individually.” Expert 5 observed, “There was a difference between the rankings anticipated at the stage of identifying alternatives and criteria as a group and the rankings assigned when prioritizing individually.”

Diverse Perspectives

In the pre-questionnaire, regarding their ability to make decisions while considering all necessary factors (item 4), all experts answered “usually.” During the OPA task, experts commented as follows. Expert 1 stated, “The competencies I want to measure are numerous.” Expert 2 remarked, “Even among colleagues teaching the same subject, there are instances where they emphasize criteria that I had not considered significant or choose to omit questions from the test that I believe should be included.” Expert 3 said, “It is very difficult for three teachers to discuss and integrate each opinion,” and “If you are in another subject, I do not know what your intentions are, and they will be completely different.”

In the post-questionnaire, when asked about considering the various factors (item 3), two experts answered “not much” and three answered “somewhat.” One reason given for answering “not much” was that “conflicts between teachers always occur due to differences in beliefs (these are non-negotiable things). Respecting the opinions of many people will delay the lesson plan, so there are situations where it is better to proceed on my own.” Reasons for answering “somewhat” included “The order in which things should be prioritized was clear” and “I felt it was meaningful to encourage deeper thinking through prioritizing.”

Satisfaction of Decision-Making

In the pre-questionnaire, regarding whether differences in direction or understanding occur when deciding on assessments (item 5), two experts answered “yes,” and three answered “no.” “Yes” responses mentioned “differences in perception will always occur due to differences in the knowledge they have,” and “differences in choosing how to develop students, and differences in perception of how deeply to think about the assessment method.” Regarding past experiences of being unconvinced or discussions proceeding without consensus (item 6), two answered “rarely” and three answered “sometimes.” “Sometimes” responses cited, for example, “discussions with a teacher who has different degrees of detail regarding assessment,” “materials that I thought were important were omitted because another teacher thought these were unnecessary,” and “discussions with a teacher who wanted to reduce the descriptive questions to reduce the burden on teachers, or who wanted to increase the descriptive questions to improve writing skills.”

According to the post-questionnaire, regarding whether the experts were convinced by the OPA’s outcome (item 4), two HoGr experts answered “not much” and three HeGr experts answered “somewhat.” The reasons given for answering “not much” included, “Assessment of STEAM education is a difficult task, so it was meaningful that we were able to recognize the results that we were not convinced by,” and “We were unable to come up with appropriate alternatives, and the issues related to the assessment of STEAM education itself have become clear. We were able to build consensus to some extent through discussion.” The reason given for answering “somewhat” was “because the results and my feelings were matched to some extent.” Regarding the clarity of the process (item 5), four experts answered “somewhat” and one “extremely.” Reasons for answering “somewhat” included, “because it can be predicted to a certain extent at the ranking stage” and “if there were valid criteria and alternatives, I feel that it would be somewhat clear.” The reason given for answering “extremely” was “because we can clearly rank the usually ambiguous decision-making factors.”

Sharing Mutual Understanding

According to the pre-questionnaire, regarding sharing mutual understanding of assessment methods (item 7), two experts answered “never,” and three answered “usually.” Of those three, two indicated that mutual understanding was achieved through daily casual conversations (the other response was invalid).

During the OPA task, Expert 2 remarked, “Because there were the sufficient discussions, I feel that my priorities and [Expert 1’s] will be almost the same,” “Although I thought that we had thoroughly discussed and reached a consensus at the stage of proposing alternatives and criteria, it is surprising that there were differences in the rankings,” and “Even after thorough discussions, there are areas where different rankings have been assigned.”

According to the post-questionnaire, regarding OPA’s ability to facilitate shared understanding (item 6), two experts answered “somewhat” and three answered “extremely.” One reason given for answering “somewhat” was, “Although there were considerable differences in the competencies to be assessed depending on the subject, we could get a consensus to some extent.” The reasons given for answering “extremely” were “because we were able to smoothly discuss what was important to me,” “because we conducted lessons together regularly,” and “because the other expert was a good facilitator.”

Influence of Factors Unrelated to the Discussion

According to the pre-questionnaire, regarding whether past decisions had been influenced by factors not directly related to the discussion (item 8), three experts answered “rarely,” and two answered “sometimes.”In the post-questionnaire, regarding this same issue (item 7), three experts answered “not at all” and two answered “not much.”

Prioritizing Experts

In HoGr, Expert 1 took the lead during lesson planning. Their ranking is shown in Table 6. In HeGr, Expert 3 usually led, followed by Expert 4 and Expert 5; their rankings are shown in Table 9. According to the post-questionnaire, regarding expert prioritization (item 8), experts mentioned, “There is usually a hierarchy among teachers, so I think that prioritizing is essential,” and “It would be good to be able to determine how to decide who is most prioritized.”

Impressions of Using OPA

During the OPA task, Expert 1 said, “I think that OPA would be most effective during group decision-making with a large group of around five to seven people.” Expert 4 said, “OPA will be useful when trying to include the opinions of new teachers, for example, who may find it difficult to express their opinions.” In the post-questionnaire, regarding how they felt about decision-making using OPA (item 9), experts expressed the following views: “Although it is important that the criteria and alternatives are valid, I felt it was difficult in educational activities, especially lessons,” “I felt OPA was an interesting method; however, the decision-making will not improve unless we repeat the steps several times,” “I felt that we needed to think a little more about introducing OPA,” “I think it would be good if it was a process that integrates many opinions in the process of creating something new with multiple teachers,” and “It is good to use OPA as a method to understand each expert’s opinion.”

Discussion

Degree of Inter-Subject Differences Affecting Teacher Collaboration

The characteristics of each group were examined based on experts’ statements, and differences were observed in how they identified alternatives and criteria. In the homogeneous group (HoGr), consisting of teachers from the same subject, opinions were summarized through dialogue. In contrast, the heterogeneous group (HeGr), composed of teachers from different subjects, had more dispersed opinions, and the summaries reflected more individualized perspectives. Experts’ comments suggested that HoGr formed a deeper consensus when organizing alternatives and criteria than HeGr.

Regarding diversity within teams, effective decision-making requires both considering a wide range of options and reaching consensus on a preferred solution. This reflects the diversity paradox: although team diversity can generate more options to address decision-making problems successfully, it may also increase the likelihood of conflict and hinder convergence (Linder et al., 2025). A comparison of Tables 4 and 7 shows substantial differences in prioritization within HeGr, indicating greater variance in priority rankings compared with HoGr. However, according to post-questionnaire item 4, HeGr members were more satisfied with the decision-making outcome than HoGr members.

Statements from HeGr participants indicate that their individual rankings sometimes differed from what they expected after collaboratively presenting alternatives and criteria. This suggests that what they considered important changed when working individually compared with in groups. Listing alternatives and criteria through the OPA appeared to make each expert more aware of subject-area differences, prompting discussions aimed at consensus building. These findings suggest that OPA is especially suitable for converging diverse viewpoints rather than fostering deeper consensus within already homogeneous groups.

Additionally, an unexpected finding was that even colleagues who frequently engaged in discussions did not always assign the same rankings to criteria and alternatives, contrary to expectations. Differences in perception persisted even among participants who believed they had achieved complete consensus. Because OPA expresses rankings as ordinal numbers, it clearly revealed these differences even after consensus-building efforts. This was also reflected in responses to post-questionnaire item 5, where participants praised OPA’s ability to make the decision-making process transparent.

OPA Helps Eliminate Obstacles Teachers Face When Collaborating

Because STEAM requires collaboration among teachers from multiple subjects, the competencies and skills they prioritize for assessment inevitably vary. Notably, OPA places no limit on the number of criteria and alternatives (Mahmoudi & Javed, 2022), allowing for the inclusion of numerous competencies, skills, and potential assessment methods.

Moreover, assigning priorities enables teachers to share which competencies and skills to emphasize and to what extent in the assessment. OPA facilitates fair and transparent decision-making by avoiding ambiguous judgments, reducing misunderstandings, and mitigating potential distrust among colleagues. Transparency is considered a key component of school organizations (Shagholi & Hussin, 2009). By visualizing prioritization and results, OPA contributed to transparency in decision-making, as evidenced by the shift between pre-questionnaire item 7 and post-questionnaire item 6 responses.

OPA also helps prevent the personalization of opinions. Comparing responses to pre-questionnaire item 8 and post-questionnaire item 7 indicates that the method was meaningful for limiting power imbalances. In some school contexts, senior or more experienced teachers hold disproportionate authority, which can erode trust and hinder collaboration. As Sachs et al. (2012) note, senior teachers must recognize when to relinquish authority and when to empower novice teachers. Because in OPA, experts rank alternatives and criteria individually after their identification, each opinion is evaluated independently, helping to balance influence among participants.

Conclusion

Given the interdisciplinary nature of STEAM education, collaboration among teachers from multiple subjects is essential. However, challenges such as limited communication and distrust stemming from subject-specific conflicts have been reported. Additionally, existing STEAM assessment methods for evaluating interdisciplinary learning outcomes have been criticized as ambiguous.

To improve collaboration in determining STEAM assessment methods, this study employed the ordinal priority approach (OPA), a specific and innovative method not previously used in education. The results revealed differences in decision-making between homogeneous and heterogeneous teacher groups. Homogeneous groups were better able to clarify mutual differences in opinion, while heterogeneous groups facilitated decision-making by prioritizing diverse perspectives.

OPA was shown to contribute to teacher collaboration in STEAM education in three ways:

1) It allows for the consideration of diverse competencies and skills that teachers from various subjects wish to assess.

2) Its clear decision-making process ensures transparency, helping mitigate distrust.

3) By providing each teacher with an opportunity to express their views, OPA supports fair decision-making without undue influence from power dynamics.

Given these benefits, OPA appears to be effective for synthesizing diverse expert opinions through a structured and transparent process. As group size increases, capturing all viewpoints becomes more difficult, and discussions risk being dominated by a few teachers. OPA may therefore be particularly useful in groups with power imbalances, such as those with varying levels of experience or institutional positions.

Recommendations

In terms of time and effort, the duration for identifying alternatives and criteria and reaching decisions using OPA was shorter than the averages reported in pre-questionnaire items 1 and 2. However, according to post-questionnaire items 1 and 2, participants felt the decision-making time and effort were similar to or longer than those of standard discussions. As noted in the Introduction, appropriate STEAM assessment remains a contested and challenging issue; therefore, participants may have focused more on the difficulty of the decision-making contents than on the OPA itself. Their statements about the inherent difficulty of STEAM assessment were consistent with earlier research, which also found uncertainty about how best to implement and assess STEAM education.

In post-questionnaire items 3 and 4, two HoGr participants answered “not much.” As shown in the HoGr process section, these participants discovered that alternatives they had not considered emerged when ranking individually. However, because the OPA procedure was strictly followed, no alternatives could be added at this stage—an outcome that influenced their post-task evaluations. It is therefore recommended that an optional step be introduced to verify the validity of alternatives and criteria at multiple points during the process. Additionally, since OPA only prioritizes multiple alternatives, it is important to also consider situations where the outcome may be unsatisfactory. Future research should examine adding a subsequent phase for generating new alternatives when the best choice cannot be identified among those available.

Limitations

This study involved only a small number of experts to facilitate rapid evaluation of a novel method. Future studies should include larger samples to strengthen the findings.

Ethics Statement

The study was reviewed and approved by Tohoku University. All participants were informed about the study in advance and signed an informed consent form.

Acknowledgements

The author thanks the five teachers who generously shared their time and insights by participating in this study.

Conflict of Interest

The author declares no conflict of interest.

Funding

This research was funded by JSPS KAKENHI, Grant-in-Aid for JSPS Fellows, Grant Number 24KJ0404.

Generative AI Statement

As the author of this work, I used the AI tool Elicit for literature searches. After using this tool, I reviewed and verified the final version of the work. I take full responsibility for the content of the published work.

References

Al Salami, M. K., Makela, C. J., & de Miranda, M. A. (2017). Assessing changes in teachers’ attitudes toward interdisciplinary STEM teaching. International Journal of Technology and Design Education, 27, 63-88. https://doi.org/10.1007/s10798-015-9341-0

Asghar, A., Ellington, R., Rice, E., Johnson, F., & Prime, G. M. (2012). Supporting STEM education in secondary science contexts. Interdisciplinary Journal of Problem-Based Learning, 6(2), 85-125. https://doi.org/10.7771/1541-5015.1349

Ataei, Y., Mahmoudi, A., Feylizadeh, R. M., & Li, D.-F. (2020). Ordinal priority approach (OPA) in multiple attribute decision-making. Applied Soft Computing, 86, Article 105893. https://doi.org/10.1016/j.asoc.2019.105893

Bagiati, A., & Evangelou, D. (2015). Engineering curriculum in the preschool classroom: The teacher’s experience. European Early Childhood Education Research Journal, 23(1), 112-128. https://doi.org/10.1080/1350293X.2014.991099

Belbase, S., Mainali, B. R., Kasemsukpipat, W., Tairab, H., Gochoo, M., & Jarrah, A. (2022). At the dawn of science, technology, engineering, arts, and mathematics (STEAM) education: Prospects, priorities, processes, and problems. International Journal of Mathematical Education in Science and Technology, 53(11), 2919-2955. https://doi.org/10.1080/0020739X.2021.1922943

Boice, K. L., Jackson, J. R., Alemdar, M., Rao, A. E., Grossman, S., & Usselman, M. (2021). Supporting teachers on their STEAM journey: A collaborative STEAM teacher training program. Education Sciences, 11(3), Article 105. https://doi.org/10.3390/educsci11030105

Costantino, T. (2017). STEAM by another name: Transdisciplinary practice in art and design education. Arts Education Policy Review, 119(2), 100-106. https://doi.org/10.1080/10632913.2017.1292973

Dubek, M., DeLuca, C., & Rickey, N. (2021). Unlocking the potential of STEAM education: How exemplary teachers navigate assessment challenges. The Journal of Educational Research, 114(6), 513-525. https://doi.org/10.1080/00220671.2021.1990002

Glisic, M., & Favaro, P. (2019). Empowering modern learners: 21st century learning—Phase 1 evaluation report. Peel District School Board. https://bit.ly/4migpNO

Herro, D., & Quigley, C. (2016). Exploring teachers’ perceptions of STEAM teaching through professional development: Implications for teacher educators. Professional Development in Education, 43(3), 416-438. https://doi.org/10.1080/19415257.2016.1205507

Herro, D., Quigley, C., & Cian, H. (2019). The challenges of STEAM instruction: Lessons from the subject. Action in Teacher Education, 41(2), 172-190. https://doi.org/10.1080/01626620.2018.1551159

Hom, E. J., & Dobrijevic, D. (2022, October 26). What is STEM education? Live Science. https://bit.ly/3UbIYAz

Jolly, A. (2014, November 18). STEM vs. STEAM: Do the arts belong? Education Week. https://bit.ly/4o8R4rk

Kadaei, S., Nezam, Z., González-Lezcano, R. A., Shokrpour, S., Mohammadtaheri, A., Doraj, P., & Akar, U. (2023). A new approach to determine the reverse logistics-related issues of smart buildings focusing on sustainable architecture. Frontiers in Environmental Science, 10, Article 1079522. https://doi.org/10.3389/fenvs.2022.1079522

Kumar, A., Sah, B., Singh, R. A., Deng, Y., He, X., Kumar, P., & Bansal, R. C. (2017). A review of multi criteria decision making (MCDM) towards sustainable renewable energy development. Renewable and Sustainable Energy Reviews, 69, 596-609. https://doi.org/10.1016/j.rser.2016.11.191

Linder, C., Lechner, C., & Villani, E. (2025). Make it work - The challenge to diversity in entrepreneurial teams: A configurational perspective. European Management Journal, 43(1), 74-88. https://doi.org/10.1016/j.emj.2024.01.004

Mahmoudi, A., Deng, X., Javed, S. A., & Zhang, N. (2021). Sustainable supplier selection in megaprojects: Grey ordinal priority approach. Business Strategy and the Environment, 30(1), 318-339. https://doi.org/10.1002/bse.2623

Mahmoudi, A., & Javed, S. A. (2022). Performance evaluation of construction sub‐contractors using ordinal priority approach. Evaluation and Program Planning, 91, Article 102022. https://doi.org/10.1016/j.evalprogplan.2021.102022

Margot, K. C., & Kettler, T. (2019). Teachers’ perception of STEM integration and education: A systematic literature review. International Journal of STEM Education, 6, Article 2. https://doi.org/10.1186/s40594-018-0151-2

Onden, I., Deveci, M., & Onden, A. (2023). Green energy source storage location analysis based on GIS and fuzzy Einstein based ordinal priority approach. Sustainable Energy Technologies and Assessments, 57, Article 103205. https://doi.org/10.1016/j.seta.2023.103205

Park, H., Byun, S.-Y., Sim, J., Han, H.-S., & Baek, Y. S. (2016). Teachers’ perceptions and practices of STEAM education in South Korea. Eurasia Journal of Mathematics, Science, and Technology Education, 12(7), 1739-1753. https://doi.org/10.12973/eurasia.2016.1531a

Penadés-Plà, V., García-Segura, T., Martí, J. V., & Yepes, V. (2016). A review of multi-criteria decision-making methods applied to the sustainable bridge design. Sustainability, 8(12), Article 1295. https://doi.org/10.3390/su8121295

Quigley, C. F., Herro, D., King, E., & Plank, H. (2020). STEAM designed and enacted: Understanding the process of design and implementation of STEAM curriculum in an elementary school. Journal of Science Education and Technology, 29, 499-518. https://doi.org/10.1007/s10956-020-09832-w

Reeves, P. M., Pun, W. H., & Chung, K. S. (2017). Influence of teacher collaboration on job satisfaction and student achievement. Teaching and Teacher Education, 67, 227-236. https://doi.org/10.1016/j.tate.2017.06.016

Sachs, G. D., Fisher, T., & Cannon, J. (2012). Collaboration, mentoring and co-teaching in teacher education. Journal of Teacher Education for Sustainability, 13(2), 70-86. https://doi.org/10.2478/v10099-011-0015-z

Sadeghi, M., Mahmoudi, A., Deng, X., & Luo, X. (2023). Prioritizing requirements for implementing blockchain technology in construction supply chain based on circular economy: Fuzzy Ordinal Priority Approach. International Journal of Environmental Science and Technology, 20, 4991-5012. https://doi.org/10.1007/s13762-022-04298-2

Shagholi, R., & Hussin, S. (2009). Participatory management: An opportunity for human resources in education. Procedia Social and Behavioral Sciences, 1(1), 1939-1943. https://doi.org/10.1016/j.sbspro.2009.01.341

Vangrieken, K., Dochy, F., Raes, E., & Kyndt, E. (2015). Teacher collaboration: A systematic review. Educational Research Review, 15, 17-40. https://doi.org/10.1016/j.edurev.2015.04.002

Wang, H.-H., Moore, T. J., Roehrig, G. H., & Park, M. S. (2011). STEM integration: Teacher perceptions and practice. Journal of Pre-College Engineering Education Research, 1(2), Article 2. https://doi.org/10.5703/1288284314636

Wu, Z. (2022). Understanding teachers’ cross-disciplinary collaboration for STEAM education: Building a digital community of practice. Thinking Skills and Creativity, 46, Article 101178. https://doi.org/10.1016/j.tsc.2022.101178

 

...