Article

Factor-Based Analysis of Certification Validity in Engineering Safety

by Samat Baigereyev 1,*, Zhadyra Konurbayeva 2, Monika Kulisz 3, Saule Rakhmetullina 4 and Assiya Mashekenova 1

1 International School of Engineering, D. Serikbayev East Kazakhstan Technical University, Serikbayev Str. 19, Ust-Kamenogorsk 070003, Kazakhstan
2 Business School, D. Serikbayev East Kazakhstan Technical University, Ust-Kamenogorsk 070003, Kazakhstan
3 Faculty of Management, Lublin University of Technology, 20-618 Lublin, Poland
4 School of Digital Technologies and Artificial Intelligence, D. Serikbayev East Kazakhstan Technical University, Ust-Kamenogorsk 070003, Kazakhstan
* Author to whom correspondence should be addressed.
Safety 2025, 11(4), 95; https://doi.org/10.3390/safety11040095
Submission received: 1 August 2025 / Revised: 24 September 2025 / Accepted: 26 September 2025 / Published: 2 October 2025

Abstract

Professional certification of engineers plays a crucial role in verifying competencies and ensuring the safety and quality of engineering outputs. However, most existing certification systems assign fixed validity periods (e.g., 3–5 years) without considering individual engineer characteristics or the intensity of technological progress in specific fields. This study examines the key factors influencing the optimal validity period of engineering certifications and proposes it as a measurable indicator to support safety in engineering practice. A new model is introduced that integrates expert judgment, fuzzy set theory, and bibliometric analysis of Q1/Q2 Scopus-indexed publications. The model incorporates three main factors: competence level, professional experience, and the technological intensity of the discipline. A case study from the engineering certification system of Kazakhstan demonstrates the model’s practical applicability. Certification bodies, policymakers, and engineering organizations can use these findings to establish more flexible certification validity periods, thereby ensuring timely reassessment of competencies and reducing safety risks. For example, for mechanical engineers, the optimal validity period is 3 years rather than the statutory 5 years; in other words, the model recommends a 40% reduction in certification validity. This reduction reflects the combined effects of competency level, professional experience, and technology intensity on certification renewal schedules. Overall, the proposed factorial approach supports a more personalized and safety-oriented certification process and offers insights into improving national qualification systems.

1. Introduction

Ensuring the safety and quality of engineering outputs remains a critical challenge for governments, industry, and society. In many countries, improving the quality of engineering education and professional development is considered one of the most pressing issues. This is largely due to the high level of responsibility of engineers, whose decisions and actions directly affect public safety, environmental sustainability and the reliability of critical infrastructure [1,2,3]. History has shown that inadequate qualifications of engineers can lead to serious consequences, including industrial accidents, structural failures, and man-made disasters. In this context, it is also important to define safety and quality standards governing engineering practice, as well as to provide basic training in the functional safety of technologies to ensure the safe use of machines, equipment and products by third parties. International standards such as ISO 9001 (quality management) [4], ISO 45001 (occupational health and safety) [5], and IEC 61508 (functional safety) [6] are of particular relevance in this regard. Moreover, the ergonomic education of engineers can significantly contribute to these aspects, since safety is considered an integral part of ergonomics. In Europe, for example, certification systems recognize the professional qualification of ergonomists within the framework of the CREE (Certified European Ergonomist) system, which emphasizes the importance of human-centered design and a safe working environment. These competencies related to safety and ergonomics are considered in the certification system as part of several evaluation criteria, including those related to developing engineering solutions in accordance with specified requirements, evaluating the social implications of engineering practice, and adhering to professional ethics and standards. This raises the question of the extent to which engineers and technicians should receive basic and continuing education in ergonomics and safety-related competencies, especially when they perform functions as safety specialists.
However, in current practice, technicians and engineers typically receive only basic instruction in occupational safety, while systematic ergonomics training is rarely included in their education. Specialized methods, such as Methods-Time Measurement (MTM), are usually applied only in specific industrial settings and are not part of standard engineering curricula. As manufacturing systems become more complex, there is a growing need to strengthen both initial education and continuing professional development in ergonomics and occupational safety. This is particularly important for engineers serving as safety officers, who are responsible for risk assessment and the design of safe workplaces. Introducing ergonomics-focused modules and basic MTM training could help them prevent injuries more effectively and improve the overall efficiency of production systems.
In this context, professional certification systems play a critical role in verifying engineers’ competence and readiness to practice responsibly. However, traditional certification models typically rely on fixed validity periods (e.g., 3 or 5 years), regardless of individual competence, experience or the intensity of technological progress in specific fields. This creates a risk that engineers may continue working under outdated certifications that no longer reflect their actual readiness to meet modern safety and technical requirements [7].
Against this background, a two-stage system for recognizing engineers’ qualifications operates in many countries [8,9]. The first stage is the accreditation of engineering educational institutions, which ensures their compliance with international quality standards (ABET in the USA, ECUK in the UK, CEAB in Canada, JABEE in Japan, etc.). The second stage is the professional certification of engineers, which aims at recognizing their competencies within a specific field of engineering practice (NCEES in the USA, ECUK in the UK, Engineers Canada in Canada, IPEJ in Japan, etc.).
Professional certification of engineers is one of the most important mechanisms for verifying their competencies. The main outcomes of professional certification include the development of engineering education and the engineering profession, improved preparation and quality of graduates from technical education institutions, and motivation of engineers to enhance their professional competencies [10,11,12,13,14,15,16]. With regard to safety-related competencies, certification systems should assess not only the reliable functioning of technologies but also the human-oriented aspects of engineering practice. These include competencies in risk assessment, compliance with international safety and quality standards (e.g., ISO 45001, IEC 61508), and the ability to design work processes and environments that ensure the safe operation and ease of use of machines and products by third parties. In addition, ergonomic assessment is a critical aspect of safety competencies, as it links technical reliability with human well-being. However, in some countries, such training is not a systematic component of engineering education, which highlights the need to strengthen certification mechanisms to ensure the consistent development and assessment of safety competencies.
The literature analysis showed that there are many models of certifications in other countries [17,18,19,20,21,22,23,24]. For example, in the USA, certification is conducted on a voluntary basis. The certification procedure is controlled by the National Council of Examiners for Engineering and Surveying (NCEES). The NCEES develops and evaluates the exams for candidates seeking certification as Fundamentals of Engineering (FE), Professional Engineer (PE), and Structural Engineer (SE). The certificate validity period is 3 years. The certificate renewal procedure is based on the principles of continuing professional development [25]. In the United Kingdom and other European countries, the national engineering regulatory body is the Engineering Council. This body is responsible for developing and maintaining standards of professional competencies and for implementing principles of ethical behavior in the field of engineering activities. Engineers can be awarded the following titles: Chartered Engineer (CEng), Incorporated Engineer (IEng), Engineering Technician (EngTech), and Information and Communications Technologies Technician (ICTTech). The certificate validity period is 3 years [26].
In Russia, the certification of engineers is carried out on the basis of the APEC Register of Engineers. The documents of the applicant for certification are submitted to and reviewed by the Secretariat of the Russian Surveillance Committee. Then the Certification Center conducts an examination to verify the applicant’s competencies. The decision on the recognition of the engineer’s professional competencies is made by the Russian Supervision Committee.
In Kazakhstan, the certification process is carried out on a voluntary and mandatory basis. Currently, Kazakhstan is implementing the National Qualifications System, which is a comprehensive set of legal and institutional regulations governing demand and supply of qualifications, ensuring the linkage between economic sectors, the labor market, and the system of vocational education and training. The main elements of the national qualifications system are the National Qualifications Framework, sectoral qualifications frameworks, professional standards, educational programs, and certification of specialists. The general requirements for the certification procedure are specified in the Law of the Republic of Kazakhstan “On Professional Qualifications” (No. 314-VIII, dated 4 July 2023). The period of validity of the certificate is regulated by the Register of Professions of the Republic of Kazakhstan and can be 1, 2, 3 or 5 years.
However, in the presented models of certification, the connection between the validity period of the certificate and the characteristics of the candidate (such as the level of competencies and work experience), as well as the characteristics of the engineering field (such as the intensity of technological development), is not addressed. The certificate validity period is typically represented as a fixed value (3–5 years). This fixed-duration approach may not adequately reflect the dynamic nature of engineering practice and evolving competency requirements. To address this gap, we propose a new model in which the duration of certification validity is determined by a set of objective criteria, including the engineer’s demonstrated competencies, accumulated professional experience, and the intensity of technological advancement within the relevant engineering discipline.
As shown in Figure 1, the model is represented as a black-box model. The inputs to the model include the engineer’s competencies, work experience, and the intensity of technological development within the engineering discipline, while the output is the certificate validity period.
This study contributes to the literature by introducing a novel model that combines expert-based ranking, fuzzy logic, and bibliometric indicators (Q1/Q2 Scopus data) to dynamically predict the validity period of engineering certificates. Unlike previous fixed-duration approaches, the proposed model enables personalized and data-driven certification outcomes. Thus, the aim of this paper is to develop an approach for analyzing the influence of candidate characteristics on the validity period of engineering certificates.
In recent years, fuzzy logic has been increasingly applied in engineering to support multi-criteria decision-making and enhance personalization in complex systems. For example, Lee and Kang [27] developed a three-phased fuzzy logic model to evaluate smart TV operating systems, illustrating the potential of fuzzy reasoning to process diverse inputs and produce nuanced outcomes. Similarly, Su et al. [28] proposed a fuzzy analytic hierarchy process combined with set pair analysis for evaluating micro-edge honing quality, demonstrating how fuzzy models can improve the interpretability and precision of engineering assessments. These approaches support the premise of this study that fuzzy-based models offer powerful tools for increasing the fairness and accuracy of professional engineering certification.
The proposed methodology employs fuzzy set theory to address uncertainties in assessing an engineer’s competence and related factors. The effectiveness of this approach depends on the correct definition of membership functions and aggregation of individual criteria. Each criterion is evaluated by subject matter experts, providing input data for the model. This combination of expert assessments and fuzzy logic enables a more flexible and accurate determination of overall competence, which is subsequently used to establish the recommended certificate validity period.

2. Materials and Methods

2.1. Description of the Approach for Determining the Certificate Validity Period

To determine the certificate validity period, a conventional scale ranging from 0 to 1 is used. The scale values increase in increments of 0.1. Each value on the scale corresponds to a specific certificate validity period (Table 1).
Three input parameters—the level of the engineer’s competencies, the engineer’s work experience, and the intensity of technological development within the engineering field—are evaluated as scores S1, S2 and S3, respectively, each ranging from 0 to 1. To determine the certificate validity period, the average score (Sa) is calculated as follows:
S_a = \frac{S_1 + S_2 + S_3}{3} \quad (1)
The scale for the certificate validity period ranges from 0 to 5 years. A value of 0 years corresponds to failure in the certification exam. The maximum value of 5 years is based on the commonly accepted maximum period in many countries.
Given that the relationship between the input parameters—the engineer’s competence level and work experience—and the certificate validity period is direct (i.e., an increase in these parameters leads to an increase in the certificate validity period), their change from low to high values corresponds to a change from 0 to 1 on the normalized scale. Conversely, the relationship between the intensity of technological development within the engineering field and certificate validity period is inverse. Therefore, as the intensity parameter increases from low to high, its corresponding normalized scale value decreases from 1 to 0.
Thus, after calculating the scores S1, S2, and S3, and consequently, the average score Sa, the certificate validity period can be determined using Table 1.
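As an illustration only (not the authors' implementation), the following Python sketch applies Equation (1) and the Table 1 lookup; rounding Sa to the nearest 0.1 step before reading the table is an assumption about how Table 1 is used.

```python
# Minimal sketch: average the three normalized scores (Equation (1)) and map
# the result to the validity period via Table 1.

def average_score(s1: float, s2: float, s3: float) -> float:
    """Equation (1): arithmetic mean of the three input scores."""
    return (s1 + s2 + s3) / 3

def validity_period_years(sa: float) -> float:
    """Table 1: round Sa to the nearest 0.1 step; each 0.1 step corresponds to 0.5 years."""
    return round(round(sa, 1) * 5, 1)   # 0 -> 0 years, 0.6 -> 3 years, 1.0 -> 5 years

# Scores from the case study (Engineer X):
sa = average_score(0.67, 0.70, 0.38)
print(round(sa, 2), validity_period_years(sa))   # 0.58 -> 3.0 years
```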
To evaluate the applicability of the proposed model, the validity period of Engineer X’s certificate was assessed. The engineer’s characteristics presented in Table 2 were used as input data for the model.
The predicted validity period was compared with the officially established regulatory validity period in the Republic of Kazakhstan, which made it possible to evaluate the accuracy and practical applicability of the developed model.
Potential risks associated with a lack of competence are then assessed based on the severity of potential consequences across several dimensions, including safety, business continuity, financial, environmental, and regulatory impacts. Expert judgment is used to evaluate each dimension, and the scores are normalized to a range between 0 and 1. The probability of insufficient competence is then combined with the average normalized severity to calculate an overall risk score. Finally, the baseline certification period, derived from the competence score, is compared to the specified risk intervals, and the final recertification period is adjusted, if necessary, to reflect both the engineer’s competence and the assessed risk level.

2.2. Determination of the Score for Evaluating an Engineer’s Competence Level (S1)

The level of the engineer’s competencies was evaluated using established assessment criteria. The assessment was based on 12 evaluation criteria (C1, C2, C3, …, C12) defined in the Washington Accord [29,30]. The names of these criteria are presented in Table 3.
As shown in Table 3, the twelve evaluation criteria encompass the full spectrum of knowledge, skills, and competencies required of a modern engineer, including a solid grounding in the fundamental sciences and engineering principles; the ability to analyze and design complex systems; conduct evidence-based research; apply modern technologies; assess the social and environmental impacts of engineering practice; adhere to professional ethics; work effectively in teams and demonstrate leadership; communicate efficiently; and maintain a commitment to continuous professional development.
The engineer’s level of competence is evaluated by six experts (E1–E6), all of whom are specialists in the field of mechanical engineering. The assessment of criteria is conducted through both theoretical and practical examinations.
Prior to the evaluation, the criteria are ranked according to their relative importance [31,32]. For this purpose, the experts are asked to complete two forms: the first form is used to assess the level of importance of each criterion, while the second form is used for peer assessment of the experts themselves. A ten-point rating scale is applied in both forms.
The results of the assessment of the significance of criteria on a ten-point scale are presented in Table 4.
The results of the mutual assessment of experts are presented in Table 5.
Then, absolute experts’ ratings are converted into relative values. For this purpose, each expert’s rating is divided by the sum of all ratings. The resulting relative values of the experts are presented in Table 6.
A rank table is then compiled (Table 7), in which the highest value in each row is assigned rank 1, the second highest value is assigned rank 2, and so on.
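For illustration, a minimal Python sketch of this ranking step, assuming that tied values share a rank (which is consistent with the sums of ranks in Table 7):

```python
# Sketch of the ranking step behind Table 7 (dense ranking assumed): within each
# expert's row of relative assessments, the highest value gets rank 1, the next
# highest distinct value rank 2, and so on; equal values share the same rank.

def dense_ranks(values):
    distinct = sorted(set(values), reverse=True)
    return [distinct.index(v) + 1 for v in values]

row_E1 = [0.0870, 0.0870, 0.0870, 0.0870, 0.0870, 0.0870,
          0.0870, 0.0783, 0.0783, 0.0696, 0.0783, 0.0870]   # Table 6, expert E1
ranks_E1 = dense_ranks(row_E1)
print(ranks_E1, sum(ranks_E1))   # [1, 1, 1, 1, 1, 1, 1, 2, 2, 3, 2, 1], sum of ranks = 17
```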
The Spearman rank correlation coefficient is used to evaluate the degree of concordance between expert assessments. The coefficients are calculated for each pair of experts. As an example, the formula for calculating the Spearman rank correlation coefficient for the first and second experts is given below [32]:
\rho_{1,2} = 1 - \frac{6 \sum_{j=1}^{n} \left( Z_{1j} - Z_{2j} \right)^2}{n^3 - n} \quad (2)

In Equation (2), (Z1j − Z2j) represents the deviation between the ranking assessments of the first and second experts for the j-th criterion used in the certification of engineers, and n denotes the number of criteria.
The values of the Spearman correlation coefficients calculated using Equation (2) are as follows: ρ1,2 = 0.94, ρ1,3 = 0.79, ρ1,4 = 0.97, ρ1,5 = 0.86, ρ1,6 = 0.98, ρ2,3 = 0.81, ρ2,4 = 0.92, ρ2,5 = 0.87, ρ2,6 = 0.93, ρ3,4 = 0.68, ρ3,5 = 0.79, ρ3,6 = 0.77, ρ4,5 = 0.86, ρ4,6 = 0.98, ρ5,6 = 0.87.
After calculating the Spearman rank correlation coefficients for all pairwise combinations of experts (ρ1,2, ρ1,3, …, ρ5,6), a correlation matrix of rank relationships is constructed (Table 8).
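A short Python sketch of Equation (2), applied directly to the ranks in Table 7 (ties handled exactly as the assigned ranks are given), reproduces the coefficients listed above; it is an illustrative reimplementation, not the authors' code.

```python
# Pairwise Spearman coefficients computed from the ranks in Table 7 (Equation (2)).

from itertools import combinations

ranks = {  # criterion ranks C1..C12 assigned by each expert (Table 7)
    "E1": [1, 1, 1, 1, 1, 1, 1, 2, 2, 3, 2, 1],
    "E2": [3, 3, 1, 1, 1, 2, 2, 1, 2, 2, 4, 1],
    "E3": [3, 2, 2, 4, 2, 4, 4, 4, 5, 6, 4, 1],
    "E4": [1, 1, 2, 1, 1, 1, 1, 1, 1, 1, 1, 1],
    "E5": [1, 3, 3, 4, 4, 1, 3, 3, 2, 1, 3, 3],
    "E6": [1, 2, 2, 2, 1, 2, 1, 1, 1, 3, 1, 1],
}

def spearman(z1, z2):
    """Equation (2): simplified Spearman formula on the given ranks."""
    n = len(z1)
    d2 = sum((a - b) ** 2 for a, b in zip(z1, z2))
    return 1 - 6 * d2 / (n ** 3 - n)

for (e1, r1), (e2, r2) in combinations(ranks.items(), 2):
    print(f"rho({e1},{e2}) = {spearman(r1, r2):.2f}")   # e.g. rho(E1,E2) = 0.94
```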
Using the results of the second form, the expert competence coefficients are calculated according to the following formula [32]:
K_i = \frac{\sum_{j=1}^{m} X_{ij}}{m} \quad (3)

In Equation (3), Xij denotes the mutual-assessment scores of the i-th expert (Table 5), and m is the number of experts. Based on Equation (3), the values of the expert competence coefficients (K1 = 7.83, K2 = 6.67, K3 = 6.50, K4 = 8.83, K5 = 7.33, K6 = 7.67) were calculated.
Then, the weight coefficient Aj of each criterion, taking the experts’ competence into account, is determined as the competence-weighted average of the experts’ relative assessments [32], i.e.,:

A_j = \frac{\sum_{i=1}^{m} K_i Z_{ij}}{\sum_{i=1}^{m} K_i} \quad (4)

The resulting values of the weight coefficients are as follows: A1 = 0.0894, A2 = 0.0866, A3 = 0.0860, A4 = 0.0797, A5 = 0.0876, A6 = 0.0850, A7 = 0.0833, A8 = 0.0831, A9 = 0.0793, A10 = 0.0680, A11 = 0.0790, A12 = 0.0929.
By comparing the values of Aj calculated using Equation (4), a rank is assigned to each criterion (Table 9).
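The following Python sketch reproduces Equations (3) and (4) from the data in Tables 5 and 6; it is an illustrative reimplementation under the interpretation of Xij and Zij given above, not the authors' code.

```python
# Expert competence coefficients K_i (Equation (3)) as row means of the
# mutual-assessment matrix (Table 5), and criterion weights A_j (Equation (4))
# as competence-weighted averages of the relative assessments (Table 6).

mutual = [  # Table 5, rows E1..E6
    [10, 5, 7, 8, 8, 9],
    [5, 7, 9, 4, 7, 8],
    [6, 8, 7, 5, 4, 9],
    [8, 7, 9, 9, 10, 10],
    [7, 9, 5, 7, 8, 8],
    [6, 7, 9, 8, 9, 7],
]
K = [sum(row) / len(row) for row in mutual]   # Equation (3): 7.83, 6.67, 6.50, ...

relative = [  # Table 6: relative assessments of criteria C1..C12 by experts E1..E6
    [0.0870, 0.0870, 0.0870, 0.0870, 0.0870, 0.0870, 0.0870, 0.0783, 0.0783, 0.0696, 0.0783, 0.0870],
    [0.0734, 0.0734, 0.0917, 0.0917, 0.0917, 0.0826, 0.0826, 0.0917, 0.0826, 0.0826, 0.0642, 0.0917],
    [0.1000, 0.1143, 0.1143, 0.0714, 0.1143, 0.0714, 0.0714, 0.0714, 0.0429, 0.0286, 0.0714, 0.1286],
    [0.0847, 0.0847, 0.0678, 0.0847, 0.0847, 0.0847, 0.0847, 0.0847, 0.0847, 0.0847, 0.0847, 0.0847],
    [0.1010, 0.0808, 0.0808, 0.0606, 0.0606, 0.1010, 0.0808, 0.0808, 0.0909, 0.1010, 0.0808, 0.0808],
    [0.0909, 0.0818, 0.0818, 0.0818, 0.0909, 0.0818, 0.0909, 0.0909, 0.0909, 0.0364, 0.0909, 0.0909],
]

# Equation (4): A_j = sum_i(K_i * Z_ij) / sum_i(K_i)
A = [sum(K[i] * relative[i][j] for i in range(6)) / sum(K) for j in range(12)]
print([round(a, 4) for a in A])   # approximately 0.0894, 0.0866, 0.0860, ...
```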
As shown in Table 9, the criterion with the highest importance is No. 12 (“Commitment to continual professional development and lifelong learning”), while the criterion with the lowest importance is No. 10 (“Proficiency in communication and knowledge of foreign languages”).
After the criteria have been ranked, each of the six experts assigns a score between 0 and 1 to each criterion, as presented in Table 10.
Each score is multiplied by the weighting coefficient Aj of the corresponding criterion. A weighted sum of the scores is calculated for each expert, resulting in a single final score per expert. To obtain the overall final score, these six individual scores are averaged using the arithmetic mean. The results of the calculations are presented in Table 11.
Thus, the score evaluating the level of engineer’s competence (S1) is 0.67.
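For completeness, a small Python sketch of this aggregation step, using the criterion weights from Table 9 and the expert scores from Table 10 (illustrative only):

```python
# S1 aggregation: each expert's criterion scores (Table 10) are weighted by the
# criterion weights A_j (Table 9), summed per expert, and the six per-expert sums
# are averaged to give the final competence score S1.

A = [0.0894, 0.0866, 0.0860, 0.0797, 0.0876, 0.0850,
     0.0833, 0.0831, 0.0793, 0.0680, 0.0790, 0.0929]   # criterion weights (Table 9)

scores = [  # Table 10, rows E1..E6, columns C1..C12
    [0.7, 0.7, 0.9, 0.6, 0.8, 0.5, 0.6, 0.8, 0.9, 0.5, 0.7, 0.7],
    [0.5, 0.5, 0.6, 0.7, 0.5, 0.7, 0.8, 0.6, 0.9, 0.7, 0.4, 0.8],
    [0.9, 0.5, 0.7, 0.7, 0.6, 0.9, 0.6, 0.6, 0.4, 0.8, 0.5, 0.6],
    [0.6, 0.8, 0.5, 0.5, 0.8, 0.6, 0.7, 0.9, 0.7, 0.6, 0.8, 0.5],
    [0.6, 0.8, 0.6, 0.9, 0.7, 0.4, 0.6, 0.5, 0.8, 0.9, 0.7, 0.6],
    [0.8, 0.9, 0.6, 0.6, 0.9, 0.8, 0.4, 0.5, 0.8, 0.7, 0.6, 0.9],
]

per_expert = [sum(w * s for w, s in zip(A, row)) for row in scores]
S1 = sum(per_expert) / len(per_expert)
print([round(v, 2) for v in per_expert], round(S1, 2))   # [0.70, 0.64, 0.65, 0.67, 0.67, 0.71], 0.67
```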

2.3. Determination of the Score Considering Work Experience (S2)

Work experience is another input parameter affecting the certificate validity period. In practice, the terms “High work experience”, “Medium work experience”, and “Low work experience” are commonly used. However, the number of years corresponding to these categories may vary depending on the type of engineering specialization. That is, what is considered “high experience” in one engineering field may be classified as “medium” or even “low” in another, reflecting the specific requirements, complexity, and professional standards of each discipline. To evaluate the value of the work experience parameter, fuzzy set theory is applied [33,34,35]. Fuzzy set theory allows a gradual assessment of the membership of elements in a set using membership functions with values in the real unit interval [0,1].
Let us define two sets: the set of linguistic variables L = {l1, l2, l3}, where l1 represents “Low work experience”, l2 represents “Medium work experience”, and l3 represents “High work experience”; and the set of scores U = {u1, u2, …, un}, 0 < ui < 1. A fuzzy set describing a linguistic term over the set of scores is represented as follows:

\bar{l}_j = \left\{ \frac{\mu_{l_j}(u_1)}{u_1}, \frac{\mu_{l_j}(u_2)}{u_2}, \ldots, \frac{\mu_{l_j}(u_n)}{u_n} \right\} \quad (5)
It is necessary to determine the degrees of membership of the elements of the set U to the elements of the set L, i.e., to find μ_{l_j}(u_i) for all j = 1, …, m and i = 1, …, n.
The membership function is defined based on a statistical analysis of the expert group’s opinions. Experts complete a questionnaire in which they express their assessment of the degree to which each element u_i (i = 1, …, n) belongs to the fuzzy set \bar{l}_j (j = 1, …, m).
In the assessment of engineer X’s work experience, experts E1, E2, E3, E4, E5, and E6 were asked to classify the engineer’s experience in the field of Mechanical Engineering into the categories “Extensive work experience”, “Average work experience” and “Limited work experience”. The following experience ranges were proposed for evaluation: “more than 10 years”, “7–10 years”, “3–6 years”, “1–2 years”, “less than 1 year”. Each expert assessed which of these ranges corresponded to each experience category, taking into account the specific requirements and professional standards of the Mechanical Engineering discipline (Table 12).
Based on the results of experts’ responses the degree of membership to the fuzzy set l ¯ j is calculated using the following formula:
\mu_{l_j}(u_i) = \frac{1}{K} \sum_{k=1}^{K} b_{j,i}^{k}, \quad i = 1, \ldots, n \quad (6)

In Equation (6), K is the number of experts, and b^k_{j,i} represents the opinion of the k-th expert (k = 1, …, K) regarding the membership of element u_i in the fuzzy set \bar{l}_j (j = 1, …, m; i = 1, …, n). Expert scores are assumed to be binary, b^k_{j,i} ∈ {0, 1}, where “1” indicates the presence of the fuzzy set properties in element u_i, and “0” indicates their absence.
In Table 13, the absolute values represent the votes cast by the experts for the membership of the corresponding element of the universal set in the fuzzy set, and the normalized values are the degrees of membership calculated using Equation (6). Thus, using fuzzy set theory, the evaluation of the work experience set is performed by a membership function with values in the real unit interval [0,1].
Given that the candidate has 8 years of experience, the 7–10 year range in Table 13 is considered. Within this range, the majority of experts classified the engineer’s experience as “Average work experience”. The corresponding S2 value, based on the maximum number of votes, is 0.7.
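A brief Python sketch of Equation (6) and the selection of S2, using the votes in Table 12; taking the maximum membership degree within the candidate's experience range follows the rule stated above and is assumed here (illustrative only).

```python
# Membership degrees as the share of expert votes (Equation (6)); S2 is the
# highest membership degree in the experience range matching the candidate
# (7-10 years for 8 years of experience).

ranges = [">10", "7-10", "3-6", "1-2", "<1"]

votes = {  # Table 12: binary votes b^k_{j,i} aggregated over the six experts
    "Limited":   [0, 0, 2, 6, 6],
    "Average":   [0, 4, 4, 0, 0],
    "Extensive": [6, 1, 0, 0, 0],
}
K = 6  # number of experts

membership = {term: [round(v / K, 1) for v in row] for term, row in votes.items()}
print(membership)                     # e.g. "Average": [0.0, 0.7, 0.7, 0.0, 0.0]

candidate_range = ranges.index("7-10")            # 8 years of experience
S2 = max(membership[term][candidate_range] for term in membership)
print(S2)                                         # 0.7
```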

2.4. Determination of the Score for the Intensity of Technological Development Within an Engineering Discipline (S3)

The determination of the score for the intensity of technological development within a given engineering type is conducted by analyzing the number of papers published in Q1 and Q2 journals currently indexed in Scopus [36].
Table 14 presents the number of Q1 and Q2 papers for the period from 2019 to 2023.
Based on the data presented in Table 14, the intensity of technological development within the engineering type can be evaluated using the following formula:
I = \frac{(x_2 - x_1) + (x_3 - x_2) + (x_4 - x_3) + (x_5 - x_4)}{4} \quad (7)

where x1, x2, …, x5 denote the numbers of papers published in 2019, 2020, …, 2023, respectively (Table 14).
The results of the calculated values of the intensity of technological development using Equation (7) are presented in Table 15.
As shown in Table 15, the highest value of technological development intensity corresponds to the field of Engineering (miscellaneous), while the lowest value corresponds to the field of Media Technology.
Using proportional scaling and considering the indirect relationship between technological development intensity and certificate validity period, the conversion to a conventional scale can be performed as follows:
S_3 = 1 - \frac{I_e}{I_{max}} \quad (8)

In Equation (8), I_e represents the intensity of technological development for a given engineering type, and I_max is the maximum value of technological development intensity among all engineering types (in this case, I_max = 19, corresponding to Engineering (miscellaneous)).
For Engineer X, working in the field of Mechanical Engineering, the value of S3 calculated using Equation (8) is 0.38.
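The intensity and S3 calculations (Equations (7) and (8)) can be illustrated with the following Python sketch on a subset of Table 14; for the full table, I_max is likewise 19.

```python
# Equation (7): mean year-to-year increase in Q1/Q2 publications over 2019-2023;
# Equation (8): S3 = 1 - I / I_max (inverse relationship with validity period).

papers = {  # Table 14 (subset): Q1/Q2 papers per year, 2019-2023
    "Mechanical Engineering":      [324, 330, 326, 347, 371],
    "Engineering (miscellaneous)": [39, 48, 67, 85, 115],
    "Media Technology":            [36, 33, 36, 36, 38],
}

def intensity(x):
    """Equation (7): average of the four consecutive yearly differences."""
    return sum(x[i + 1] - x[i] for i in range(4)) / 4

I = {field: intensity(x) for field, x in papers.items()}
I_max = max(I.values())                                    # 19.0 (Engineering (miscellaneous))

S3 = {field: round(1 - v / I_max, 2) for field, v in I.items()}   # Equation (8)
print(I, S3)   # Mechanical Engineering: I = 11.75, S3 = 0.38
```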
The results of the proposed model are presented in Table 16.
Thus, the score evaluating the engineer’s level of competence (S1) is 0.67, the score evaluating work experience (S2) is 0.70, and the score evaluating the intensity of technological development in the field of mechanical engineering (S3) is 0.38.
Based on Equation (1), the average score assessing the validity period of the certificate is 0.58, which corresponds to 0.6 on the scale in Table 1.
According to Table 1, the certificate validity period is set at 3 years.

2.5. Risk Assessment Based on Competence and Consequences

In addition to the average competence score (Sa), a structured assessment of the potential risk associated with the recertification intervals is introduced. This risk combines the probability of insufficient competence and the severity of the possible consequences.
The possible consequences of insufficient competence are divided into individual aspects, each of which is rated by experts on a scale from 1 to 5 (1 = minimal impact, 5 = catastrophic). To integrate these scores into the risk level calculation, they are normalized to a 0–1 range by dividing by 5. The results are presented in Table 17.
Along with safety risk, business continuity, financial loss, environmental impact, and regulatory consequences, a baseline indicator of ergonomic education was incorporated into the assessment model. Limited ergonomic awareness may increase safety risks and reduce long-term performance. Therefore, ergonomics was integrated into the risk analysis as a separate consequence category, highlighting the importance of regular ergonomics training as a preventative measure to sustain competence and mitigate risks.
The probability P_fail that an engineer’s competence will be insufficient at the time of recertification is estimated as:

P_{fail} = 1 - S_c \quad (9)
In Equation (9), S_c represents an aggregate measure of competence, professional experience, and technological intensity, calculated as a weighted sum:

S_c = \omega_1 S_1 + \omega_2 S_2 + \omega_3 S_3 \quad (10)
In Equation (10), ω1, ω2, ω3 are the average weights estimated by experts.
Based on expert assessments, the average weights of the parameters were assigned as ω1 = 0.40, ω2 = 0.35, ω3 = 0.38. Using these weights, the competence score S_c was calculated as 0.608. Consequently, the probability P_fail of insufficient competence at the time of recertification is P_fail = 0.417.
Then, the overall risk score R is determined by multiplying the probability of insufficient competence (P_fail) by the average severity (Sev_avg):

R = P_{fail} \cdot Sev_{avg} \quad (11)

Given that the average severity (Sev_avg) is 0.6, the overall risk score R is calculated as 0.25.
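The following Python sketch illustrates Equations (9)–(11). One assumption is made that the text does not state explicitly: the expert weights are normalized to sum to one before computing S_c, which reproduces the reported P_fail ≈ 0.42 and R ≈ 0.25.

```python
# Hedged sketch of Equations (9)-(11). Assumption (not stated in the paper):
# the expert weights are normalized to sum to one before computing S_c.

S1, S2, S3 = 0.67, 0.70, 0.38
weights = [0.40, 0.35, 0.38]                 # expert-assigned weights w1, w2, w3

w = [v / sum(weights) for v in weights]      # assumed normalization
S_c = w[0] * S1 + w[1] * S2 + w[2] * S3      # Equation (10)
P_fail = 1 - S_c                             # Equation (9)

severity = [0.8, 0.8, 0.6, 0.4, 0.6, 0.4]    # Table 17, normalized severities
Sev_avg = sum(severity) / len(severity)      # 0.6

R = P_fail * Sev_avg                         # Equation (11)
print(round(S_c, 3), round(P_fail, 3), round(R, 2))   # 0.582, 0.418, 0.25
```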
The threshold values for mapping the overall risk score R to the recertification intervals are determined by experts, taking into account the criticality of the engineering area, historical experience, and regulatory requirements. The calculated overall risk score R is then compared with the risk-certification period table (Table 18), allowing the recertification interval to be adjusted according to the assessed risk level.
The mapping of risk score to recommended recertification intervals is based on the principle that higher risk levels require shorter intervals to ensure timely reassessment and risk mitigation. The thresholds (≥0.6, 0.3–0.6, <0.3) correspond to high, medium, and low risk levels, providing a practical framework for adjusting certification periods according to both competence and severity of consequences. These thresholds can be further adapted to the criticality requirements of specific industries.
The exact validity period of the certificate is determined by combining the engineer’s competence assessment (Sa) with the calculated risk score (R). The base validity period of the certificate is first determined from Sa using Table 1. This period is then compared with the risk score intervals: if the base validity period of the certificate falls within the range corresponding to the assessed risk, it is retained; if it exceeds the upper limit of the interval, it is reduced accordingly.
For the engineer in question, the aggregated competence score (Sa = 0.6) corresponds to a baseline certification period of 3 years. The calculated risk score of 0.25 falls within the low-risk range (4–5 years). Since the baseline period lies within this range, the final recommended certification period remains 3 years. This approach ensures that the validity period of the certificate reflects both the engineer’s competence and the assessed risk level, providing a transparent and practical method for determining recertification intervals.
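A minimal sketch of this final adjustment rule, combining the Table 1 baseline with the Table 18 risk bands (illustrative; the handling of values exactly at the band boundaries is an assumption):

```python
# The baseline period from Table 1 (via Sa) is compared with the interval
# implied by the risk band (Table 18) and reduced only if it exceeds the
# band's upper limit.

def risk_band(R: float):
    """Table 18: map the risk score to a recommended recertification interval (years)."""
    if R >= 0.6:
        return (1, 1)      # high risk: recertification every 1 year
    if R >= 0.3:
        return (1, 2)      # medium risk: every 1-2 years
    return (4, 5)          # low risk: every 4-5 years

def final_period(base_years: float, R: float) -> float:
    low, high = risk_band(R)
    return min(base_years, high)   # retain the base period unless it exceeds the band

print(final_period(3, 0.25))       # low-risk band (4-5 years) -> 3 years retained
```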

3. Results and Discussion

The obtained results confirm that the certificate validity period in professional engineering certification can be determined when personalized factors are taken into account. In the case studied, considering the engineer’s competence level, work experience, and technological intensity of the engineering field resulted in a certificate validity period that was two years shorter than the standard period. This finding highlights the potential of the proposed model to optimize the certification process by providing more tailored and evidence-based outcomes, particularly for fields characterized by rapid technological change or varying competency requirements.
As shown in Table 16, the assessment of the certificate validity period was calculated as the arithmetic mean of the values S1 = 0.67, S2 = 0.7, and S3 = 0.38. The most significant factors influencing the certificate validity period are the engineer’s level of knowledge and work experience, with S1 = 0.67 and S2 = 0.70, respectively.
The greatest influence on the assessment of the engineer’s competence level (S1) was exerted by criterion C12 (“Commitment to continual professional development and lifelong learning”), for which the engineer received the highest marks and which, according to the expert assessment method, has the greatest weight.
For the parameter reflecting the engineer’s work experience (S2), the value S2 = 0.7 was obtained based on four expert votes.
Regarding the third parameter—the assessment of the technology development intensity by engineering type (S3)—its value was 0.38, as this engineering field is characterized by a relatively high intensity of technology development. This, in turn, indicates that the engineer should undergo re-certification more frequently.
The risk assessment is compared with the recommended recertification intervals based on the principle that higher risk levels require shorter intervals to ensure timely reassessment and risk mitigation. Thresholds for high, medium and low risk provide a practical basis for adjusting the certification periods according to both competence and the severity of the potential consequences and can be adapted to specific industry requirements. The exact validity period is determined by combining the engineer’s competence assessment (Sa) with the calculated risk assessment (R): the base period is derived from Sa using Table 1 and compared with the risk intervals, retained if it falls within the range or reduced if it exceeds the upper limit. For the engineer in question, Sa = 0.6 corresponds to a base period of 3 years, while a risk assessment of 0.25 falls in the low risk range (4–5 years); thus, the final recommended certification period remains 3 years. This approach ensures that the validity of certification reflects both competence and assessed risk, providing a transparent and practical method for determining recertification intervals.
The findings of this study align with the broader trend of applying fuzzy logic to improve decision-making in complex engineering systems. While our model focuses on predicting certificate validity based on expert and bibliometric data, similar methods have been employed to evaluate technical systems and operational alternatives. For instance, Lee and Kang [27] developed a three-phased fuzzy MCDM model to support decision-making in consumer electronics evaluation, and Su et al. [28] demonstrated the utility of trapezoidal fuzzy AHP in assessing manufacturing quality. These approaches confirm the potential of fuzzy models to enhance objectivity and transparency in structured, multi-criteria assessments.
The proposed model could be applied in a broader international context, particularly in countries seeking to improve their national qualification systems or to align their certification practices with global standards such as the Washington Accord. By incorporating both objective (bibliometric) and subjective (expert) assessments, the model can be adapted to different engineering disciplines and regulatory environments. Its modular structure also allows for future integration with digital platforms or decision-support tools used by certification centers and professional bodies.
However, several limitations of the study should be acknowledged. The expert reviews used to determine competency scores were limited to a single institution and region, which may introduce subjective bias. In addition, the model was tested on only one representative case, which does not fully capture the diversity of engineering disciplines and experience profiles. Future research should aim to validate the model on a larger sample, include cross-institutional expert panels, and consider additional input variables such as ethical qualifications, education, or continuing professional development activities.

4. Conclusions

This study investigated the key factors influencing the validity period of engineering certificates, which serves as a critical indicator for ensuring the safety and quality of engineering outcomes. A factor model was developed, integrating three input parameters: engineers’ competencies, professional experience, and the technological intensity of a specific engineering field. These parameters were quantified using expert assessments, fuzzy set theory, and bibliometric analysis of publications in first- and second-quartile (Q1/Q2) Scopus-indexed journals. The output parameter—the certificate validity period—was determined using a lookup table that associates the average input scores with the recommended duration.
The results indicate that these factors significantly influence the recommended certification duration. For instance, when applied to data from the Kazakhstan certification registry, the model suggested an optimal validity period of three years for mechanical engineers, compared to the statutory five years, representing a 40% reduction. This finding addresses the original research question, demonstrating that fixed certification periods may not adequately reflect variations in competence, experience, and technological dynamics.
The proposed model offers certification bodies, policymakers, and engineering organizations a practical tool for tailoring certification timelines to individual engineers’ profiles. Its application is particularly relevant in engineering sectors characterized by rapid technological change, where timely reassessment of competencies is essential for maintaining safety and minimizing occupational risks. More adaptive certification schedules can support evidence-based workforce planning, risk prevention, and the modernization of national qualification systems.
However, this study has certain limitations. It relies on expert judgment within a limited national context, which may not fully capture the diversity of engineering fields worldwide. Further research should aim to validate the model using larger datasets across multiple countries and disciplines, incorporate additional factors such as continuous professional development, and explore the potential for integration with digital certification platforms to enable automated reassessment planning.

Author Contributions

Conceptualization, S.B. and S.R.; methodology, A.M.; validation, Z.K., A.M. and M.K.; formal analysis, M.K.; investigation, S.B.; resources, S.R. and Z.K.; data curation, S.B.; writing—original draft preparation, S.B.; writing—review and editing, M.K.; visualization, A.M.; supervision, S.B.; project administration, Z.K. All authors have read and agreed to the published version of the manuscript.

Funding

This research has been funded by the Science Committee of the Ministry of Science and Higher Education of the Republic of Kazakhstan (Grant No. BR21882257—“Development of a National Model of Engineering Education in the Context of Achieving the Sustainable Development Goals”).

Data Availability Statement

The original contributions presented in this study are included in the article. Further inquiries can be directed to the corresponding author.

Conflicts of Interest

The authors declare no conflicts of interest.

Abbreviations

The following abbreviations are used in this manuscript:
ABETAccreditation Board for Engineering and Technology (USA)
APECAsia-Pacific Economic Cooperation
ASME BPVAmerican Society of Mechanical Engineers—Boiler and Pressure Vessel Code
CEABCanadian Engineering Accreditation Board
CEngChartered Engineer
ECUKEngineering Council United Kingdom
EngTechEngineering Technician
FEFundamentals of Engineering (exam/certification)
ICTTechInformation and Communications Technologies Technician
IEngIncorporated Engineer
IPEJInstitution of Professional Engineers, Japan
JABEEJapan Accreditation Board for Engineering Education
NCEESNational Council of Examiners for Engineering and Surveying (USA)
PEProfessional Engineer
Q1/Q2First and second quartile journal ranking in Scopus

References

  1. Francis, N.; Norton, E. Educating Civil Engineers for the Twenty-First Century: The ‘New-Model Engineer’. Proc. Inst. Civ. Eng.-Civ. Eng. 2023, 177, 63–71. [Google Scholar] [CrossRef]
  2. Gauvreau, P. Sustainable Education for Bridge Engineers. J. Traffic Transp. Eng. 2018, 5, 510–519. [Google Scholar] [CrossRef]
  3. Byrne, E.P. The Evolving Engineer; Professional Accreditation Sustainability Criteria and Societal Imperatives and Norms. Educ. Chem. Eng. 2023, 43, 23–30. [Google Scholar] [CrossRef]
  4. ISO 9001:2015; Quality Management Systems—Requirements. International Organization for Standardization: Geneva, Switzerland, 2015.
  5. ISO 45001:2018; Occupational Health and Safety Management Systems—Requirements with Guidance for Use. International Organization for Standardization: Geneva, Switzerland, 2018.
  6. IEC 61508:2010; Functional Safety of Electrical/Electronic/Programmable Electronic Safety-Related Systems. International Electrotechnical Commission: Geneva, Switzerland, 2010.
  7. Tomassi, A.; Falegnami, A.; Meleo, L.; Romano, E. The GreenSCENT Competence Frameworks. In The European Green Deal in Education; Routledge: London, UK, 2024; pp. 25–44. [Google Scholar]
  8. Haro, P.; Villanueva Perales, Á.L.; Fernández-Baco, C.; Rodriguez-Galán, M.; Morillo, J. EUR-ACE Accreditation for Chemical Engineering in Spain: Current Situation, Lessons Learned and Challenges. Educ. Chem. Eng. 2023, 45, 19–27. [Google Scholar] [CrossRef]
  9. Raghunath, N.; Haapala, K.R.; Sanchez, C.A. Examining Industry Expectations for Content Knowledge in Mechatronics across Career and Professional Certificate Programs. Manuf. Lett. 2023, 35, 1230–1235. [Google Scholar] [CrossRef]
  10. Murray, S.L.; Lynch-Caris, T.M. Educating the Professional Engineer of 2020: The Changing Licensure Requirements. In Proceedings of the ASEE Annual Conference and Exposition, Atlanta, GA, USA, 23–26 June 2013. [Google Scholar]
  11. Cai, H.; Peng, C.; Zhu, C.; Ouyang, M. The Research of Civil Engineering Certification Information Management System. E3S Web Conf. 2021, 252, 03016. [Google Scholar] [CrossRef]
  12. Veroya, F.C.; Ong, A.K.S.; Young, M.N.; German, J.D. Actual Uptake Pursuance Analysis of Certification Examination among Industrial Engineers in the Philippines: A TPB-PVT Approach. Acta Psychol. 2024, 248, 104399. [Google Scholar] [CrossRef]
  13. Sun, J.; Liu, D.; Kou, L.; Lin, Y. Regarding Engineering Education Professional Certification as a Starting Point, Do a Good Job of Audit Assessment. In e-Learning, e-Education, and Online Training; Lecture Notes of the Institute for Computer Sciences, Social Informatics and Telecommunications Engineering; Springer: Cham, Switzerland, 2018; Volume 243, pp. 287–291. Available online: https://www.researchgate.net/publication/326071614_Regarding_Engineering_Education_Professional_Cer (accessed on 9 May 2025).
  14. Robles, R.; Quadrado, J. Analyzing the ASME BPV Code of Construction Professional Engineer Accreditation Requirements and Their Impact in Central, South America and Mexico. In Proceedings of the LACCEI international Multi-conference for Engineering, Education and Technology, Buenos Aires, Argentina, 19–21 July 2023. [Google Scholar] [CrossRef]
  15. Khamis, N.K.; Harun, Z.; Tahir, M.F.M.; Wahid, Z.; Sabri, M.A.M. Motivational Factors of Professional Engineers and Non-Professional Engineers in Applying for License as Professional Engineer: A Comparative Study. Int. Educ. Stud. 2013, 6, 124–130. [Google Scholar] [CrossRef]
  16. Zheng, W.; Xing, L.; Jiang, G.; Xu, J. Optimization and Reform of Talent Training Scheme for Engineering Majors Under the Background of Engineering Certification. Adv. Educ. Technol. Psychol. 2021, 5, 134–144. Available online: https://clausiuspress.com/article/2152 (accessed on 9 May 2025).
  17. Widiasanti, I.; Tamin, R.Z. A Review on Certification Procedure for Professionals Engineer Based on Engineering Act in Indonesia. In Proceedings of the International Conference : Issues, Management And Engineering In The Sustainable Development On Delta Areas Semarang, Semarang, Indonesia, 11–22 August 2015. [Google Scholar]
  18. Widiasanti, I.; Tamin, R.Z.; Marzuki, P.F.; Wiratmaja, I.I. Development of Civil Engineers’ Certification System Evaluation Model. IOP Conf. Ser. Mater. Sci. Eng. 2018, 434, 012196. [Google Scholar] [CrossRef]
  19. Sengupta, N.; Chawla, N.; Agarwal, A.; Evans, J. Do Online Certifications Improve Job Market Outcomes? Evidence from an IT Skills Certification Platform in India. Inf. Econ. Policy 2023, 65, 101067. [Google Scholar] [CrossRef]
  20. Dubljević, S.; Tepavčević, B.; Markoski, B.; Anđelković, A.S. Computational BIM Tool for Automated LEED Certification Process. Energy Build. 2023, 292, 113168. [Google Scholar] [CrossRef]
  21. Bagert, D.J. Licensing and Certification of Software Professionals. Adv. Comput. 2004, 60, 1–34. [Google Scholar] [CrossRef]
  22. Wang, B.; Wu, C.; Li, J.; Zhang, L.; Huang, L.; Kang, L. Certified Safety Engineer (CSE) as a New Official Profession in China: A Brief Review. Saf. Sci. 2019, 116, 108–115. [Google Scholar] [CrossRef]
  23. Gao, Y.; Xiong, J. Research on the Implementation of Higher Engineering Education Certification Status and Comparative. Adv. Vocat. Tech. Educ. 2024, 6, 191–198. [Google Scholar] [CrossRef]
  24. Buchanan, W.W.; McNeill, P.R. Professional Registration of Engineering Technology Graduates. J. Eng. Technol. 1998, 15, 12–19. [Google Scholar]
  25. Grebski, M.; Wolniak, R. Procedures for Obtaining an Engineering License in the USA and Canada. Zesz. Naukowe. Organ. I Zarządzanie/Politech. Śląska 2019, 136, 145–152. [Google Scholar] [CrossRef]
  26. Nakanishi, S. The Chartered Engineer System and the State of the Art of the Engineering Education of Universities in United Kingdom. Proc. JSME Annu. Meet. 2003, 2003.5, 441–442. [Google Scholar] [CrossRef]
  27. Lee, A.H.I.; Kang, H.Y. A Three-Phased Fuzzy Logic Multi-Criteria Decision-Making Model for Evaluating Operation Systems for Smart TVs. Appl. Sci. 2023, 13, 7869. [Google Scholar] [CrossRef]
  28. Su, J.; Liang, Y.; Yu, Y.; Wang, F.; Zhou, J.; Liu, L.; Gao, Y. Quality Evaluation of Effective Abrasive Grains Micro-Edge Honing Based on Trapezoidal Fuzzy Analytic Hierarchy Process and Set Pair Analysis. Appl. Sci. 2024, 14, 10939. [Google Scholar] [CrossRef]
  29. Graduate Attributes & Professional Competencies. International Engineering Alliance. 2013. Available online: http://www.ieagreements.org (accessed on 9 May 2025).
  30. Paramasivam, S.; Mutusamy, K.; Tan, K. Study of the Effectiveness of the Implementation of Washington Accord in Malaysia’s Engineering Undergraduate Programme Using SEM. Procedia Soc. Behav. Sci. 2013, 90, 803–812. [Google Scholar] [CrossRef]
  31. Iriste, S.; Paed, M.; Katane, I. Expertise as a Research Method in Education. Environ. Educ. Pers. 2018, 11, 11–12. [Google Scholar] [CrossRef]
  32. Bekenov, T.N.; Kornev, V.A.; Mashekenova, A.S. Quality Management System for Business Processes of Production and Operation of Complex Technical Systems; Shygys Akparat: Ust-Kamenogorsk, Kazakhstan, 2010. [Google Scholar]
  33. Höhle, U. On the Mathematical Foundations of Fuzzy Set Theory. Fuzzy Sets Syst. 2022, 444, 1–9. [Google Scholar] [CrossRef]
  34. Chen, L.; Pan, W. Fuzzy Set Theory and Extensions for Multi-Criteria Decision-Making in Construction Management; Emerald Publishing Limited: Leeds, UK, 2018; Available online: https://repository.hku.hk/handle/10722/262151 (accessed on 9 May 2025).
  35. Tian, X.; Ma, J.; Li, L.; Xu, Z.; Tang, M. Development of Prospect Theory in Decision Making with Different Types of Fuzzy Sets: A State-of-the-Art Literature Review. Inf. Sci. 2022, 615, 504–528. [Google Scholar] [CrossRef]
  36. Scopus-Homepage. Available online: https://www.scopus.com/pages/home#basic (accessed on 9 May 2025).
Figure 1. Key factors influencing the validity period of certificates through their impact on certification exam outcome.
Table 1. Mapping average input parameter values to the certificate validity period.
Average of the input parameters (Sa): 0 | 0.1 | 0.2 | 0.3 | 0.4 | 0.5 | 0.6 | 0.7 | 0.8 | 0.9 | 1
Certificate validity period (years): 0 | 0.5 | 1 | 1.5 | 2 | 2.5 | 3 | 3.5 | 4 | 4.5 | 5
Table 2. Characteristics of Engineer X.
Parameter | Value
Name/Identifier | Engineer X
Date of issue of the certificate | 5 May 2020
Standard validity period of the certificate (according to the current legislation of Kazakhstan) | 5 years
Professional experience | 8 years
Field of professional activity | Mechanical engineering
Recording of violations/disciplinary sanctions | None
Table 3. Evaluation criteria for assessing engineer competencies.
No | Name of Criterion | Designation
1 | Knowledge and understanding of the natural sciences (mathematics, physics, computer science) and the fundamentals of engineering for solving complex engineering problems. | C1
2 | Ability to analyze complex engineering products, processes, and systems within both their specific field and a broader interdisciplinary context. | C2
3 | Ability to develop and design solutions for complex engineering problems in accordance with specified requirements. | C3
4 | Ability to conduct research on complex engineering problems using scientifically grounded approaches and methods. | C4
5 | Ability to apply new technologies and modern tools to solve complex engineering problems. | C5
6 | Ability to evaluate and consider the social implications of engineering practice. | C6
7 | Ability to assess and account for the impact of engineering activities on sustainable development. | C7
8 | Ability to adhere to professional ethics, responsibility, and standards of engineering practice. | C8
9 | Ability to work effectively in a team and manage diverse, multidisciplinary teams. | C9
10 | Proficiency in communication and knowledge of foreign languages. | C10
11 | Ability to demonstrate leadership qualities and apply principles of engineering management. | C11
12 | Commitment to continual professional development and lifelong learning. | C12
Table 4. Assessment of the Relative Importance of the Criteria Using a Ten-Point Scale.
Expert’s Number | C1 | C2 | C3 | C4 | C5 | C6 | C7 | C8 | C9 | C10 | C11 | C12
E1 | 10 | 10 | 10 | 10 | 10 | 10 | 10 | 9 | 9 | 8 | 9 | 10
E2 | 8 | 8 | 10 | 10 | 10 | 9 | 9 | 10 | 9 | 9 | 7 | 10
E3 | 7 | 8 | 8 | 5 | 8 | 5 | 5 | 5 | 3 | 2 | 5 | 9
E4 | 10 | 10 | 8 | 10 | 10 | 10 | 10 | 10 | 10 | 10 | 10 | 10
E5 | 10 | 8 | 8 | 6 | 6 | 10 | 8 | 8 | 9 | 10 | 8 | 8
E6 | 10 | 9 | 9 | 9 | 10 | 9 | 10 | 10 | 10 | 4 | 10 | 10
Table 5. Results of the Experts’ Mutual Assessment (10-Point Scale).
Expert | E1 | E2 | E3 | E4 | E5 | E6
E1 | 10 | 5 | 7 | 8 | 8 | 9
E2 | 5 | 7 | 9 | 4 | 7 | 8
E3 | 6 | 8 | 7 | 5 | 4 | 9
E4 | 8 | 7 | 9 | 9 | 10 | 10
E5 | 7 | 9 | 5 | 7 | 8 | 8
E6 | 6 | 7 | 9 | 8 | 9 | 7
Table 6. Relative assessments of the experts.
Expert | Ki | C1 | C2 | C3 | C4 | C5 | C6 | C7 | C8 | C9 | C10 | C11 | C12
E1 | 7.83 | 0.0870 | 0.0870 | 0.0870 | 0.0870 | 0.0870 | 0.0870 | 0.0870 | 0.0783 | 0.0783 | 0.0696 | 0.0783 | 0.0870
E2 | 6.67 | 0.0734 | 0.0734 | 0.0917 | 0.0917 | 0.0917 | 0.0826 | 0.0826 | 0.0917 | 0.0826 | 0.0826 | 0.0642 | 0.0917
E3 | 6.50 | 0.1000 | 0.1143 | 0.1143 | 0.0714 | 0.1143 | 0.0714 | 0.0714 | 0.0714 | 0.0429 | 0.0286 | 0.0714 | 0.1286
E4 | 8.83 | 0.0847 | 0.0847 | 0.0678 | 0.0847 | 0.0847 | 0.0847 | 0.0847 | 0.0847 | 0.0847 | 0.0847 | 0.0847 | 0.0847
E5 | 7.33 | 0.1010 | 0.0808 | 0.0808 | 0.0606 | 0.0606 | 0.1010 | 0.0808 | 0.0808 | 0.0909 | 0.1010 | 0.0808 | 0.0808
E6 | 7.67 | 0.0909 | 0.0818 | 0.0818 | 0.0818 | 0.0909 | 0.0818 | 0.0909 | 0.0909 | 0.0909 | 0.0364 | 0.0909 | 0.0909
Table 7. Ranked assessments.
Expert | C1 | C2 | C3 | C4 | C5 | C6 | C7 | C8 | C9 | C10 | C11 | C12 | Sum of Ranks
E1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 2 | 2 | 3 | 2 | 1 | 17
E2 | 3 | 3 | 1 | 1 | 1 | 2 | 2 | 1 | 2 | 2 | 4 | 1 | 23
E3 | 3 | 2 | 2 | 4 | 2 | 4 | 4 | 4 | 5 | 6 | 4 | 1 | 41
E4 | 1 | 1 | 2 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 13
E5 | 1 | 3 | 3 | 4 | 4 | 1 | 3 | 3 | 2 | 1 | 3 | 3 | 31
E6 | 1 | 2 | 2 | 2 | 1 | 2 | 1 | 1 | 1 | 3 | 1 | 1 | 18
Table 8. Correlation matrix of expert rankings.
Expert | E1 | E2 | E3 | E4 | E5 | E6
E1 | 1 | 0.94 | 0.79 | 0.97 | 0.86 | 0.98
E2 | 0.94 | 1 | 0.81 | 0.92 | 0.87 | 0.93
E3 | 0.79 | 0.81 | 1 | 0.68 | 0.79 | 0.77
E4 | 0.97 | 0.92 | 0.68 | 1 | 0.86 | 0.98
E5 | 0.86 | 0.87 | 0.79 | 0.86 | 1 | 0.87
E6 | 0.98 | 0.93 | 0.77 | 0.98 | 0.87 | 1
Table 9. Resulting ranking of criteria.
Expert | Ki | C1 | C2 | C3 | C4 | C5 | C6 | C7 | C8 | C9 | C10 | C11 | C12
E1 | 7.83 | 0.0870 | 0.0870 | 0.0870 | 0.0870 | 0.0870 | 0.0870 | 0.0870 | 0.0783 | 0.0783 | 0.0696 | 0.0783 | 0.0870
E2 | 6.67 | 0.0734 | 0.0734 | 0.0917 | 0.0917 | 0.0917 | 0.0826 | 0.0826 | 0.0917 | 0.0826 | 0.0826 | 0.0642 | 0.0917
E3 | 6.50 | 0.1000 | 0.1143 | 0.1143 | 0.0714 | 0.1143 | 0.0714 | 0.0714 | 0.0714 | 0.0429 | 0.0286 | 0.0714 | 0.1286
E4 | 8.83 | 0.0847 | 0.0847 | 0.0678 | 0.0847 | 0.0847 | 0.0847 | 0.0847 | 0.0847 | 0.0847 | 0.0847 | 0.0847 | 0.0847
E5 | 7.33 | 0.1010 | 0.0808 | 0.0808 | 0.0606 | 0.0606 | 0.1010 | 0.0808 | 0.0808 | 0.0909 | 0.1010 | 0.0808 | 0.0808
E6 | 7.67 | 0.0909 | 0.0818 | 0.0818 | 0.0818 | 0.0909 | 0.0818 | 0.0909 | 0.0909 | 0.0909 | 0.0364 | 0.0909 | 0.0909
Aj |  | 0.0894 | 0.0866 | 0.0860 | 0.0797 | 0.0876 | 0.0850 | 0.0833 | 0.0831 | 0.0793 | 0.0680 | 0.0790 | 0.0929
Pj (rank) |  | 2 | 4 | 5 | 9 | 3 | 6 | 7 | 8 | 10 | 12 | 11 | 1
Table 10. Experts’ scores.
Expert | C1 | C2 | C3 | C4 | C5 | C6 | C7 | C8 | C9 | C10 | C11 | C12
E1 | 0.7 | 0.7 | 0.9 | 0.6 | 0.8 | 0.5 | 0.6 | 0.8 | 0.9 | 0.5 | 0.7 | 0.7
E2 | 0.5 | 0.5 | 0.6 | 0.7 | 0.5 | 0.7 | 0.8 | 0.6 | 0.9 | 0.7 | 0.4 | 0.8
E3 | 0.9 | 0.5 | 0.7 | 0.7 | 0.6 | 0.9 | 0.6 | 0.6 | 0.4 | 0.8 | 0.5 | 0.6
E4 | 0.6 | 0.8 | 0.5 | 0.5 | 0.8 | 0.6 | 0.7 | 0.9 | 0.7 | 0.6 | 0.8 | 0.5
E5 | 0.6 | 0.8 | 0.6 | 0.9 | 0.7 | 0.4 | 0.6 | 0.5 | 0.8 | 0.9 | 0.7 | 0.6
E6 | 0.8 | 0.9 | 0.6 | 0.6 | 0.9 | 0.8 | 0.4 | 0.5 | 0.8 | 0.7 | 0.6 | 0.9
Table 11. Calculation of criterion S1 with weighting coefficients.
Expert | C1 | C2 | C3 | C4 | C5 | C6 | C7 | C8 | C9 | C10 | C11 | C12 | Sum
E1 | 0.063 | 0.061 | 0.077 | 0.048 | 0.070 | 0.043 | 0.050 | 0.066 | 0.071 | 0.034 | 0.055 | 0.065 | 0.70
E2 | 0.045 | 0.043 | 0.052 | 0.056 | 0.044 | 0.060 | 0.067 | 0.050 | 0.071 | 0.048 | 0.032 | 0.074 | 0.64
E3 | 0.080 | 0.043 | 0.060 | 0.056 | 0.053 | 0.077 | 0.050 | 0.050 | 0.032 | 0.054 | 0.040 | 0.056 | 0.65
E4 | 0.054 | 0.069 | 0.043 | 0.040 | 0.070 | 0.051 | 0.058 | 0.075 | 0.056 | 0.041 | 0.063 | 0.046 | 0.67
E5 | 0.054 | 0.069 | 0.052 | 0.072 | 0.061 | 0.034 | 0.050 | 0.042 | 0.063 | 0.061 | 0.055 | 0.056 | 0.67
E6 | 0.072 | 0.078 | 0.052 | 0.048 | 0.079 | 0.068 | 0.033 | 0.042 | 0.063 | 0.048 | 0.048 | 0.084 | 0.71
Average: S1 = 0.67
Table 12. Experts’ assessments of work experience.
k | Linguistic Variables | Over 10 Years | 7–10 Years | 3–6 Years | 1–2 Years | Less Than 1 Year
E1 | Limited work experience | 0 | 0 | 0 | 1 | 1
E1 | Average work experience | 0 | 1 | 1 | 0 | 0
E1 | Extensive work experience | 1 | 0 | 0 | 0 | 0
E2 | Limited work experience | 0 | 0 | 1 | 1 | 1
E2 | Average work experience | 0 | 1 | 0 | 0 | 0
E2 | Extensive work experience | 1 | 0 | 0 | 0 | 0
E3 | Limited work experience | 0 | 0 | 0 | 1 | 1
E3 | Average work experience | 0 | 1 | 1 | 0 | 0
E3 | Extensive work experience | 1 | 0 | 0 | 0 | 0
E4 | Limited work experience | 0 | 0 | 0 | 1 | 1
E4 | Average work experience | 0 | 0 | 1 | 0 | 0
E4 | Extensive work experience | 1 | 1 | 0 | 0 | 0
E5 | Limited work experience | 0 | 0 | 0 | 1 | 1
E5 | Average work experience | 0 | 0 | 1 | 0 | 0
E5 | Extensive work experience | 1 | 1 | 0 | 0 | 0
E6 | Limited work experience | 0 | 0 | 1 | 1 | 1
E6 | Average work experience | 0 | 1 | 0 | 0 | 0
E6 | Extensive work experience | 1 | 0 | 0 | 0 | 0
Table 13. Results of experts’ evaluations.
Linguistic Variables | Type | Over 10 Years | 7–10 Years | 3–6 Years | 1–2 Years | Less Than 1 Year
Limited work experience | Absolute value | 0 | 0 | 2 | 6 | 6
Limited work experience | Normalized value | 0 | 0 | 0.3 | 1 | 1
Average work experience | Absolute value | 0 | 4 | 4 | 0 | 0
Average work experience | Normalized value | 0 | 0.7 | 0.7 | 0 | 0
Extensive work experience | Absolute value | 6 | 1 | 0 | 0 | 0
Extensive work experience | Normalized value | 1 | 0.2 | 0 | 0 | 0
Table 14. Number of papers published in Q1 and Q2 journals (2019 to 2023) [36].
Name of the Engineering Type | 2019 | 2020 | 2021 | 2022 | 2023
Aerospace Engineering | 72 | 72 | 71 | 74 | 85
Architecture | 80 | 89 | 103 | 115 | 121
Automotive Engineering | 49 | 54 | 57 | 63 | 68
Biomedical Engineering | 138 | 150 | 172 | 187 | 200
Building and Construction | 108 | 119 | 121 | 134 | 144
Civil and Structural Engineering | 184 | 198 | 203 | 219 | 235
Computational Mechanics | 37 | 40 | 40 | 44 | 45
Control and System Engineering | 145 | 152 | 156 | 162 | 183
Electrical and Electronic Engineering | 369 | 379 | 388 | 408 | 439
Engineering (miscellaneous) | 39 | 48 | 67 | 85 | 115
General Engineering | 155 | 155 | 158 | 160 | 162
Industrial and Manufacturing Engineering | 180 | 178 | 176 | 185 | 203
Mechanical Engineering | 324 | 330 | 326 | 347 | 371
Mechanics of Materials | 213 | 225 | 221 | 224 | 241
Media Technology | 36 | 33 | 36 | 36 | 38
Ocean Engineering | 48 | 50 | 54 | 53 | 57
Safety, Risk, Reliability and Quality | 91 | 97 | 95 | 100 | 113
Table 15. Calculated values of technological development intensity.
Name of the Engineering Type | Technological Development Intensity, I | Score, S3
Aerospace Engineering | 3.25 | 0.8
Architecture | 10.25 | 0.5
Automotive Engineering | 4.75 | 0.8
Biomedical Engineering | 15.5 | 0.2
Building and Construction | 9 | 0.5
Civil and Structural Engineering | 12.75 | 0.3
Computational Mechanics | 2 | 0.9
Control and System Engineering | 9.5 | 0.5
Electrical and Electronic Engineering | 17.5 | 0.1
Engineering (miscellaneous) | 19 | 0
General Engineering | 1.75 | 0.9
Industrial and Manufacturing Engineering | 5.75 | 0.7
Mechanical Engineering | 11.75 | 0.4
Mechanics of Materials | 7 | 0.6
Media Technology | 0.5 | 1
Ocean Engineering | 2.25 | 0.9
Safety, Risk, Reliability and Quality | 5.5 | 0.7
Table 16. Calculated values of the factors.
S1 | S2 | S3 | Certificate Validity Score | Certificate Validity
0.67 | 0.7 | 0.38 | 0.6 | 3 years
Table 17. Severity of potential consequences.
Aspect | Severity, Sev | Normalized Value of Severity
Safety risk | 4 | 0.8
Business continuity | 4 | 0.8
Financial loss | 3 | 0.6
Environmental impact | 2 | 0.4
Regulatory consequences | 3 | 0.6
Ergonomic awareness and training | 2 | 0.4
Table 18. Recommended recertification intervals based on risk score.
Risk Score, R | Recommended Interval
≥0.6 | High risk (recertification every 1 year)
0.3–0.6 | Medium risk (recertification every 1–2 years)
<0.3 | Low risk (recertification every 4–5 years)