Article

Readiness and Perceptions of IPSAS 46 “Measurement” Implementation in Public Sector Entities: Evidence from Georgia

Department of Accounting, Ivane Javakhishvili Tbilisi State University, Tbilisi 0186, Georgia
*
Author to whom correspondence should be addressed.
J. Risk Financial Manag. 2026, 19(1), 36; https://doi.org/10.3390/jrfm19010036
Submission received: 21 October 2025 / Revised: 19 December 2025 / Accepted: 22 December 2025 / Published: 5 January 2026
(This article belongs to the Topic Risk Management in Public Sector)

Abstract

This article examines the challenges associated with implementing and applying the new valuation approaches introduced by IPSAS 46 in the public sector, specifically fair value and current operational value. The analysis is based on thematic survey data collected from professionals engaged in public sector accounting and reporting, as well as a diverse range of documentary sources. Although the study does not include a factor analysis and its conclusions are limited to the Georgian context, thematic grouping revealed key determinants influencing the implementation of IPSAS: professional competence, resource availability, and managerial attitudes towards development and perceived need. Additional situational limitations existed, as the study was conducted prior to the official translation of IPSAS 46 and its incorporation into the national guidance. Despite these constraints, the findings may hold relevance for other countries facing similar challenges of limited resources, professional capacity, and managerial attitudes. The study provides recommendations for integrating IPSAS 46 principles into local standards.

1. Introduction

Public sector financial reporting has undergone significant transformation over the past decades as governments increasingly adopt international standards aimed at enhancing transparency, accountability, and comparability (Christensen et al., 2018; Oulasvirta, 2014). Among these reforms, the International Public Sector Accounting Standards (IPSAS) have become the dominant global framework for improving the quality of public sector information. The newly issued IPSAS 46 Measurement introduces a comprehensive and conceptually unified approach to valuation, requiring entities to apply consistent measurement bases and to assess current values in accordance with market-based, cost-based, and income-based techniques (IPSASB, 2023c). These requirements impose substantive technical, institutional, and cognitive demands on public sector entities, particularly in jurisdictions with evolving accounting infrastructures and limited experience with valuation-based reporting (Sabauri et al., 2022).
While extensive research has examined public sector accounting reforms more broadly (Grossi & Steccolini, 2015), measurement-specific standards such as IPSAS 46 have received little empirical attention, constituting a critical gap in the literature. Existing studies highlight persistent challenges in implementing valuation standards, including insufficient technical capacity, institutional resistance, resource constraints, and ambiguity surrounding measurement choices (Caruana et al., 2019; Polzer et al., 2021). However, no studies to date systematically analyze how public sector entities perceive the new IPSAS 46 Measurement model, how prepared they are to operationalize its requirements, or how contextual factors shape their readiness to engage with valuation principles.
Furthermore, research on accounting change in transitional and developing public sectors indicates that reforms often fail not due to technical weaknesses but because organizations lack institutional readiness, managerial competence, and supporting infrastructures (Hyndman & Connolly, 2011; van Helden & Uddin, 2016). The introduction of IPSAS 46 heightens these concerns: valuation techniques require specialized knowledge, robust data systems, and judgment-intensive assessments, all of which impose significant challenges in environments with limited evaluation expertise and evolving regulatory frameworks.
The Georgian public sector presents a relevant empirical context. As a country progressing toward greater alignment with international public sector standards, Georgia demonstrates both structural commitment and capacity limitations. Although IPSAS adoption is ongoing, the degree to which entities understand, internalize, and operationalize the principles of IPSAS 46 remains unclear. This gap motivates the present study.

1.1. Research Problem

Despite the conceptual significance of IPSAS 46, little is known about how public sector practitioners interpret the standard and assess their readiness to apply its measurement requirements. Without such understanding, implementation risks remain high, and the broader goals of comparability and transparency may not be achieved.

1.2. Purpose of the Study

The purpose of this study is to empirically assess the following factors:
  • Public sector practitioners’ perceptions of IPSAS 46 and its measurement principles;
  • The extent of organizational readiness—technical, institutional, and cognitive—to implement the standard;
  • The key challenges and capacity gaps shaping adoption prospects.

1.3. Theoretical Contribution

This study contributes to the literature in three ways:
  • It proposes a theory-informed analytical framework of readiness for measurement reforms, integrating institutional theory (DiMaggio & Powell, 2000), change readiness frameworks (Holt et al., 2007), and valuation theory applied to the public sector.
  • It extends the literature on public sector measurement uncertainty, offering empirical insights into how practitioners interpret current value measurement requirements and associated challenges (Polzer et al., 2021).
  • It provides the first empirical evidence on IPSAS 46 readiness, addressing a significant gap in both international and local scholarship.

1.4. Research Questions

In line with the exploratory nature of the study and the novelty of IPSAS 46 Measurement, the research addresses the following questions:
RQ1: How do public sector practitioners perceive the principles and requirements of IPSAS 46 Measurement?
RQ2: To what extent are practitioners’ readiness perceptions associated with professional competence, institutional support, and resource availability?
RQ3: What implementation challenges are perceived as most significant in the early stages of IPSAS 46 adoption?

1.5. Practical Contribution

The findings inform policymakers, regulators, and public sector management by identifying capacity gaps, training needs, and systemic constraints that may hinder the successful implementation of IPSAS 46.

1.6. Structure of the Paper

The remainder of the paper is structured as follows: Section 2 develops the theoretical framework; Section 3 presents the methodology; Section 4 reports the empirical results; Section 5 discusses the findings; and Section 6 concludes with policy implications and future research directions.

2. Theoretical Framework

Public sector accounting reforms are shaped by complex interactions among institutional pressures, organizational capacities, and professional competencies. Understanding readiness for IPSAS 46 Measurement, therefore, requires an integrated theoretical perspective that connects accounting change theory, institutional frameworks, and valuation concepts. This section develops a multidimensional theoretical foundation for analyzing how public sector organizations perceive and internalize measurement reforms (ACCA, 2017).

2.1. Institutional Theory and Public Sector Accounting Change

Institutional theory provides a foundational lens for explaining why and how public sector entities adopt new accounting practices. According to DiMaggio and Powell (2000), organizations respond to coercive, normative, and mimetic pressures, which shape the adoption of formal structures and reporting practices. In the context of international accounting reforms, governments often adopt IPSAS due to external mandates, modernization agendas, donor expectations, or aspirations toward international legitimacy (Dung & Lien, 2024; van Helden & Uddin, 2016).
Public sector accounting change research highlights that reforms frequently face symbolic adoption, resistance, or partial implementation, especially when new standards challenge existing routines or require substantial technical adaptation (Lapsley, 2009; Grossi & Steccolini, 2015). The introduction of IPSAS 46, with its emphasis on current value measurement, amplifies these institutional tensions by requiring valuation techniques that may not align with existing administrative capacities or cultural norms (Beke-Trivunac & Živkov, 2024). Therefore, readiness for IPSAS 46 can be conceptualized as a function of how institutional logics support or constrain the adoption of measurement reforms.

2.2. Readiness for Change: A Multidimensional Construct

Readiness for change is defined as the extent to which organizational members exhibit the cognitive, emotional, and behavioral willingness to implement reforms (Holt et al., 2007). In public sector contexts, readiness is particularly important because reforms often require shifts in professional identity, competencies, and organizational routines (Hyndman & Connolly, 2011).
The literature identifies several core dimensions relevant to IPSAS 46:
  • Change Commitment—belief that the reform is necessary and beneficial;
  • Change Efficacy—belief in the organization’s ability to successfully implement it;
  • Task-Specific Competence—technical capacity to perform new measurement tasks;
  • Resource Availability—financial, human, and informational resources to support adoption (Wang, 2014).
These dimensions interact to shape practitioners’ perceptions of feasibility and desirability. Because IPSAS 46 requires value-based measurement (e.g., fair value, replacement cost, present value), readiness extends beyond procedural adjustments to deeper issues of professional judgment and valuation expertise (Vardiashvili, 2024; Cenar & Cioban, 2022).
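To make these measurement demands concrete, the short Python sketch below illustrates an income-approach (present value) calculation of the kind IPSAS 46 associates with current value measurement. It is a minimal illustration only: the cash-flow series, discount rate, and currency are assumed figures and are not drawn from the study or the standard.

```python
# Illustrative only: a minimal income-approach (present value) calculation.
# The cash flows and discount rate are hypothetical assumptions, not figures
# from the study or from IPSAS 46 itself.

def present_value(cash_flows, discount_rate):
    """Discount expected annual cash flows (years 1..n) to a present value."""
    return sum(cf / (1 + discount_rate) ** year
               for year, cf in enumerate(cash_flows, start=1))

expected_cash_flows = [12_000, 12_000, 11_500, 11_000, 10_500]  # assumed annual inflows, GEL
rate = 0.08                                                      # assumed discount rate

print(f"Present value: {present_value(expected_cash_flows, rate):,.0f} GEL")
```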

2.3. Measurement Complexity and Valuation Theory in the Public Sector

Measurement under IPSAS 46 involves selecting appropriate bases and techniques to estimate the current value. Valuation in the public sector is inherently challenging due to limited market comparability, non-exchange transactions, specialized assets, and uncertainty in determining service potential (Caruana et al., 2019; Polzer et al., 2021).
The literature identifies three key components of measurement complexity:
  • Technical Complexity—difficulty in applying measurement techniques that require specialized expertise or judgment;
  • Data and Information Constraints—lack of market data, asset condition information, or historical cost structures;
  • Cognitive Uncertainty—ambiguity in interpreting valuation guidance or selecting a measurement basis (ICAEW, 2023).
These constraints frequently result in inconsistent or unreliable valuations, leading to reduced comparability and skepticism among practitioners (Oulasvirta, 2014). Understanding practitioners’ perceptions of complexity is therefore central to assessing readiness for IPSAS 46 (IPSASB, 2021).

2.4. Organizational Factors Shaping IPSAS 46 Readiness

Building on the prior streams of literature, four organizational determinants emerge as critical for successful measurement reform:
(a) Technical Capacity: Readiness depends on whether entities possess adequate knowledge, professional skills, and IT systems to apply IPSAS-compliant measurement methodologies. Studies show that technical deficits are among the most common causes of implementation failures (Polzer et al., 2021).
(b) Institutional Support and Governance: Governance structures influence reform success by establishing internal controls, providing formal guidance, and promoting adoption through leadership commitment (Grossi & Steccolini, 2015). Weak institutional support reduces readiness even when technical knowledge exists.
(c) Resource Availability: Effective valuation requires time, budget, data systems, and expert support. Resource scarcity is frequently cited as a barrier in developing or transitional public sectors (van Helden & Uddin, 2016).
(d) Professional Competence and Training: Because IPSAS 46 requires extensive judgment, capacity-building and continuous education are essential. Lack of training directly undermines the ability of practitioners to apply the standard meaningfully (Caruana et al., 2019).

2.5. Conceptual Model of Readiness for IPSAS 46

Drawing from institutional theory, change readiness models, and valuation literature, this study conceptualizes readiness for IPSAS 46 as a multidimensional construct influenced by four interrelated factors:
  • Institutional Pressures → motivation to adopt measurement reforms;
  • Technical Capacity → capability to perform valuation;
  • Professional Competence → knowledge and judgment required for IPSAS 46;
  • Perceived Measurement Complexity → cognitive and practical barriers.
Under this model, readiness emerges when institutional demands align with organizational and professional capacities, and when the perceived complexity of IPSAS 46 is manageable within existing capabilities (Ada & Christiaens, 2018).
This theoretical framework informs the development of the research instrument, guides the selection of variables for analysis, and establishes the foundation for interpreting empirical findings within a broader conceptual context.

3. Methodology

This study employs a theory-informed empirical design to examine public sector practitioners’ readiness and perceptions regarding the implementation of IPSAS 46 Measurement. The methodological approach reflects the multidimensional nature of readiness conceptualized in the theoretical framework and is structured to ensure analytical rigor, transparency, and alignment between the research questions, constructs, data collection, and analytical procedures.

3.1. Research Design

A cross-sectional, descriptive–explanatory research design was adopted. This design is appropriate for studies aiming to assess organizational readiness, perceptions, and contextual factors during reform implementation (Holt et al., 2007; Samarghandi et al., 2023). While descriptive elements capture the distribution of perceptions and capacity-related indicators, the explanatory dimension enables examination of associations between readiness determinants, such as professional competence, institutional support, technical capacity, and perceived measurement complexity.
The design aligns with the study’s conceptual model derived from institutional theory and readiness-for-change literature, which conceptualize readiness as a multidimensional construct shaped by interacting organizational and cognitive factors (Alessa, 2024; Armenakis & Harris, 2009).
Given the novelty of IPSAS 46 and the absence of previous empirical work, an exploratory component is also justified, providing initial evidence on perceptions and implementation readiness in transitional public sector systems.

3.2. Epistemological Positioning

The study follows a post-positivist epistemology, which acknowledges that perceptions and readiness can be measured empirically through structured instruments, while also recognizing contextual influences inherent in public sector environments.
This epistemological stance supports the use of quantitative measures to identify patterns and associations that reflect deeper institutional and capacity-related mechanisms.

3.3. Sampling Strategy and Participants

A purposive sampling strategy was employed to ensure the inclusion of public sector practitioners directly involved in financial reporting, asset valuation, and IPSAS implementation processes. Purposive sampling is justified in institutional and reform studies where expertise, functional roles, and institutional position are more relevant than statistical representativeness (Hyndman & Connolly, 2011; van Helden & Uddin, 2016).
Participants included the following:
  • Accountants and financial managers from budgetary organizations;
  • Employees responsible for asset management and valuation;
  • Representatives of public entities expected to apply IPSAS 46 in practice.
The sampling frame reflects the target population most affected by measurement reforms and, therefore, most capable of providing informed assessments of readiness and perceptions.
Demographic and professional characteristics (such as years of experience, education level, and prior IPSAS training) were collected to contextualize findings and assess potential associations with readiness variables.

3.4. Instrument Design and Construct Mapping

Data were collected using a structured questionnaire developed in alignment with the study’s conceptual model. The instrument includes items organized into four theoretical dimensions:
  • Institutional Support and Governance
    Perceived leadership commitment, internal guidelines, and organizational prioritization;
  • Technical Capacity
    Availability of IT systems, data quality, and valuation tools;
  • Professional Competence
    Practitioners’ knowledge, experience, and training related to IPSAS and measurement techniques;
  • Perceived Measurement Complexity
    Difficulty interpreting IPSAS 46 requirements, valuation uncertainty, and cognitive load.
Questionnaire items were rated predominantly on Likert-type scales, enabling the assessment of ordinal relationships and categorical associations.
Each item was explicitly mapped to constructs derived from the theoretical framework, ensuring construct validity and theoretical coherence.
Given the exploratory nature of IPSAS 46 implementation, the instrument was reviewed by domain experts to ensure relevance, clarity, and alignment with measurement principles established in the IPSAS 46 guidance notes (IPSASB, 2023c).
The questionnaire also included an item assessing the perceived importance of external institutional involvement in IPSAS 46 implementation. The item stated:
“How important do you consider governmental or institutional involvement (e.g., provision of licensed valuation services, methodological guidance, or financial support) for the implementation of IPSAS 46?”
This variable is derived directly from the survey instrument and is consistent with the analytical framework of the study.
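As an illustration of the item-to-construct mapping described in this section, the sketch below groups Likert-type items into the four theoretical dimensions and aggregates them into construct scores. The item codes (Q1 to Q10), their grouping, and the sample response are hypothetical, since the paper does not publish the instrument's coding scheme.

```python
# Hypothetical sketch: mapping Likert items to constructs and averaging them.
# Item codes and the example response are invented for illustration.
import statistics

CONSTRUCT_MAP = {
    "institutional_support": ["Q1", "Q2", "Q3"],
    "technical_capacity": ["Q4", "Q5"],
    "professional_competence": ["Q6", "Q7", "Q8"],
    "perceived_complexity": ["Q9", "Q10"],
}

def construct_scores(response):
    """Average the 1-5 Likert ratings of the items belonging to each construct."""
    return {construct: statistics.mean(response[item] for item in items)
            for construct, items in CONSTRUCT_MAP.items()}

respondent = {"Q1": 4, "Q2": 3, "Q3": 4, "Q4": 2, "Q5": 3,
              "Q6": 5, "Q7": 4, "Q8": 4, "Q9": 2, "Q10": 3}
print(construct_scores(respondent))
```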

3.5. Data Analysis Strategy

Data were analyzed using descriptive statistics and association tests to evaluate relationships between readiness-related constructs. Because the data consist primarily of categorical and ordinal variables, the Pearson χ2 test was selected as the primary method for assessing associations.
The χ2 test is appropriate under the following conditions:
  • Variables represent categorical perceptions or readiness indicators;
  • Sample size is sufficient to meet the minimum expected-cell conditions;
  • The goal is to identify statistically significant associations, not causality (Agresti, 2019).
The χ2 analysis was applied to examine whether variables such as professional competence, training exposure, institutional support, and measurement experience are associated with perceptions of IPSAS 46 feasibility and readiness. This approach is consistent with international studies on public sector reform readiness and capacity (Maali & Morshed, 2025; Caruana et al., 2019).
Where expected frequencies were low, Fisher’s exact tests were considered. Effect sizes (Cramer’s V) were reported to assess the strength of associations.
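The analytical procedure can be sketched as follows in Python: a Pearson χ2 test on a contingency table of readiness indicators, Cramér’s V as the effect size, and Fisher’s exact test as the fallback for sparse 2×2 tables. The contingency counts are invented for illustration and do not reproduce the survey data.

```python
# Sketch of the association tests described above, on hypothetical counts.
import numpy as np
from scipy.stats import chi2_contingency, fisher_exact

# Rows: IPSAS knowledge (low / average / good); columns: attitude toward adoption
observed = np.array([[20, 35, 10],
                     [60, 240, 120],
                     [15, 90, 400]])

chi2, p, dof, expected = chi2_contingency(observed)

n = observed.sum()
k = min(observed.shape) - 1
cramers_v = np.sqrt(chi2 / (n * k))  # V = sqrt(chi2 / (N * (min(r, c) - 1)))

print(f"chi2({dof}, N={n}) = {chi2:.2f}, p = {p:.4f}, Cramer's V = {cramers_v:.3f}")

# Where expected cell counts are low, a 2x2 sub-table can be tested exactly:
odds_ratio, p_exact = fisher_exact([[8, 4], [3, 12]])
print(f"Fisher's exact p = {p_exact:.4f}")
```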
Given the exploratory character of the study and the absence of prior empirical research on IPSAS 46 readiness, the analysis deliberately focuses on bivariate association techniques. The objective is not to establish causal relationships, but to identify initial empirical patterns and relationships that may inform future multivariate and longitudinal research. This approach is consistent with prior public sector accounting studies examining early-stage reforms and emerging standards.

3.6. Reliability, Validity, and Limitations

3.6.1. Reliability

Internal consistency was assessed using Cronbach’s alpha for multi-item constructs. Reliability thresholds above 0.70 were considered acceptable for exploratory research (Lamm et al., 2020).
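A minimal computation of Cronbach’s alpha for one multi-item construct might look as follows; the response matrix is hypothetical and serves only to show the calculation.

```python
# Hypothetical sketch of the Cronbach's alpha reliability check.
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: array of shape (n_respondents, n_items) with Likert scores."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1).sum()
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances / total_variance)

scores = np.array([[4, 5, 4], [3, 3, 4], [5, 5, 5], [2, 3, 2], [4, 4, 3]])
print(f"alpha = {cronbach_alpha(scores):.2f}")  # values above 0.70 treated as acceptable
```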

3.6.2. Construct Validity

Construct validity was supported through the following measures:
  • Theoretical mapping of items to conceptual dimensions;
  • Expert review of instrument content;
  • Alignment with IPSAS measurement requirements and readiness theory.

3.6.3. Internal Validity

Because this study had a non-experimental design, causality was not inferred. Instead, the study focused on associations and theoretical plausibility.

3.6.4. External Validity

Generalizability was limited to similar public sector contexts, particularly transitional systems with emerging IPSAS adoption.

3.6.5. Limitations

  • Self-reported data may be affected by social desirability bias.
  • Purposive sampling restricts generalizability.
  • The novelty of IPSAS 46 means that some respondents may lack full familiarity, influencing perceptions.
Despite these limitations, the methodological design provides robust empirical insights consistent with the aims of exploratory, theory-informed public sector research.

4. Results and Interpretation

The results are presented in line with the research questions and focus on statistically tested associations. Descriptive contextual information is provided only where necessary to support interpretation.
This section presents the empirical results and interprets them through the theoretical lenses discussed earlier—namely institutional theory, readiness-for-change models, and valuation/measurement complexity in the public sector. Unlike purely descriptive interpretations, the present analysis emphasizes conceptual integration, institutional mechanisms, and behavioral determinants that influence practitioners’ readiness to apply IPSAS 46.
The study involved approximately 1000 respondents. The target population comprised professionals involved in public sector financial reporting and IPSAS implementation, including financial managers and accountants employed in public sector entities, other administrative staff from government agencies, municipalities, and legal entities under public law, as well as representatives of academia and professional organizations engaged in IPSAS-related activities. Despite representing different areas of professional activity, all participants shared one common characteristic: professional involvement in public sector financial reporting and the process of standards implementation. Consequently, they can be considered competent and informed stakeholders, capable of providing in-depth and accurate evaluations of the core issues addressed in this research.
As shown in Table 1, the majority of respondents hold advanced academic qualifications: 25% have a bachelor’s degree, while 58% hold a master’s degree. Professional experience levels among the respondents are also notably high. These findings indicate that a significant proportion of those engaged in accounting-related activities already possess specialized expertise and, most likely, the relevant professional competencies. This enhances both the reliability of the survey and the analytical significance of the research findings.

4.1. Overview of Data Patterns

The initial descriptive statistics revealed substantial variation across respondents in terms of the following factors:
  • IPSAS knowledge levels;
  • Perceived institutional support;
  • Resource availability;
  • Attitudes toward the feasibility of implementing IPSAS 46.
These variations reflect the heterogeneous institutional capacities within Georgia’s public sector, aligning with earlier studies highlighting uneven implementation across transitional systems (Gomes et al., 2019; Zasadnyi & Konovalenko, 2025).
Given these disparities, subsequent analysis focused on associative relationships using Pearson’s χ2 statistics, supported by Cramer’s V effect sizes, consistent with the methodological approach for categorical reform-readiness data.

4.2. Professional Knowledge and Support for IPSAS Adoption

A statistically significant association was identified between respondents’ knowledge of IPSAS and their perception of the necessity of adopting the standards. The test result, χ2(4, N = 990) = 28.74, p < 0.001, indicates that higher IPSAS knowledge is strongly associated with more favorable attitudes toward adoption. The strength of the association was assessed using Cramer’s V, yielding V = 0.171, indicating a small-to-moderate effect size consistent with heterogeneous knowledge distribution across respondents (Table 2).
Interpretation through theory
  • Institutional theory suggests that norms and professional identities shape acceptance of reforms. Greater knowledge strengthens normative alignment, making practitioners more receptive to institutional change (Benito et al., 2007).
  • Readiness-for-change theory (Holt et al., 2007) emphasizes change efficacy: knowledge enhances confidence in applying IPSAS 46, increasing perceived feasibility.
  • The finding is consistent with prior evidence that competence is a primary determinant of IPSAS adoption success (Caruana et al., 2019).
Thus, professional competence emerges as a foundational driver of readiness, reinforcing the need for structured capacity-building interventions.

4.3. Resource Constraints and Perceived Feasibility of Measurement

Budgetary constraints also showed a significant association with readiness: χ2(2, N = 400) = 16.58, p < 0.001. Cramer’s V was calculated to assess effect size, resulting in V = 0.143, which reflects a small but meaningful association between budget limitations and readiness perceptions. Respondents identifying financial limitations were markedly less likely to support immediate IPSAS implementation.
Interpretation through theory
  • Institutional capacity theory predicts that reforms requiring high-cost technical expertise—such as fair value and current operational value—are more difficult to implement in resource-constrained settings (Boka, 2010).
  • Valuation theory highlights that the absence of budgetary resources restricts access to licensed valuers and impedes the generation of reliable measurement inputs (Polzer et al., 2021).
  • Readiness theory positions resource availability as one of the core antecedents of organizational readiness.
Thus, financial constraints represent a structural barrier, explaining why IPSAS-compliant valuation remains theoretical in many institutions despite formal adoption.

4.4. Institutional Support and Institutional Readiness

A third association was observed between perceptions of institutional involvement and support for IPSAS adoption: χ2(2, N = 980) = 21.94, p < 0.001. Effect size analysis using Cramer’s V produced V = 0.155, indicating that institutional support exerts a measurable influence on readiness.
Interpretation through theory
Governmental actors shape the institutional environment in which organizations operate. However, in Georgia:
  • Valuation must legally be conducted by licensed independent experts rather than government bodies (Sabauri, 2024).
  • Government support is therefore indirect, taking the form of legal frameworks, funding mechanisms, and policy alignment.
The findings demonstrate the following:
  • Practitioners who perceive stronger governmental coordination show higher readiness;
  • Institutional legitimacy and clear governance structures function as catalysts for change (Caperchione et al., 2016).

4.5. Integrated Interpretation: Institutional, Professional, and Organizational Dynamics

The combined results indicate that readiness for IPSAS 46 is shaped by three reinforcing dimensions:
Determinant | Evidence | Interpretation
Professional knowledge | χ2 significant; higher knowledge = higher support | Strengthens cognitive readiness and reform legitimacy
Resource availability | χ2 significant; budget scarcity = lower readiness | Structural barrier that limits measurement capability
Institutional support | χ2 significant; perceived support = higher readiness | Enhances normative pressure and reduces uncertainty
This triangulation confirms the theoretical model developed in Section 2: readiness emerges from the interaction between institutional pressures, organizational capacity, and professional competence.

4.6. Visualization of Key Findings

“Accounting education forms the fundamental basis of accounting practice; therefore, it is consistently regarded as part of the effort to bridge the gap between theoretical education and practical application” (Babatunde et al., 2025; UNCTAD, 2022). It also represents a key factor in assessing the extent to which individuals are prepared for the practical implementation of IPSAS.
In Figure 1, Categories 1 and 2 reflect participation in training and readiness regarding IPSAS. The purpose of the questions was to determine how well-informed and prepared professional accountants are in relation to the International Public Sector Accounting Standards. To this end, participants were asked several interrelated questions, including self-assessment of their knowledge and prior experience with relevant training.
Analysis of the responses revealed that, in Georgia, despite the growing strategic importance of IPSAS implementation for improving public financial management, only 28% of accountants in this sector possess a good knowledge of IPSAS, indicating a need for targeted educational interventions. Strategic communication should focus on the 43% of employees who either have not undergone training (32%) or express no interest in it (11%), to enhance their motivation and awareness of the standards’ significance.
Categories 3 and 4 illustrate the perceptions regarding the IPSAS implementation process. These questions highlighted accountants’ views on the adoption of international standards within the Georgian public sector. The results show that, despite a consensus on the necessity of recognition and a generally optimistic attitude towards the implementation process, some respondents remain uncertain about the completion timeline, likely due to challenges, resource constraints, or lack of experience.
Regarding the use of subsequent measurement models for recognized main assets (Figure 2), it is evident that the majority—91%—employ the historical cost model, indicating a conservative practice. This approach is straightforward and does not require frequent revaluations or external assessments (IPSASB, 2023b). The low utilization of the revaluation model (9%) may suggest a lack of resources, limited methodological or personnel support, or a preference for practical simplicity.
Although 91% of the respondents indicated using the historical cost model in the previous item, the question regarding the use of the fair value model revealed that 59% reported employing the fair value approach to some extent. This likely applies to specific assets or transactions where fair value is necessary—for instance, assets received through non-exchange transactions or financial instruments.
The next question focused on the valuation method used for assets obtained through non-exchange transactions. Approximately three-quarters of respondents reported valuing these assets at historical cost, even though non-exchange transactions generally exclude any consideration. This may indicate either misinterpretation of the standards by accounting personnel or, again, a lack of resources (Custodio et al., 2025).
Furthermore, Figure 3 illustrates the readiness of financial departments to value assets at current operational value. According to the responses, 58% believe that non-financial assets should be assessed by licensed valuation experts, while 42% support valuation by the financial department staff.
According to the study results (Figure 4), 23% of assets in the financial statements are recorded at historical cost. These include assets not subject to depreciation, such as unfinished construction, land, library collections, unfinished intangible assets, inherited assets, and valuables (IPSASB, 2023a, 2023b). Meanwhile, 76% of assets are recorded at carrying amount, and only 1% at fair value.
The next question (Figure 5) addressed why respondents do not use the fair value method for asset valuation. Thirty percent of respondents indicated that the reason is Article 11 of the “Instructions on the Preparation of Financial Accounting and Reporting Based on International Public Sector Accounting Standards for Public Sector Organizations.” According to this article, certain IPSAS requirements—including asset impairment and fair value measurement—are not mandatory until 1 January 2027 (Ministry of Finance of Georgia, 2020; Vardiashvili, 2025). Seventy percent of respondents cited limited budgets as the main reason for not using the fair value model, which prevents them from financing the associated costs. These costs notably include hiring external experts, conducting market research, and developing methodological support (Ascani et al., 2021).
According to the legislation in force in Georgia, the fair value assessment of assets must be carried out by licensed valuation experts, which makes the process particularly challenging for public institutions with limited financial and human resources. As a result, impairment assessments are not systematically conducted, and assets are often recorded at historical cost, even when clear indicators of impairment exist (Vardiashvili, 2018).
The Public Sector Accounting Assessment (PULSE) Report of Georgia (May 2025) notes that property, plant, and equipment (PPE), intangible assets, investment properties, and employee benefits generally demonstrate significant conceptual alignment with IPSAS standards, yielding mostly A or B ratings. However, the actual implementation level is assessed as low, primarily receiving C or D ratings, especially for PPE, machinery and equipment, leases, service concession arrangements, and biological assets (World Bank, 2025; Sabauri, 2018).
According to IPSAS 12, inventories should be measured at the lower of cost and net realizable value (or current replacement cost). Nevertheless, due to the small volume of inventories in the public sector, this requirement has limited practical application (IPSASB, 2022).
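As a simple illustration of the IPSAS 12 rule cited above, the sketch below measures hypothetical inventory items at the lower of cost and net realizable value; the item names and amounts are assumed for the example.

```python
# Illustrative only: IPSAS 12 measurement at the lower of cost and NRV.

def inventory_carrying_amount(cost, net_realizable_value):
    """Carry an inventory item at the lower of cost and net realizable value."""
    return min(cost, net_realizable_value)

items = [
    {"name": "office supplies", "cost": 1_200, "nrv": 1_150},  # assumed figures, GEL
    {"name": "spare parts", "cost": 800, "nrv": 950},
]
for item in items:
    amount = inventory_carrying_amount(item["cost"], item["nrv"])
    print(f'{item["name"]}: carried at {amount} GEL')
```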

4.7. Synthesis: Why Measurement Reform Remains Difficult

The results provide strong empirical evidence supporting a conclusion also reflected in the international literature:
  • The adoption of IPSAS 46 requires new technical capabilities;
  • Institutional coherence;
  • Financial investment;
  • Professional transformation.
In Georgia, these conditions exist partially, explaining the gap between formal adoption and practical implementation, consistent with World Bank (2025) assessments of low practical compliance in PPE and other measurement-intensive assets.
Thus, the empirical findings form a coherent narrative when interpreted through the theoretical model.

5. Discussion and Contributions

This section synthesizes the empirical results presented in Section 4 with the theoretical constructs outlined in Section 2. The goal is to move beyond descriptive findings and articulate how the evidence advances scholarly understanding of public sector measurement reforms, institutional readiness, and the implementation challenges surrounding IPSAS 46 Measurement.

5.1. Interpreting Readiness Through Institutional Theory

The study’s findings confirm that readiness for IPSAS 46 is shaped by institutional dynamics consistent with the broader literature on public sector accounting change (Sabauri et al., 2023).
The significant association between perceived institutional support and readiness demonstrates that coercive and normative pressures play an important role in shaping practitioners’ acceptance of reforms (DiMaggio & Powell, 2000). Respondents who perceived greater governmental guidance exhibited higher readiness, reinforcing prior findings that reform legitimacy depends on authoritative institutional signals (Azegagh & Zyani, 2024).
However, the results also highlight a contrast with traditional institutional accounts: institutional pressure alone is insufficient to generate readiness when technical capacity and resource availability remain limited. This supports the argument by Grossi and Steccolini (2015) that institutional reforms must be accompanied by organizational capabilities to yield substantive—not merely symbolic—implementation.
Thus, the study expands institutional theory by demonstrating that in measurement-intensive reforms such as IPSAS 46, institutional pressure interacts synergistically with technical and cognitive capacity, rather than serving as an independent driver of change.

5.2. Professional Competence as a Core Mechanism of Readiness

The strong relationship between IPSAS knowledge and readiness illustrates that professional competence is a central mechanism through which readiness emerges. This finding directly supports change-readiness theory (Holt et al., 2007), which emphasizes that belief in one’s capacity to enact change—change efficacy—is essential for the successful adoption of complex reforms (Rudzioniene & Juozapaviciute, 2014).
More importantly, the evidence contributes new theoretical insight:
In the context of valuation-based standards, competence is not merely a technical skill but a form of cognitive alignment with the underlying logic of measurement. Respondents with greater IPSAS familiarity demonstrate stronger normative commitment to the standard’s aims and a deeper understanding of its valuation principles.
This extends the work of Polzer et al. (2021) by showing empirically that individuals’ cognitive readiness is tied not only to training but also to their internalization of the valuation logic that underpins IPSAS 46.

5.3. Resource Constraints as Structural Barriers to Measurement Reforms

The strong association between resource scarcity and low readiness provides compelling empirical evidence that technical reforms requiring valuation cannot be implemented under conditions of budgetary insufficiency. This finding reinforces arguments by Caruana et al. (2019), who indicate that measurement reforms are uniquely resource-intensive because they require the following:
  • Licensed valuers;
  • Specialized assessment tools;
  • Consistent market data;
  • Significant organizational investment.
The Georgian context illustrates this challenge: even when institutional support is perceived positively, the lack of financial resources creates a structural barrier that prevents entities from operationalizing IPSAS 46’s requirements.
Thus, this study contributes to the literature by empirically demonstrating that, for measurement-focused reforms, resource availability acts as a primary conditioning factor, moderating the relationship between institutional pressure and actual readiness.

5.4. Measurement Complexity and Cognitive Barriers

The results highlight practitioners’ concerns about valuation uncertainty, lack of comparable market data, and methodological ambiguity. These concerns correspond closely with theoretical predictions regarding measurement complexity (Caruana et al., 2019; Oulasvirta, 2014).
The findings extend this body of research by clarifying two new mechanisms:
  • Cognitive load—Practitioners experience difficulty interpreting IPSAS 46 guidance due to the abstract nature of current value measurement and the judgment required to select appropriate bases.
  • Perceptual risk—Respondents perceive a greater risk of misstatement or audit challenge when applying valuation techniques in settings with limited data.
By empirically documenting these mechanisms, the study deepens theoretical understanding of how cognitive barriers influence readiness for valuation-focused reforms.

5.5. Integrating Findings into a Theoretical Model of IPSAS 46 Readiness

Synthesizing the above analyses yields a multidimensional explanatory model:
  • Institutional legitimacy influences readiness but does not translate into implementation without the following:
    Adequate technical capacity;
    Sufficient resources;
    Strong professional competence.
  • Professional knowledge enhances both attitudinal support and perceived capacity, serving as the cognitive pathway through which institutional expectations become operational reality.
  • Resource constraints act as a moderating factor that weakens institutional influence and reduces readiness even in supportive environments.
  • Measurement complexity forms a separate cognitive barrier that reinforces resistance or cautious attitudes toward IPSAS 46 implementation.
This integrated theoretical perspective advances existing scholarship by highlighting the interactive nature of readiness determinants, particularly for measurement-intensive accounting standards.

5.6. Theoretical Contributions

Based on this synthesis, the study makes three principal contributions to the literature:
  1. Conceptual Expansion of Readiness Theory: It develops a context-specific readiness framework for valuation-based reforms, integrating institutional, cognitive, and resource-based determinants, an area previously underexplored.
  2. Empirical Evidence on IPSAS 46 Implementation: This is among the first studies to provide empirical insights into practitioners’ perceptions of IPSAS 46, addressing a gap in both global accounting literature and transitional public sector reform research.
  3. Advancement of Public Sector Measurement Theory: By showing how practitioners interpret and react to current value measurement requirements, the study enriches understanding of the practical and cognitive challenges that valuation poses in the public sector.

5.7. Practical and Policy Contributions

Beyond theoretical implications, the findings offer several practical recommendations:
  • Targeted training programs should focus not only on IPSAS knowledge but also on valuation judgment and measurement methodology.
  • Government agencies must provide stronger institutional support, including clear implementation guidelines and financial allocations.
  • Public entities should invest in data systems and valuation tools to support current value measurement.
  • Regulators may consider the phased implementation of IPSAS 46 to mitigate capacity gaps.

6. Conclusions

This study provides initial empirical insights into public sector practitioners’ readiness and perceptions regarding the implementation of IPSAS 46 Measurement, offering one of the first empirical analyses of valuation-oriented reforms in a transitional public sector context. Drawing on institutional theory, readiness-for-change frameworks, and valuation literature, the findings reveal that readiness for IPSAS 46 emerges from the interaction of institutional legitimacy, professional competence, resource availability, and perceived measurement complexity.
The empirical evidence demonstrates that institutional support alone is insufficient to ensure effective implementation: readiness materializes only when organizational capacities and professional knowledge align with institutional expectations. Furthermore, the results highlight that resource constraints and valuation uncertainty remain significant barriers, restricting the ability of public sector entities to operationalize current value measurement principles. These insights deepen theoretical understanding by framing readiness as a multidimensional construct conditioned by cognitive, structural, and institutional forces.
From a practical perspective, the findings underscore the need for targeted professional development, enhanced data and valuation systems, and stronger institutional coordination to facilitate successful IPSAS 46 adoption. Policymakers and regulators should recognize that implementing valuation standards requires not only legal alignment but also sustained investment in organizational capabilities and technical expertise.
Finally, this study opens new avenues for future research. Subsequent studies could employ longitudinal designs to assess readiness over time, explore comparative perspectives across countries undergoing similar reforms, or investigate the role of auditors and valuation experts in shaping measurement outcomes. As IPSAS 46 becomes operational, further empirical work will be essential to evaluating its impact on public sector transparency, comparability, and decision-usefulness.
Overall, the study contributes conceptual, empirical, and practical insights that advance the understanding of readiness for measurement reforms and inform the broader discourse on public sector accounting transformation.
The findings should be interpreted as exploratory and context-specific, offering a foundation for more advanced empirical investigations rather than definitive conclusions.

Author Contributions

Conceptualization, L.S.; methodology, L.S. and M.V.; data collection, M.V. and M.M.; data analysis, L.S.; writing—original draft preparation, L.S.; writing—review and editing, M.V. and M.M.; supervision, L.S. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Ethical review and approval were waived for this study because the research involved an anonymous voluntary survey with no collection of personal, sensitive, or identifiable data, and therefore did not fall under mandatory institutional ethics approval requirements in Georgia. All procedures complied with the principles of the Declaration of Helsinki.

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

The data presented in this study are available from the corresponding author upon reasonable request.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. ACCA. (2017). IPSAS implementation: Current status and challenges. Available online: https://www.accaglobal.com/pk/en/professional-insights/global-profession/ipsas-implementation-current-status-and-challenges.html (accessed on 5 May 2025).
  2. Ada, S. S., & Christiaens, J. (2018). The magic shoes of IPSAS: Will they fit Turkey? Available online: https://scispace.com/pdf/the-magic-shoes-of-ipsas-will-they-fit-turkey-512hi6wi4a.pdf (accessed on 15 May 2025).
  3. Agresti, A. (2019). An introduction to categorical data analysis (3rd ed.). John Wiley & Sons. Available online: https://users.stat.ufl.edu/~aa/cda/solutions-part.pdf (accessed on 3 April 2025).
  4. Alessa, N. (2024). Exploring the effect of international public sector accounting standards adoption on national resource allocation efficiency in developing countries. Public and Municipal Finance, 13(1), 1–13. [Google Scholar] [CrossRef]
  5. Armenakis, A. A., & Harris, S. G. (2009). Reflections: Our journey in organizational change research and practice. Journal of Change Management, 9(2), 127–142. [Google Scholar] [CrossRef]
  6. Ascani, I., Ciccola, R., & Chiucchi, M. S. (2021). A structured literature review about the role of management accountants in sustainability accounting and reporting. Sustainability, 13(4), 2357. [Google Scholar] [CrossRef]
  7. Azegagh, J., & Zyani, M. (2024). Accrual accounting and IPSAS implementation in developing countries: The Moroccan case. Revue Française d’Économie et de Gestion, 5(9), 173–183. [Google Scholar]
  8. Babatunde, S. A., Aderibigbe, A. A., & Fadairo, I. O. (2025). IPSAS and financial accountability relationship in public finance management. Available online: https://www.researchgate.net/publication/398109601 (accessed on 23 June 2025).
  9. Beke-Trivunac, J., & Živkov, E. (2024). Measurement of assets and liabilities under the IAS/IPSAS conceptual framework. Revizor, 27(108), 263–270. [Google Scholar] [CrossRef]
  10. Benito, B., Brusca, I., & Montesinos, V. (2007). The harmonization of government financial information systems: The role of the IPSASs. International Review of Administrative Sciences, 73(2), 293–317. [Google Scholar] [CrossRef]
  11. Boka, M. (2010). On the international convergence of accounting standards. International Journal of Business and Management, 5(4), 89. [Google Scholar] [CrossRef]
  12. Caperchione, E., Brusca, I., Cohen, S., & Manes Rossi, F. (2016). Editorial: Innovations in public sector financial management. International Journal of Public Sector Performance Management, 2(4), 303–309. [Google Scholar]
  13. Caruana, J., Brusca, I., Caperchione, E., & Manes Rossi, F. (2019). Exploring the relevance of accounting frameworks in the pursuit of financial sustainability of public sector entities: A holistic approach. In Financial sustainability of public sector entities. Springer International Publishing. [Google Scholar] [CrossRef]
  14. Cenar, I., & Cioban, E. (2022). Implementation of international public sector accounting standards (IPSAS): Variables and challenges, advantages and disadvantages. Annals of the University of Petroşani, Economics, 22(1), 55–64. [Google Scholar]
  15. Christensen, C. M., McDonald, R., Altman, E. J., & Palmer, J. E. (2018). Disruptive innovation: An intellectual history and directions for future research. Journal of Management Studies, 55(7), 1043–1078. [Google Scholar] [CrossRef]
  16. Custodio, C., Mendes, D., & Metzger, D. (2025). The impact of the financial education of executives on the financial practices of medium and large enterprises. The Journal of Finance, 80(5), 2875–2920. [Google Scholar] [CrossRef]
  17. DiMaggio, P. J., & Powell, W. W. (2000). The iron cage revisited: Isomorphism in organizational fields. Advances in Strategic Management, 48(2), 147–160. [Google Scholar] [CrossRef]
  18. Dung, N. T. P., & Lien, N. T. H. (2024). Factors affecting the stages of management accounting evolution: The developing market research. Journal of Governance and Regulation, 13(2), 452–464. [Google Scholar] [CrossRef]
  19. Gomes, P., Brusca, I., & Fernandes, M. J. (2019). Implementing the international public sector accounting standards for consolidated financial statements: Facilitators, benefits and challenges. Public Money & Management, 39(3), 544–552. [Google Scholar] [CrossRef]
  20. Grossi, G., & Steccolini, I. (2015). Pursuing private or public accountability in the public sector? Applying IPSASs to define the reporting entity in municipal consolidation. International Journal of Public Administration, 38(4), 325–334. [Google Scholar] [CrossRef]
  21. Holt, D. T., Armenakis, A. A., Feild, H. S., & Harris, S. G. (2007). Readiness for organizational change: The systematic development of a scale. Journal of Applied Behavioral Science, 43(2), 232–255. [Google Scholar] [CrossRef]
  22. Hyndman, N., & Connolly, C. (2011). Accruals accounting in the public sector: A road not always taken. Management Accounting Research, 22(1), 36–45. [Google Scholar] [CrossRef]
  23. ICAEW. (2023). IFRS vs. IPSAS in public sector financial reporting: Part II measurement. Available online: https://www.icaew.com/technical/public-sector/public-sector-financial-reporting/financial-reporting-insights-listing/ifrs-vs-ipsas-2 (accessed on 5 April 2025).
  24. IPSASB. (2021). ED 77: Measurement [exposure draft]. Available online: https://www.ipsasb.org/publications/exposure-draft-ed-77-measurement (accessed on 3 July 2025).
  25. IPSASB. (2022). IPSAS 12: Inventories. International Public Sector Accounting Standards Board. [Google Scholar]
  26. IPSASB. (2023a). IPSAS 31: Intangible assets. International Public Sector Accounting Standards Board. [Google Scholar]
  27. IPSASB. (2023b). IPSAS 45: Property, plant and equipment. International Public Sector Accounting Standards Board. [Google Scholar]
  28. IPSASB. (2023c). IPSAS 46: Measurement. International Public Sector Accounting Standards Board. [Google Scholar]
  29. Lamm, K. W., Lamm, A. J., & Edgar, D. (2020). Scale development and validation: Methodology and recommendations. Journal of International Agricultural and Extension Education, 27(2), 24–35. [Google Scholar] [CrossRef]
  30. Lapsley, I. (2009). New public management: The cruellest invention of the human spirit? Abacus, 45(1), 1–21. [Google Scholar] [CrossRef]
  31. Maali, B. M., & Morshed, A. (2025). Impact of IPSAS adoption on governance and corruption: A comparative study of Southern Europe. Journal of Risk and Financial Management, 18(2), 67. [Google Scholar] [CrossRef]
  32. Ministry of Finance of Georgia. (2020). Order no. 108: On the approval of the instruction on accounting and financial reporting by budgetary organizations based on the International Public Sector Accounting Standards (IPSASs). Available online: https://mof.ge (accessed on 22 May 2025).
  33. Oulasvirta, L. (2014). Governmental financial accounting and European harmonisation: Case study of Finland. Accounting, Economics, and Law, 4(3), 237–264. [Google Scholar] [CrossRef]
  34. Polzer, T., Grossi, G., & Reichard, C. (2021). Implementation of the international public sector accounting standards in Europe. Variations on a global theme. Accounting Forum, 46(1), 57–82. [Google Scholar] [CrossRef]
  35. Rudzioniene, K., & Juozapaviciute, T. (2014). Quality of financial reporting in public sector. Social Sciences, 82(4), 17–25. [Google Scholar] [CrossRef]
  36. Sabauri, L. (2018, June 2–3). Lease accounting: Specifics. 167th The IIER International Conference, Berlin, Germany. Available online: https://scholar.google.com/citations?view_op=view_citation&hl=en&user=mjrj6wsAAAAJ&citation_for_view=mjrj6wsAAAAJ:zYLM7Y9cAGgC (accessed on 25 May 2025).
  37. Sabauri, L. (2024). Internal audit’s role in supporting sustainability reporting. International Journal of Sustainable Development & Planning, 19(5), 1981. [Google Scholar] [CrossRef]
  38. Sabauri, L., Vardiashvili, M., & Maisuradze, M. (2022). Methods for measurement of progress of performance obligation under IFRS 15. Ecoforum, 11(3), 29. [Google Scholar]
  39. Sabauri, L., Vardiashvili, M., & Maisuradze, M. (2023). On recognition of contract asset and contract liability in the financial statements. Ecoforum, XIX, 42–49. [Google Scholar]
  40. Samarghandi, H., Askarany, D., & Banitalebi Dehkordi, B. (2023). A hybrid method to predict human action actors in accounting information system. Journal of Risk and Financial Management, 16(1), 37. [Google Scholar] [CrossRef]
  41. UNCTAD. (2022). Report of the Intergovernmental Working Group of experts on international standards of accounting and reporting: Thirty-ninth session, Geneva. Available online: https://unctad.org (accessed on 24 May 2025).
  42. van Helden, J., & Uddin, S. N. (2016). Public sector management accounting in emerging economies: A literature review. Critical Perspectives on Accounting, 41, 34–62. [Google Scholar] [CrossRef]
  43. Vardiashvili, M. (2018). Theoretical and practical aspects of impairment of non-cash-generating assets in the public sector entities, according to the international public sector accounting standard (IPSAS) 21. Ecoforum, 7(3), 830. [Google Scholar]
  44. Vardiashvili, M. (2024). Review of measurement-related changes in international public sector accounting standards (IPSAS). Economics and Business, 16(3), 42–48. [Google Scholar] [CrossRef]
  45. Vardiashvili, M. (2025). Measurement of non-financial assets at current operational value. Finance Accounting and Business Analysis, 7(1), 109–119. [Google Scholar] [CrossRef]
  46. Wang, X. (2014). Financial management in the public sector: Tools, applications and cases (3rd ed.). Routledge. Available online: https://api.taylorfrancis.com/content/books/mono/download?identifierName=doi&identifierValue=10.4324/9781315704333&type=googlepdf (accessed on 14 April 2025).
  47. World Bank. (2025). Georgia public sector accounting assessment (PULSE report). Available online: https://cfrr.worldbank.org/publications/public-sector-accounting-assessment-pulse-report-georgia (accessed on 14 April 2025).
  48. Zasadnyi, M., & Konovalenko, L. (2025). Implementation of international public sector accounting standards by developing countries. Economy and Society. [Google Scholar] [CrossRef]
Figure 1. Distribution of IPSAS knowledge levels. Public sector readiness for IPSAS implementation. The figure combines four categories: Category 1—respondents’ self-assessment of IPSAS knowledge (6% do not know, 66% average knowledge, 28% good knowledge); Category 2—experience with IPSAS training (57% have attended, 32% have not attended but willing to participate, 11% have not attended and do not intend to); Category 3—perception of the necessity of IPSAS implementation (1% not necessary, 18% necessary because required by law, 81% necessary as it improves financial reporting); Category 4—expectations regarding the timeline for IPSAS adoption (7% within 1–2 years, 35% within 3–4 years, 58% do not know).
Figure 2. Perceived barriers to IPSAS 46 implementation.
Figure 3. Responsible entity for valuation of non-financial assets—respondents’ perspectives.
Figure 4. Valuation methods used by respondents for reflecting assets in financial statements.
Figure 5. Use of the fair value model.
Table 1. Characteristics of respondents: educational level, work experience, and positions held—descriptive statistics.
Level of Education | % | Years of Experience | % | Current Position | %
High School | 1 | 1–5 years | 6 | Certified Accountant | 65
Vocational College (two-year program) | 12 | 6–10 years | 10 | Practicing Accountant | 33
Bachelor’s Degree | 25 | 11–15 years | 22 | Trainee | 2
Master’s Degree | 58 | 16–20 years | 25 | – | –
Doctoral Studies | 4 | Over 20 years | 37 | – | –
Table 2. Summary of chi-square associations.
No. | Factor | χ2 | df | N | p | Interpretation
1 | IPSAS Knowledge × Attitude | 28.74 | 4 | 990 | <0.001 | Knowledge increases readiness
2 | Budgetary Constraints × Attitude | 16.58 | 2 | 400 | <0.001 | Lack of funds reduces readiness
3 | Institutional Support × Attitude | 21.94 | 2 | 980 | <0.001 | Institutional support increases readiness
