Article

Trust in Financial Technology: The Role of Financial Literacy, Digital Financial Literacy, Technological Literacy, and Trust in Artificial Intelligence

Lacy School of Business, Butler University, Indianapolis, IN 46208, USA
* Author to whom correspondence should be addressed.
J. Risk Financial Manag. 2026, 19(2), 97; https://doi.org/10.3390/jrfm19020097
Submission received: 11 December 2025 / Revised: 18 January 2026 / Accepted: 21 January 2026 / Published: 2 February 2026
(This article belongs to the Special Issue The Role of Financial Literacy in Modern Finance)

Abstract

This study examines the relationships among financial literacy, digital financial literacy, technological literacy, and trust in artificial intelligence (AI) as predictors of consumer trust in fintech applications involving robo-advisors or chatbots. A sample of 117 college students responded to an online survey with scales designed to measure these constructs. Results confirmed that the three literacy measures were significantly correlated, reflecting their overlapping knowledge and cognitive perspective. However, trust in AI showed no significant correlation with any literacy measure, and regression analysis revealed that trust in AI was the sole statistically significant predictor of trust in consumer fintech. These findings suggest that fintech adoption is driven largely by trust rather than financial or technological competence, creating potential vulnerabilities when consumers lack the literacy to evaluate AI-generated financial advice. The results highlight the need for financial education programs to integrate fintech alongside traditional literacy topics and suggest a possible role for regulatory reform to support users of fintech.

1. Introduction

The rapid digitalization of financial services is transforming how consumers manage their finances, requiring new competencies for effective decision-making. Traditional financial literacy, defined as the knowledge and skills necessary to make informed financial decisions, must now be expanded to incorporate digital financial literacy in the context of artificial intelligence (AI)-enabled consumer financial technology (fintech). This study investigates the relationships among these distinct yet interconnected competencies, examining how trust in fintech is related to consumer knowledge and general trust in AI.
Fintech itself is a relatively amorphous term that has been defined as simply the application of technology to improve financial activities (Schueffel, 2016) or, more comprehensively, as encompassing multiple areas: finance and investments, internal operations and risk management, payments and infrastructure, data security and monetization, and customer interface (Giglio, 2021). Thus, fintech comprises innovations in both the internal operations of financial institutions and consumer-facing technologies. Research has generally reported positively on the outcomes of fintech, including financial inclusion, user experience, and product quality (Abis et al., 2025).
For traditional banks and financial start-up companies that employ fintech, the results have also been largely positive, including lower operating costs through efficiency gains, stronger risk controls, and better service efficiency (Sajid et al., 2023; Wang et al., 2021). However, fintech has also created a more competitive landscape for traditional banks as they seek to leverage their information advantage and existing relationships to maintain market position and profitability (Stulz, 2019). This dual nature of fintech as both operational tool and competitive force underscores the importance of understanding how consumers navigate the evolving financial ecosystem. The present survey-based study focuses entirely on robo-advisors and chatbots with which customers could interact for financial advice. The purpose of the research is to investigate potential causes of consumer trust in fintech, which presumably precedes and correlates with fintech adoption.

2. Literature Review

The growing adoption of digital financial services has prompted researchers to examine how traditional financial literacy relates to technological capabilities and their combined effect on financial outcomes (Golden & Cordie, 2022; Lyons & Kass-Hanna, 2021a; Yadav & Banerji, 2023). Even as fintech has extended access to financial services, this access requires consumers to understand costs, benefits, and strategies to avoid both costly mistakes and fraud. Therefore, digital financial literacy expands on traditional financial literacy because it requires the ability to access and operate technological devices (Kass-Hanna et al., 2022). Digital financial literacy is also aligned with financial well-being, as it involves both financial knowledge and avoidance of digital fraud (Y. Choung et al., 2023; Kamble et al., 2024).
The overlapping definitions result in digital financial literacy being correlated with measures of both traditional financial literacy and general technological literacy (Morgan et al., 2019). This interrelationship suggests that traditional financial literacy may facilitate the adoption and effective use of digital financial tools, while engagement with fintech creates opportunities to apply and reinforce financial knowledge. However, low and declining levels of financial literacy in the general public have been documented for decades (e.g., Lusardi, 2015; Lusardi & Mitchell, 2007). This situation creates risk for individuals whose technological knowledge and trust in AI exceed their financial knowledge.
This investigation uses the term technological literacy to refer to an individual’s ability to interact with computer technology in all forms to achieve desired ends. Therefore, the construct of technological literacy comprises multiple dimensions of effective use of technology, including navigating and learning new technology, troubleshooting, and critical evaluation of the risks and benefits of technology. This conception draws on ideas going back to Gilster’s (1997) definition of digital literacy, which emphasized the ability both to understand and utilize information in digital formats. Thus, digital literacy emphasizes practical skills involving the effective and critical use of digital tools and information; the principal abilities include searching, analyzing, evaluating, and communicating using digital information and platforms (Buckingham, 2015). Meanwhile, technological literacy represents an even broader concept that incorporates knowledge of technological systems, including hardware and software; therefore, individuals with greater levels of technological literacy are able to understand and evaluate technology, adapt to technological developments, and utilize the technology to achieve specific objectives (Yeşilyurt & Vezne, 2023).
As technology has evolved, various related terms and measurement approaches have been developed to update the construct accordingly. Martin (2006) offered an expanded definition encompassing the ability, attitude, and awareness to interact with technology as well as create and communicate new knowledge, whereas Chan et al. (2017) emphasized the role of critical thinking in technological literacy. More recently, scholars have proposed similar frameworks to define and measure AI literacy incorporating dimensions of technical knowledge and practical competencies (Bewersdorff et al., 2025). Given the pervasive role of AI in contemporary fintech applications, technological literacy is a crucial foundation for truly understanding and effectively utilizing these services.
The motivation and willingness to adopt new computer technology have been extensively studied, with the Technology Acceptance Model (TAM; Davis, 1989) providing one prominent theoretical framework. In that scheme, perceptions of usefulness and ease of use shape attitudes, which subsequently influence the behavioral intention to use technology. The model has been adapted and expanded by incorporating social influence, facilitating conditions, price, hedonic motivation, habits, and attitudes to create the related Unified Theory of Acceptance and Use of Technology (UTAUT; Venkatesh et al., 2003). Studies employing either TAM or UTAUT usually emphasize the utility of technology as a key factor in the adoption of new technologies. This approach emphasizes the value proposition of technology, including its reliability and robustness, as a key driver of adoption.
Research on AI technology, however, has placed stronger emphasis on the role of trustworthiness and transparency in influencing technology acceptance (Mustofa et al., 2025; Vorm & Combs, 2022). Incorporating trust as a principal factor may stem partly from companies’ deliberate emphasis on the trustworthiness of the technology as they seek to encourage adoption (Hasija & Esper, 2022). The foundational trust required for AI adoption encourages companies to emphasize benefits and provide assurance of trustworthiness and risk mitigation (Lockey & Gillespie, 2025). Additional antecedents to trust formation include tangibility, transparency, reliability, and immediacy (Glikson & Woolley, 2020).
Lockey and Gillespie (2025) note that trust in transactions relies on one party’s acceptance of vulnerability and risk in believing that another party will serve its best interests. This affective and relational dimension distinguishes trust from the knowledge-based competencies that characterize the three forms of literacy measures discussed previously. Thus, while literacy enables competent usage of technology, trust serves as the psychological gateway determining willingness to engage.
Usage of fintech has been closely linked to trust in AI, encompassing both trust in the technology’s functionality and its ability to create human-like interactions (H. Choung et al., 2022; Maier et al., 2022). Since contemporary consumer fintech applications rely on AI technologies for core functionality, trust in AI naturally enhances trust in fintech more broadly. Behavioral intentions for consumers to use robo-advisors, for instance, have been closely linked to user experience variables (Belanche et al., 2019; Sabir et al., 2023; Senyo & Osabutey, 2020). Therefore, it should be anticipated that trust in AI and trust in fintech will be positively correlated.
Worryingly, surveys have shown that consumers may be motivated more by performance and effort of fintech than by price or risk (Senyo & Osabutey, 2020). This situation suggests potential vulnerabilities in decision-making processes, particularly if consumers adopt fintech while lacking the financial literacy to understand options and risks. Meanwhile, the high stakes and anxiety-inducing nature of financial decisions may lead some people to maintain a preference for conversation and advice from human advisors (Zhang et al., 2020). These consumer fears act as a countervailing force that might limit fintech adoption, even in the case of strong trust in AI.

Hypotheses

Extant scholarship reveals three distinct but interconnected patterns. First, digital financial literacy relies on both traditional financial literacy and technological literacy so that these three competencies should be highly correlated. These literacies draw upon overlapping knowledge bases and skills, with proficiency in one domain facilitating development in others. Second, trust in AI operates through different psychological mechanisms than literacy-based competencies. Trust involves affective evaluations, risk perspectives, and beliefs about AI’s alignment with user interests; these factors are largely independent of an individual’s knowledge about finance or technology. Third, for fintech applications, trust in the underlying AI technology is a key factor in adoption and usage. While literacy enables competent usage once adoption occurs, trust can serve as the psychological gateway that determines willingness to engage with fintech.
Therefore, this survey-based study proposes three hypotheses, tested empirically, that align with these expectations. The first hypothesis argues that the various literacy measures should be related due to their overlapping knowledge bases about finance and technology. The second hypothesis states that trust in AI is distinct from these three literacy measures, due to different affective evaluations, risk perceptions, and attitudes and beliefs about AI that operate through distinct psychological mechanisms. The third hypothesis predicts that trust in AI will be a more reliable predictor of trust in fintech than any of the literacy measures. The present study offers a novel contribution by exploring whether fintech adoption is driven more by affective factors (trust in AI and customer experience) than by cognitive factors (financial literacy and technological literacy). The three hypotheses are stated more formally as follows:
H1. 
Financial literacy, digital financial literacy, and technological literacy will be positively correlated.
H2. 
Trust in AI will not be correlated to a statistically significant degree with financial literacy, digital financial literacy, or technological literacy.
H3. 
Trust in AI will be positively correlated with, and a statistically significant predictor of, trust in consumer fintech, whereas financial literacy, digital financial literacy, and technological literacy will not be statistically significant predictors.

3. Materials and Methods

3.1. Participants

A convenience sample of 117 college students between the ages of 18 and 24 completed an online survey; an additional 22 respondents completed only a portion of the survey and did not provide sufficient data for inclusion in the analysis. The sample comprised 68 women (58.1%) and 49 men (41.9%) with a mean age of 20.8 (SD 1.65) years. A substantial majority of 83 respondents (70.9%) identified themselves as enrolled in a business-related major. The sample was predominantly Caucasian (n = 83; 70.9%), with the rest identifying as American Indian/Alaska native (n = 3), Asian (n = 5), Black/African American (n = 12), or preferring not to answer (n = 14). Furthermore, 10 respondents indicated that they were Hispanic or Latino. The homogeneity of the sample suggests caution in extrapolating the results, but this study represents a pilot that could be expanded in future research. Additionally, the sample should provide an adequate reflection of the underlying population of students enrolled at Midwestern colleges and universities. This demographic has a high adoption rate of fintech and is an important group to investigate as the technology diffuses.

3.2. Procedures

This study was conducted in accordance with the Declaration of Helsinki and approved by the Institutional Review Board of Butler University (date of approval: 19 December 2024). Participants were recruited primarily through email and other electronic communication. Students enrolled during the fall semester of 2024 at a small, private Midwestern university were invited via email to participate. Further announcements were made in classrooms, at meetings of business school organizations, and through social media posts from student organizations. These communications distributed a link to a Qualtrics survey that students could complete anonymously online. An informed consent letter provided information about the study, and participants acknowledged their consent before proceeding. Survey items were presented to participants in random order to minimize order effects in their responses.

3.3. Measurement

The key measure of this study assesses participants’ trust in consumer fintech that provides financial information or advice. This scale directly links elements of AI and technology with financial advice and behavior in its items. Data were collected with an 8-item, purpose-built scale with responses on a 7-point Likert-type scale; items are provided in Table 1. The scale had strong reliability for this sample (Cronbach’s alpha = 0.89, 95% CI [0.86, 0.92]).
The study’s hypotheses seek to differentiate the effects of general financial literacy, digital financial literacy, technological literacy, and trust in AI on the previous scale of trust in consumer fintech. The first of these constructs, general financial literacy, was measured using the so-called big three items (e.g., Lusardi & Mitchell, 2011) combined with 5 additional items related to interest rates, mortgages, and diversification, based on the OECD International Network on Financial Education survey (see Atkinson & Messy, 2012). These items appear in Table A1 in Appendix A. Participants responded to these 8 multiple choice items, and the number of correct answers was tallied.
The study also sought to measure digital financial literacy, which differs subtly from trust in consumer fintech by emphasizing the five cognitive dimensions of literacy described by Lyons and Kass-Hanna (2021a, 2021b): basic knowledge and skills, awareness, practical know-how, decision-making, and self-protection. At the same time, these items omit any direct reference to AI or receiving financial advice in order to focus on personal knowledge and behavior. A pool of 10 items was generated by the authors and reviewed independently by two scholars with experience in financial literacy research before piloting with a focus group of 4 students. The scale was trimmed to the 7 items presented in Table 2, with data collection using a 7-point Likert-type response format. For the sample in this study, the scale had adequate reliability (Cronbach’s alpha = 0.72, 95% CI [0.63, 0.79]).
The next measure is general technological literacy, which combines concepts of computer self-efficacy and technology acceptance, omitting any reference to finance. This construct was measured with a 10-item Likert-type scale with a 7-point response format. The scale demonstrated adequate reliability in the present sample (Cronbach’s alpha = 0.80, 95% CI [0.74, 0.85]).
The final primary measure is trust in AI, which was measured on a 10-item Likert-type scale. Items were based on the discussion of the construct in Maier et al. (2022) and Belanche et al. (2019). These items emphasized general interactions with AI with no reference to financial matters. The scale showed strong reliability (Cronbach’s alpha = 0.86, 95% CI [0.82, 0.90]).
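For readers who wish to reproduce the reliability estimates, the following base-R sketch shows one way to compute Cronbach’s alpha for a block of Likert items. It is illustrative only; the data frame name (survey) and item column names (e.g., ai_1 through ai_10) are placeholders rather than the authors’ actual variable names, and the paper does not report which R functions or packages were used for this calculation.

```r
# Coefficient alpha for a set of Likert items (rows = respondents, columns = items).
# Minimal sketch; the data frame `survey` and the item names are hypothetical.
cronbach_alpha <- function(items) {
  items <- na.omit(items)                 # complete cases only
  k <- ncol(items)                        # number of items in the scale
  item_var <- apply(items, 2, var)        # variance of each item
  total_var <- var(rowSums(items))        # variance of the summed scale score
  (k / (k - 1)) * (1 - sum(item_var) / total_var)
}

# Example: alpha for a 10-item trust-in-AI scale stored in columns ai_1 to ai_10
# cronbach_alpha(survey[, paste0("ai_", 1:10)])
```

Confidence intervals such as those reported above could be obtained by bootstrapping this statistic over respondents, although the specific interval method used in the study is not stated.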

3.4. Analysis

Statistical analysis was conducted in the R statistical computing environment (R Core Team, 2025), and the study adopted the traditional cutoff of p-values less than 0.05 to indicate statistical significance. The study’s hypotheses were analyzed using correlation analysis and linear regression. Specifically, H1 and H2 were analyzed by testing the statistical significance of Pearson correlation coefficients, and H3 was analyzed with a multiple linear regression with trust in fintech as the dependent variable and trust in AI, technological literacy, digital financial literacy, and financial literacy as the independent variables. The primary model appears as follows:
Trust in fintech = β₀ + β₁ Trust in AI + β₂ Tech literacy + β₃ Digital fin lit + β₄ Fin lit + ε
This model was expanded to include the participant variables of gender and race. Further investigation was also conducted due to the risk of multicollinearity and the relatively small sample size. First, variance inflation factors (VIF) were computed. Second, regression models were estimated separately for each independent variable. Finally, each variable was tested again with a nonparametric regression model, namely the Theil-Sen estimator (Sen, 1968; Theil, 1950).
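The following base-R sketch illustrates how this analysis pipeline might be implemented. It is not the authors’ code: the data frame survey and the column names trust_fintech, trust_ai, tech_lit, dig_fin_lit, and fin_lit are hypothetical, and the VIF and Theil-Sen steps are written out explicitly for transparency (equivalent VIF values could be obtained from a package such as car).

```r
# Illustrative sketch of the analysis described above; variable names are placeholders.

# H1 and H2: Pearson correlations with significance tests
cor.test(survey$fin_lit, survey$dig_fin_lit)   # literacy intercorrelations
cor.test(survey$trust_ai, survey$fin_lit)      # trust in AI vs. a literacy measure

# H3: primary regression model
fit <- lm(trust_fintech ~ trust_ai + tech_lit + dig_fin_lit + fin_lit, data = survey)
summary(fit)

# Variance inflation factors from auxiliary regressions (equivalently, car::vif(fit))
vif_manual <- function(model) {
  X <- model.matrix(model)[, -1]               # predictor matrix without the intercept
  sapply(seq_len(ncol(X)), function(j) {
    r2 <- summary(lm(X[, j] ~ X[, -j]))$r.squared
    1 / (1 - r2)                               # VIF_j = 1 / (1 - R^2_j)
  })
}
vif_manual(fit)

# Theil-Sen estimator for a single predictor: median of all pairwise slopes
theil_sen <- function(x, y) {
  idx <- combn(length(x), 2)                   # all pairs of observations
  dx <- x[idx[2, ]] - x[idx[1, ]]
  dy <- y[idx[2, ]] - y[idx[1, ]]
  slopes <- dy[dx != 0] / dx[dx != 0]          # drop pairs with equal x values
  b1 <- median(slopes)
  c(intercept = median(y - b1 * x), slope = b1)
}
theil_sen(survey$trust_ai, survey$trust_fintech)
```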

4. Results

As described in the methods section, this study includes five primary measures. Descriptive statistics of these variables for the 117 valid responses appear in Table 3. Maximum scores were attained for all three of the literacy measures, and responses covered a broad range of scores. Specifically for the financial literacy measure, 3 respondents scored 0, and 11 respondents attained the maximum score of 8. For trust in AI and trust in fintech, scores were generally lower, and no scores reached the possible maximum on either scale (70 for trust in AI and 56 for trust in fintech).
The first hypothesis anticipates a relationship among financial literacy, digital financial literacy, and technological literacy. A correlation matrix of these three variables appears as Table 4. As predicted, these three variables are correlated at a statistically significant level. The strongest correlation occurs for the two constructs involving financial literacy, reflecting their conceptual overlap. These results provide evidence to support the first hypothesis.
The second hypothesis predicts that trust in AI will not be substantially correlated with the three variables above. This expectation is also borne out. Correlations with trust in AI are 0.17 (p = .067) for financial literacy, 0.04 (p = .668) for digital financial literacy, and 0.13 (p = .160) for technological literacy. None of these correlations are statistically significant at a 5% level. Thus, trust in AI is not directly related to measures of financial or technological literacy, providing evidence in favor of H2.
By contrast, trust in AI is strongly correlated with trust in consumer fintech (r = 0.63, p < .001). This marked difference from the previous three correlations supports H3. Consumer trust in fintech has a stronger relationship with trust in AI than with financial literacy, digital financial literacy, or technological literacy.
The overall relationship of the constructs to trust in consumer fintech is further explored with regression analysis as a simultaneous test of the relationships among the variables. When trust in consumer fintech is regressed on the four independent variables of financial literacy, digital financial literacy, technological literacy, and trust in AI, the overall model is statistically significant (F[4, 112] = 20.32, p < .001, R2 = 0.42), but only trust in AI is a statistically significant independent variable, as shown in Table 5. This model was also estimated with the inclusion of the participant demographic variables of gender and race. These factors were not statistically significant and did not affect the interpretation of the primary variables of interest, so the results are omitted for clarity in presentation.
The correlations among financial literacy, digital financial literacy, and technological literacy raise concern regarding possible multicollinearity in the regression model. As a first check, VIFs were computed and are presented in Table 5. With all VIF values less than the traditional cutoff of 5, the risk is relatively low, giving greater confidence in the interpretation of the regression. As a further robustness check, each independent variable was tested in individual regression models, with results appearing in Table 6. The pattern of statistical significance holds, with only trust in AI resulting in a p-value less than 5%. Finally, the four regression models were estimated again using the nonparametric Theil-Sen estimator. As displayed in Table 6, the results are qualitatively similar.

5. Discussion

The results reveal that trust in consumer fintech is driven to a greater extent by trust in AI than by financial literacy, digital financial literacy, or technological literacy. For the respondents in this study, there were strong correlations among digital financial literacy, traditional financial literacy, and technological literacy, indicating a shared cognitive domain among these three constructs. However, these three measures did not have a statistically significant correlation with trust in AI, which operates through affective and relational mechanisms such as risk tolerance. If trust serves as a principal gateway to fintech adoption to a greater extent than financial and technological competencies, this pattern suggests a fundamental shift in how consumers will engage with financial services in the digital age. These findings challenge the implicit assumption in much of the research on fintech and financial literacy that educating consumers about technology and finance will naturally translate into appropriate levels of trust and risk management.
The regression analysis further supports the third hypothesis and suggests that trust in AI is a major factor in motivating fintech adoption. This pattern holds despite the theoretical relevance of both financial literacy for evaluating the quality of advice and technological literacy for understanding fintech capabilities. These findings align with previous work on the important role of trust (Lockey & Gillespie, 2025; Mustofa et al., 2025) but extend these insights to the area of personal finance. While the importance of financial decisions might be expected to elevate the role of literacy and knowledge-based factors, the results instead imply that the greater uncertainty and anxiety in these decisions may amplify the role of trust. When making financial decisions, consumers may rely to a greater extent on emotional resonance and user experience rather than their own knowledge in evaluating recommendations.
These findings present a paradox for researchers concerned with consumer welfare. On the positive side, trust as an adoption driver may facilitate broader financial inclusion, and fintech is generally shown to expand access to financial services (Amnas et al., 2024). Additionally, fintech affords the potential opportunity to educate consumers at low cost. Migliavacca (2021) provides evidence that interactions with human advisors have educational value for consumers; perhaps regular interactions with fintech could similarly affect financial literacy. However, trust-based adoption of fintech simultaneously creates substantial risks for consumers who may accept recommendations without the capacity to critically evaluate the advice, detect conflicts of interest, or recognize when suggestions are not appropriately tailored to their personal circumstances (cf. Senyo & Osabutey, 2020). This vulnerability is concerning given the opacity of many AI systems and the lack of fiduciary duty.
The findings reinforce a worrying trajectory in the market for financial services. Wealthier individuals might continue to access human advisors for personalized, context-sensitive advice and fiduciary accountability while mass-market consumers increasingly rely on robo-advisors that may optimize provider profits rather than consumer welfare (Coffi & George, 2022). This bifurcation is exacerbated by a trend toward greater individual responsibility for financial decisions (Alsemgeest, 2015; Campbell et al., 2011; Ryan et al., 2011) with the added challenge of opacity in financial advice provided by AI systems. Previously, consumers could at least theoretically attain financial literacy to make better decisions through questions and conversations, but the algorithmic black box may fundamentally limit the possibility of informed decision-making such that regulation and education require revisions. These concerns could be mitigated to the extent that fintech provides sound advice, strong functionality, and positive user experiences.

5.1. Recommendations

The findings here suggest several potential directions for anyone interested in supporting good financial decisions by individual investors. First, regulatory frameworks may be needed to mandate transparency around algorithmic operations, objectives for AI optimization, and particularly the fiduciary status of any AI-generated advice. Second, default options provided by fintech applications warrant regulatory attention. Given that trust drives adoption regardless of literacy, ensuring high-quality defaults becomes essential for consumer protection. Third, financial education needs to evolve from traditional literacy topics to incorporate critical evaluation of AI-mediated advice. As Golden and Cordie (2022) advocate, consumers would benefit from guided, practical interactions with fintech to develop both operational competence and habits of healthy skepticism and verification. This approach also aligns with Lusardi’s (2019) recommendation to ground financial education in personal experience and authentic concerns. Finally, the development of digital financial literacy should directly address the different motivations of literacy and trust. Education could help individuals recognize when their trust in technology should be tempered by critical evaluation, challenging and verifying AI-generated recommendations against their personal circumstances, objectives, risk tolerance, and values.

5.2. Limitations and Future Research

The findings and recommendations of this study are constrained from broad generalization partly due to the relatively homogeneous sample. Replication will be needed with samples of different ages, educational experiences, incomes, and demographic characteristics. The cross-sectional design of the present study also prevents claims of causation. While the results suggest that trust in AI predicts trust in fintech, longitudinal research would be needed to establish any causal relationship. There is also the question of whether initial experiences with fintech (positive or negative) shape future perceptions and adoption. Data collection for the present study also did not ask specifically about particular fintech usage behaviors or outcomes.
Future research could include pedagogical interventions that emphasize digital financial literacy to determine any changes in attitude or adoption. The purpose-built scales in this study would also benefit from replicative studies to support their validity and reliability. Future research could also emphasize outcomes such as how trust in AI translates into adoption, usage patterns, decision quality, and financial outcomes. Important moderating variables such as demographic characteristics, personality traits, and cognitive styles could also be explored for their influence on the balance between trust, financial literacy, and fintech. Financial socialization research could explore whether the influence of family socialization is diminishing or reversing, as young adults help their elders navigate new technology (cf. Hanson & Olson, 2018).
When studying the adoption of fintech, other key factors potentially include customer-related benefits such as reduced time and effort in accessing services (Barbu et al., 2021). Fintech also offers the possibility of greater product personalization and improved customer experience generally (Vandanapu, 2024). However, privacy concerns may temper consumer enthusiasm to adopt fintech (Alalwan et al., 2024; Hanson & Byrd, 2024; Shafik, 2025). Future research could explore these countervailing factors, their relationship to demographic and psychometric traits, how they influence adoption, and how financial institutions can respond in developing fintech. The complicated relationship between literacy, competence, financial well-being, fintech customer experience, and trust requires further theorization and empirical exploration to maximize the benefits of fintech.

6. Conclusions

This study provides evidence regarding consumer fintech adoption and the important role of trust in AI as a primary driver that operates independently of financial literacy, digital financial literacy, and technological literacy. While these latter three literacy measures are highly correlated, as predicted, none of them demonstrates a statistically significant relationship with trust in AI or fintech. While fintech offers the promise of scalable consumer education and financial inclusion, the disconnect between trust and literacy creates substantial consumer vulnerability. When fintech adoption proceeds on a basis of trust rather than competency to evaluate advice, consumers may accept recommendations that do not serve their interests. Therefore, these findings raise the possibility of regulation around fintech default options and disclosures, particularly regarding fiduciary duty. The results also reinforce the importance of financial literacy and a necessary evolution in education efforts to incorporate evaluations of fintech.
Ultimately, consumers will need both sufficient trust to engage with beneficial fintech and sufficient literacy to evaluate AI-generated advice. Achieving the appropriate balance is a key challenge for educators, regulators, and financial institutions. As the role of fintech in financial markets and personal responsibility for financial decisions expand, managing the relationship between trust and literacy will be necessary for consumer welfare, market efficiency, and social equity in financial services access.

Author Contributions

Conceptualization, methodology, data curation, writing—original draft preparation, T.A.H. and C.O.; software, validation, formal analysis, writing—review and editing, T.A.H. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

The study was conducted in accordance with the Declaration of Helsinki and was reviewed by the Institutional Review Board of Butler University. The study was determined to be exempt under Exempt Category 2, as defined in the U.S. Code of Federal Regulations (45 CFR Section 46.101), as of 19 December 2024.

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

The data presented in this study are available on request from the corresponding author due to IRB restrictions.

Conflicts of Interest

The authors declare no conflicts of interest.

Appendix A

Table A1. Survey items.
Demographic items
1. How would you describe your gender? (Male/Female/Prefer not to answer)
2. How would you describe your race? (American Indian or Alaska native/Asian/Black or African American/Pacific Islander/White or Caucasian/Prefer not to answer)
3. Would you describe yourself as Hispanic or Latino? (Yes/No)
4. What is your age?
5. How would you describe your major? (Business school/Non-business school)
Technological literacy scale
Please indicate your agreement with the following statements on a scale from 1 to 7
(1 = Strongly Disagree, 7 = Strongly Agree)
1. I am confident in my ability to learn new technologies.
2. I regularly use technology in my daily life.
3. I understand how to use mobile apps effectively.
4. I know how to troubleshoot basic technological problems.
5. I am comfortable using online systems and platforms.
6. I use technology to compare different options when making decisions.
7. I feel confident in using AI-based tools for guidance or recommendations.
8. I understand the risks involved in sharing personal information online.
9. I am aware of emerging technologies such as blockchain and AI.
10. I feel comfortable with automated systems managing parts of my life.
Trust in AI scale
Please indicate your agreement with the following statements on a scale from 1 to 7
(1 = Strongly Disagree, 7 = Strongly Agree)
1. I trust AI to handle my data securely.
2. I believe AI can make better decisions than humans in some cases.
3. I am confident in the reliability of AI when making recommendations.
4. I believe AI systems are ethical.
5. I trust AI to provide unbiased service.
6. I feel comfortable allowing AI to manage certain aspects of my life.
7. I think AI can reduce human error in various tasks.
8. I worry that AI systems may malfunction or make critical errors.
9. I believe that AI systems are transparent in how they make decisions.
10. I trust AI more when I understand how it makes decisions.
Financial literacy scale
1. Suppose you had $100 in a savings account and the interest rate was 2% per year. After five years, how much do you think you would have in the account?
   a. More than $102
   b. Exactly $102
   c. Less than $102
   d. Don’t know
2. Imagine that the interest rate on your savings account is 1% per year and inflation is 2% per year. After one year, how much would you be able to buy with the money in the account?
   a. More than today
   b. Exactly the same
   c. Less than today
   d. Don’t know
3. If interest rates rise, what will typically happen to bond prices?
   a. Rise
   b. Fall
   c. Stay the same
   d. Don’t know
4. A 15-year mortgage typically requires higher monthly payments than a 30-year mortgage, but the total interest paid over the life of the loan will be less.
   a. True
   b. False
5. Buying a single company’s stock usually provides a safer return than a stock mutual fund.
   a. True
   b. False
6. Which of the following is an example of diversification?
   a. Investing all your money in one stock
   b. Spreading your money across multiple investments
   c. Putting all your money in real estate
   d. Don’t know
7. What is the main advantage of saving for retirement in a tax-deferred account?
   a. You pay less tax upfront
   b. You save on taxes until you withdraw
   c. No taxes are due when you retire
   d. Don’t know
8. Is money held in all digital financial services (e.g., Venmo or PayPal) insured through government programs such as the FDIC?
   a. Yes, in all cases
   b. Yes, but only in certain cases
   c. Never
   d. Don’t know

References

  1. Abis, D., Pia, P., & Limbu, Y. (2025). FinTech and consumers: A systematic review and integrative framework. Management Decision, 63(1), 49–75. [Google Scholar] [CrossRef]
  2. Alalwan, A. A., Baabdullah, A. M., Al-Debei, M. M., Raman, R., Alhitmi, H. K., Abu-ElSamen, A. A., & Dwivedi, Y. K. (2024). Fintech and contactless payment: Help or hindrance? The role of invasion of privacy and information disclosure. International Journal of Bank Marketing, 42(1), 66–93. [Google Scholar] [CrossRef]
  3. Alsemgeest, L. (2015). Arguments for and against financial literacy education: Where to go from here? International Journal of Consumer Studies, 39(2), 155–161. [Google Scholar] [CrossRef]
  4. Amnas, M. B., Selvam, M., & Parayitam, S. (2024). FinTech and financial inclusion: Exploring the mediating role of digital financial literacy and the moderating influence of perceived regulatory support. Journal of Risk and Financial Management, 17(3), 108. [Google Scholar] [CrossRef]
  5. Atkinson, A., & Messy, F. (2012). Measuring financial literacy: Results of the OECD/International Network on Financial Education (INFE) pilot study (p. 15). OECD Working Papers on Finance, Insurance, and Private Pensions. OECD Publishing. [Google Scholar] [CrossRef]
  6. Barbu, C. M., Florea, D. L., Dabija, D.-C., & Barbu, M. C. R. (2021). Customer experience in fintech. Journal of Theoretical and Applied Electronic Commerce Research, 16(5), 1415–1433. [Google Scholar] [CrossRef]
  7. Belanche, D., Casaló, L. V., & Flavián, C. (2019). Artificial intelligence in FinTech: Understanding robo-advisors adoption among customers. Industrial Management & Data Systems, 119(7), 1411–1430. [Google Scholar] [CrossRef]
  8. Bewersdorff, A., Nerdel, C., & Zhai, X. (2025). How AI literacy correlates with affective, behavioral, cognitive, and contextual variables: A systematic review. Computers and Education: Artificial Intelligence, 9, 100493. [Google Scholar] [CrossRef]
  9. Buckingham, D. (2015). Defining digital literacy: What do young people need to know about digital media. Nordic Journal of Digital Literacy, 10, 21–35. [Google Scholar] [CrossRef]
  10. Campbell, J. Y., Jackson, H. E., Madrian, B. C., & Tufano, P. (2011). Consumer financial protection. Journal of Economic Perspectives, 25(1), 91–114. [Google Scholar] [CrossRef]
  11. Chan, B. S. K., Churchill, D., & Chiu, T. K. F. (2017). Digital literacy learning in higher education through digital storytelling approach. Journal of International Education Research, 13(1), 1–16. [Google Scholar] [CrossRef]
  12. Choung, H., David, P., & Ross, A. (2022). Trust in AI and its role in the acceptance of AI technologies. International Journal of Human-Computer Interaction, 39(9), 1727–1739. [Google Scholar] [CrossRef]
  13. Choung, Y., Chatterjee, S., & Pak, T.-Y. (2023). Digital financial literacy and financial well-being. Finance Research Letters, 58, 104438. [Google Scholar] [CrossRef]
  14. Coffi, J., & George, B. (2022). The fintech revolution and the changing role of financial advisors. Journal of Applied and Theoretical Social Sciences, 3(1), 261–274. [Google Scholar] [CrossRef]
  15. Davis, F. D. (1989). Perceived usefulness, perceived ease of use, and user acceptance of information technology. MIS Quarterly, 13(3), 319–340. [Google Scholar] [CrossRef]
  16. Giglio, F. (2021). Fintech: A literature review. European Research Studies Journal, 24(2B), 600–627. [Google Scholar] [CrossRef]
  17. Gilster, P. (1997). Digital literacy. Wiley. [Google Scholar]
  18. Glikson, E., & Woolley, A. W. (2020). Human trust in artificial intelligence: Reviews of empirical research. Academy of Management Annals, 14(2), 627–660. [Google Scholar] [CrossRef]
  19. Golden, W., & Cordie, L. (2022). Digital financial literacy. Adult Literacy Education, 4(3), 20–26. [Google Scholar] [CrossRef]
  20. Hanson, T. A., & Byrd, A. J. (2024). Developing a measure of financial privacy: A pilot study of U.S. college students. International Journal of Financial Studies, 12(4), 116. [Google Scholar] [CrossRef]
  21. Hanson, T. A., & Olson, P. M. (2018). Financial literacy and family communication patterns. Journal of Behavioral and Experimental Finance, 19, 64–71. [Google Scholar] [CrossRef]
  22. Hasija, A., & Esper, T. L. (2022). In artificial intelligence (AI) we trust: A qualitative investigation of AI technology acceptance. Journal of Business Logistics, 43(3), 388–412. [Google Scholar] [CrossRef]
  23. Kamble, P. A., Mehta, A., & Rani, N. (2024). Financial inclusion and digital financial literacy: Do they matter for financial well-being? Social Indicators Research, 171(3), 777–807. [Google Scholar] [CrossRef]
  24. Kass-Hanna, J., Lyons, A. C., & Liu, F. (2022). Building financial resilience through financial and digital literacy in South Asia and Sub-Saharan Africa. Emerging Markets Review, 51, 100846. [Google Scholar] [CrossRef]
  25. Lockey, S., & Gillespie, N. (2025). Trust in AI: Evidence of trust-supporting mechanisms from 17 countries. In S. Sadiq (Ed.), Enterprise AI (pp. 279–309). Springer. [Google Scholar] [CrossRef]
  26. Lusardi, A. (2015). Financial literacy: Do people know the ABCs of finance? Public Understanding of Science, 24(3), 260–271. [Google Scholar] [CrossRef]
  27. Lusardi, A. (2019). Financial literacy and the need for financial education: Evidence and implications. Swiss Journal of Economics and Statistics, 155(1), 1. [Google Scholar] [CrossRef]
  28. Lusardi, A., & Mitchell, O. S. (2007). Financial literacy and retirement preparedness: Evidence and implications for financial education. Business Economics, 42(1), 35–44. [Google Scholar] [CrossRef]
  29. Lusardi, A., & Mitchell, O. S. (2011). Financial literacy around the world: An overview. Journal of Pension Economics & Finance, 10(4), 497–508. [Google Scholar] [CrossRef]
  30. Lyons, A. C., & Kass-Hanna, J. (2021a). A methodological overview to defining and measuring “digital” financial literacy. Financial Planning Review, 4(2), e1113. [Google Scholar] [CrossRef]
  31. Lyons, A. C., & Kass-Hanna, J. (2021b). A multidimensional approach to defining and measuring financial literacy in the digital age. In G. Nicolini, & B. J. Cude (Eds.), Routledge handbook of financial literacy (pp. 61–76). Routledge. [Google Scholar] [CrossRef]
  32. Maier, T., Menold, J., & McComb, C. (2022). The relationship between performance and trust in AI in e-finance. Artificial Intelligence, 5, 891529. [Google Scholar] [CrossRef]
  33. Martin, A. (2006). A European framework for digital literacy. Nordic Journal of Digital Literacy, 1(2), 151–161. [Google Scholar] [CrossRef]
  34. Migliavacca, M. (2021). Keep your customer knowledgeable: Financial advisors as educators. In J. O. S. Wilson, G. A. Panos, & C. Adcock (Eds.), Financial literacy and responsible finance in the FinTech era (pp. 106–123). Routledge. [Google Scholar] [CrossRef]
  35. Morgan, P. J., Huang, B., & Trinh, L. Q. (2019). The need to promote digital financial literacy in the digital age. In Realizing education for all in the digital age. Asian Development Bank Institute. [Google Scholar]
  36. Mustofa, R. H., Kuncoro, T. G., Atmono, D., Hermawan, H. D., & Sukirman. (2025). Extending the technology acceptance model: The role of subjective norms, ethics, and trust in AI tool adoption among students. Computers and Education: Artificial Intelligence, 8, 100379. [Google Scholar] [CrossRef]
  37. R Core Team. (2025). R: A language and environment for statistical computing. R Foundation for Statistical Computing. [Google Scholar]
  38. Ryan, A., Trumbull, G., & Tufano, P. (2011). A brief postwar history of U.S. consumer finance. Business History Review, 85(3), 461–498. [Google Scholar] [CrossRef]
  39. Sabir, A. A., Ahmad, I., Ahmad, H., Rafiq, M., Khan, M. A., & Noreen, N. (2023). Consumer acceptance and adoption of AI robo-advisors in fintech industry. Mathematics, 11(6), 1311. [Google Scholar] [CrossRef]
  40. Sajid, R., Ayub, H., Malik, B. F., & Ellahi, A. (2023). The role of fintech on bank risk-taking: Mediating role of bank’s operating efficiency. Human Behavior and Emerging Technologies, 2023(1), 7059307. [Google Scholar] [CrossRef]
  41. Schueffel, P. (2016). Taming the beast: A scientific definition of Fintech. Journal of Innovation Management, 4(4), 32–54. [Google Scholar] [CrossRef]
  42. Sen, P. K. (1968). Estimates of the regression coefficient-based on Kendall’s tau. Journal of the American Statistical Association, 63(324), 1379–1389. [Google Scholar] [CrossRef]
  43. Senyo, P. K., & Osabutey, E. L. C. (2020). Unearthing antecedents to financial inclusion through FinTech innovations. Technovation, 98, 102155. [Google Scholar] [CrossRef]
  44. Shafik, W. (2025). Security, privacy, and trust in fintech. In V. Sharma, M. Gupta, N. Arora, & A. A. Shaikh (Eds.), Fintech and financial inclusion (pp. 216–233). Routledge. [Google Scholar]
  45. Stulz, R. M. (2019). FinTech, BigTech, and the future of banks. Applied Corporate Finance, 31(4), 86–97. [Google Scholar] [CrossRef]
  46. Theil, H. (1950). A rank-invariant method of linear and polynomial regression analysis. Indagationes Mathematicae, 12(85), 386–392. [Google Scholar]
  47. Vandanapu, M. K. (2024). AI-drive personalization in financial services: Enhancing customer experience and operational efficiency. Journal of Economics, Management, and Trade, 30(11), 1–13. [Google Scholar] [CrossRef]
  48. Venkatesh, V., Morris, M. G., Davis, G. B., & Davis, F. D. (2003). User acceptance of information technology: Toward a unified view. MIS Quarterly, 27(3), 425–478. [Google Scholar] [CrossRef] [PubMed]
  49. Vorm, E. S., & Combs, D. J. Y. (2022). Integrating transparency, trust, and acceptance: The intelligent systems technology acceptance model (ISTAM). International Journal of Human-Computer Interaction, 38(18–20), 1828–1845. [Google Scholar] [CrossRef]
  50. Wang, Y., Xiuping, S., & Zhang, Q. (2021). Can fintech improve the efficiency of commercial banks? An analysis based on big data. Research in International Business and Finance, 55, 101338. [Google Scholar] [CrossRef]
  51. Yadav, M., & Banerji, P. (2023). A bibliometric analysis of digital financial literacy. American Journal of Business, 38(3), 91–111. [Google Scholar] [CrossRef]
  52. Yeşilyurt, E., & Vezne, R. (2023). Digital literacy, technological literacy, and internet literacy as predictors of attitude toward applying computer-supported education. Education and Information Technologies, 28, 9885–9911. [Google Scholar] [CrossRef] [PubMed]
  53. Zhang, L., Pentina, I., & Fan, Y. (2020). Who do you choose? Comparing perceptions of human vs robo-advisor in the context of financial services. Journal of Services Marketing, 35(5), 634–646. [Google Scholar] [CrossRef]
Table 1. Trust in consumer fintech scale.
Please indicate your agreement with the following statements on a scale from 1 to 7
(1 = Strongly Disagree, 7 = Strongly Agree)
1. I would be comfortable receiving financial advice from a robo-advisor (automated online service).
2. I believe that a robo-advisor using AI would make investment decisions that are in my best interest.
3. A robo-advisor using AI would provide better financial advice than a human advisor in most situations.
4. Knowing that a robo-advisor is powered by advanced AI technology would make me more likely to use it.
5. I would trust a robo-advisor to manage my personal finances completely without any human intervention.
6. I would trust a chatbot to guide me through a financial fraud investigation process, from identifying suspicious activity to resolving the issue.
7. I would trust a chatbot to guide me through the selection of a new insurance policy, including comparing options, understanding coverage, and finalizing the purchase.
8. I am confident that a chatbot would be capable of managing and resolving an insurance claim or policy adjustment without the need for human intervention.
Table 2. Digital financial literacy scale.
Please indicate your agreement with the following statements on a scale from 1 to 7
(1 = Strongly Disagree, 7 = Strongly Agree)
1. I’m aware that I can borrow money online or via mobile phone.
2. I understand the purpose and usage of online/mobile stock trading.
3. I understand the purpose and usage of online/mobile payment services (e.g., Venmo).
4. I know how to initiate and complete necessary financial services (e.g., payments, stock trading, borrowing) via digital financial apps.
5. I know how to cancel a transaction or order when an error occurs with a digital financial app.
6. I pay attention to keeping passwords for digital finance apps safe.
7. I tend to ignore electronic messages or links sent by unknown financial institutions.
Table 3. Descriptive statistics.
Variable                      Mean    SD     Min.   Median   Max.
Trust in fintech              20.97   6.64    8      20       37
Trust in AI                   40.99   9.35   14      42       64
Technological literacy        56.93   6.73   30      58       70
Digital financial literacy    38.44   6.05   18      39       49
Financial literacy             5.54   1.89    0       6        8
Table 4. Correlation matrix.
                              Financial Literacy   Digital Financial Literacy   Technological Literacy
Financial literacy            1                    0.54 *                       0.46 *
Digital financial literacy                         1                            0.34 *
Technological literacy                                                          1
* Statistically significant at the 5% level.
Table 5. Regression results for overall model; dependent variable is Trust in fintech.
Coefficient                   Estimate   t-Stat   p-Value   VIF
Intercept                     8.69       2.08     .040
Trust in AI                   0.49       8.68     <.001     1.20
Technological literacy        0.20       1.85     .067      2.03
Digital financial literacy    0.07       0.58     .566      2.10
Financial literacy            0.24       0.78     .437      1.48
Table 6. Regression results for each independent variable; dependent variable is Trust in fintech.
                              Least-Squares              Theil-Sen
Independent Variable          t      p       R2          t      p       R2
Trust in AI                   8.72   <.001   0.398       9.64   <.001   0.447
Technological literacy        1.59   .115    0.021       1.57   .120    0.021
Digital financial literacy    1.79   .075    0.027       1.50   .136    0.019
Financial literacy            1.30   .198    0.014       1.53   .128    0.020
Each row presents results of a regression model with a single independent variable. Results are presented for both least-squares regression and the nonparametric Theil-Sen estimator.