
The $27.2 billion problem: Why AI is reshaping digital identity and trust

Published: Jan 14, 2026

Category: Agreements

Read time: 4 mins

Digital signing and digital identity

AI is reshaping digital identity and exposing weaknesses in traditional digital trust signals, creating new risks for organizations and the agreements they rely on.

You can also read this article in Français, Português, Español and Tiếng Việt.

Table of Contents

  • 1. The democratization and scale of identity fraud

  • 2. A foundational threat to business

  • 3. Identity must be proven, not assumed

  • 4. Your guide to rebuilding digital trust



AI is altering the way work gets done — rapidly, profoundly, and permanently. Processes that once took weeks of manual effort now take hours thanks to AI-driven automation.

At Lumin, we embrace this shift. From simplifying agreement generation to streamlining how documents are created, edited, and signed, we’re employing AI throughout the document workflow to help organizations work smarter and faster. But AI also introduces its own set of problems.

AI has weaponized deception at scale. Deepfake video and audio are proliferating, and document and signature fraud have never been easier to execute. As a result, trust in digital systems is eroding across nearly every industry, and rebuilding that trust requires moving beyond assumptions about identity to verifiable proof of who we are really interacting with.

The democratization and scale of identity fraud

The threats organizations face today look very different from those of just a few years ago. Fraudsters once needed complex operations to forge documents or impersonate executives, but AI has made deception cheap, accessible, and far more sophisticated.

Consumers lost a massive $27.2 billion to identity fraud in 2024. Businesses have fared just as badly, with the average organization reporting a loss of $7 million annually to identity fraud. And the problem is accelerating rapidly: financial institutions report a staggering 2,137% increase in deepfake fraud attempts over the last three years.

Recent real-world examples show how this type of fraud plays out and the impact it can have. In Italy earlier this year, business leaders were targeted with calls using an AI-cloned voice of the country's Defence Minister. The scammers claimed Italian journalists had been kidnapped and solicited urgent funds for their release, leading one high-profile businessman to transfer nearly €1M.

In 2024 in Hong Kong, a finance worker attended what appeared to be a routine video conference with their company's CFO and senior staff — every other participant was an AI-generated deepfake. The realistic meeting convinced the worker to authorize 15 transactions totaling HK$200M to fraudulent accounts.

A foundational threat to business

The impact of AI-powered identity fraud goes beyond financial loss to an erosion of trust in the agreements organizations depend on. When they can no longer trust the critical point of agreement — the signing by each party involved — the effects are felt across the entire business.

Exposure to legal and compliance risk

For high-stakes transactions, a lack of (or insufficient) identity verification at the signing stage creates significant legal exposure:

  • Organizations face increased legal disputes because they lack certainty about signature authenticity.
  • Organizations in regulated industries face increasing scrutiny, and traditional eSignatures often result in gaps in compliance and security.
  • The breakdown of identity assurance means digital business processes can no longer be relied on for the agreements that matter most.

Increased operational costs and delays

The lack of identity assurance leads directly to operational friction:

  • Organizations face longer contract cycles due to manual checks, follow-up calls, or re-signing procedures required to address security concerns.
  • There is a retreat to paper-based processes, which undermines the efficiency gains of digital transformation.
  • Organizations experience higher operational costs due to extensive manual verification of signature authenticity for critical documents.

Sector-specific impacts

This pattern of vulnerability is consistent across industries:

  • Financial services: Loan officers question whether loan applications are from real borrowers or synthetic identities.
  • Professional services: Lawyers worry about the authenticity of client signatures on high-stakes agreements.
  • Public sector: Administrators are unable to verify that citizens are who they claim to be when signing statutory declarations or applying for permits.

Identity must be proven, not assumed

AI makes identity fraud easier to create and harder to detect, which means trust can no longer rely on surface-level identity cues. The solution isn’t to pull back from digital processes, but to reinforce them with stronger assurance.

This is a critical challenge for agreements, where even a single mis-attributed identity can compromise compliance, security, or financial outcomes. Instead of relying on inbox access or device possession as proof of identity, organizations can adopt cryptographic methods that bind a verified, biometric-backed credential from the signer’s device directly to the specific document being approved.

This approach, exemplified by Verified Digital Signing, restores confidence in digital agreements without adding unnecessary friction. It anchors every approval to a verified individual and cryptographically links the signer, the document, and the moment in time, allowing trust, compliance, and efficiency to advance together.
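The binding described above can be sketched in a few lines of code. This is a simplified, hypothetical illustration, not Lumin's or MATTR's actual implementation: it uses an HMAC with a per-device secret as a stand-in for the asymmetric, device-bound credential a real system would use, but it shows the core idea of cryptographically linking the verified signer, the exact document bytes, and the signing moment into one tamper-evident record.

```python
import hashlib
import hmac
import json

# Hypothetical sketch: bind a verified signer identity, a document digest,
# and a timestamp into one signed record. A production system would use an
# asymmetric key held on the signer's device; HMAC keeps this stdlib-only.

def sign_agreement(device_secret: bytes, signer_id: str, document: bytes) -> dict:
    claims = {
        "signer": signer_id,                                 # verified identity, not an inbox
        "doc_sha256": hashlib.sha256(document).hexdigest(),  # binds this exact document
        "signed_at": 1736812800,                             # fixed timestamp for a reproducible demo
    }
    canonical = json.dumps(claims, sort_keys=True).encode()
    record = dict(claims)
    record["signature"] = hmac.new(device_secret, canonical, hashlib.sha256).hexdigest()
    return record

def verify_agreement(device_secret: bytes, record: dict, document: bytes) -> bool:
    claims = {k: v for k, v in record.items() if k != "signature"}
    if claims["doc_sha256"] != hashlib.sha256(document).hexdigest():
        return False  # document was altered after signing
    canonical = json.dumps(claims, sort_keys=True).encode()
    expected = hmac.new(device_secret, canonical, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, record["signature"])

record = sign_agreement(b"device-bound-secret", "did:example:alice", b"Lease agreement v3")
print(verify_agreement(b"device-bound-secret", record, b"Lease agreement v3"))   # True
print(verify_agreement(b"device-bound-secret", record, b"Lease agreement v3!"))  # False
```

Because the signature covers both the document digest and the timestamp, changing even one byte of the agreement after signing invalidates the record, which is what lets signer, document, and moment in time be verified together rather than assumed.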

Your guide to rebuilding digital trust

To understand how organizations are addressing the identity assurance gap and rebuilding trust in their digital workflows, download our white paper, Securing digital agreements in the age of AI deception.

Developed in partnership with MATTR, recognized pioneers in TrustTech, the white paper delivers real-world examples, industry insights, and a practical framework for strengthening digital trust in an AI-driven threat landscape.

Meet our author

Joel Foster, Chief Commercial Officer at Lumin

Joel Foster is Chief Commercial Officer at Lumin, where he helps guide the company’s strategic direction and growth. He works closely with enterprise and public sector organizations on complex, high-assurance agreement needs, while also collaborating with ISV partners on platform integrations.

Joel plays a key role in shaping Lumin’s approach to digital trust and strengthening the security, efficiency, and trustworthiness of agreement workflows.

