
Opinion
Proving truth in a world of digital forgeries
2025 underscores a critical lesson: technology doesn’t replace judgment but enables justice to operate when evidence can no longer be taken at face value.
In 2025, a turning point is emerging in the relationship between digital reality and the ability of law enforcement systems to fulfill their mission. This is no longer an academic discussion: according to an official publication in the government procurement arena, the Israel Police examined AI-based solutions for detecting deepfakes and identifying edits in images and videos - a signal that distinguishing truth from forgery has become an operational need. At the same time, the National Cyber Directorate warned that deepfake-based video scams are developing into a real threat to organizations and the public. A deepfake, simply put, is a video or recording in which a person appears or sounds as if they said or did something that never happened, because the content was created or altered with AI in a way that blurs the line between documentation and forgery. Hence the claim: in an era in which crime and evidence have moved into the digital realm, technological tools are the key to uncovering truth - and to doing justice.
First, we need to understand how the DNA of crime has changed. Crime today operates like a digital platform: faster, at broader scale, and with an improved ability to hide behind layers of online identities and infrastructure. Offenses are carried out and coordinated on platforms - using fake identities, cloud services and social networks - and they leave behind not a single piece of testimony but a web of messages, metadata, files, locations and connections. Without digital capability, an investigation is not only slow; it struggles to decipher the arena in which the relevant material was created. In this sense, technology is not a shortcut but a tool that expands the field of view and makes it possible to assemble a factual picture from information scattered across many sources.
Second, the decisive value of digital investigation tools lies in channeling human attention correctly: reducing hours spent sorting and freeing the investigator for the work in which there is no substitute for a person - professional decisions, in-depth investigation, understanding context, and meeting victims and witnesses. The digital realm generates enormous volumes of information that obscure what matters; the central achievement is therefore not "more collection" but translating overload into investigative direction: building a timeline, cross-checking versions, identifying connections between entities and events, and spotting anomalies that focus the effort. This does not replace judgment; it creates the conditions for judgment to operate, returning to the human being exactly what the system truly needs from them.
Third, and this may be the clearest sign of 2025: digital evidence has lost its "presumption of truth." If we once assumed that video is documentation that is hard to dispute, this year made clear how much that assumption has eroded: it is possible to produce fake videos and voices that look and sound real. Law enforcement systems are therefore required to add a new layer of verification - not only what the content shows, but where it came from, whether it was edited, and what can be proven about its reliability. In this context, the fact that the Israel Police is examining tools for deepfake detection illustrates a basic need: to protect the process from manipulation and prevent decisions that rely on misleading material. Precisely when truth becomes harder to identify, technological tools are not a substitute for justice - they are a condition for basing it on reliable, transparent and verifiable facts.
The Israeli picture also connects to a global trend. The FBI's IC3 report published in April this year illustrates the scale: in 2024, about 860,000 cybercrime complaints were received in the U.S., and reported damages crossed 16 billion dollars - a 33% increase over 2023. In Europe, Europol - the European Union's law enforcement agency - warned in March 2025 that AI increases the operational capability of organized crime, including through sophisticated fraud and impersonation, the use of deepfakes, and voice cloning. When crime adopts tools that increase output and blur traces, enforcement systems must equip themselves with capabilities that restore the relative advantage - to reach the truth, establish evidence, and present it in a way that can be examined.
From here, a look ahead: 2026 will be a year of transition from expanding access to information to the ability to work with it correctly: to verify, cross-check, document and explain. Not only to improve effectiveness, but to protect the fairness of the process and public trust. Technology does not replace investigation, prosecution and adjudication; it enables them to operate in a world where truth is no longer self-evident and evidence lives in a digital space that can itself be forged. In this sense, 2025 offered not just another trend but an institutional reminder: without advanced technological infrastructure, justice becomes harder to achieve; with it, justice can remain possible, precise, and verifiable. In a world where almost anything can be forged, the ability to prove truth is a condition for justice.
Ronnen Armon is the Chief Products & Technologies Officer at Cellebrite.