READ THE SIGNALS

March 31, 2026

How to recognize change before it becomes evident

There are decisions that are not based on conclusive evidence or definitive data, but on the perception that something has started to shift, even if it is not yet clearly visible. There is no complete proof, but there is a disturbance in the balance. Something no longer fits exactly as it did before. A small friction appears, an inconsistency, an anomaly that has not yet reached the status of fact but is beginning to demand attention. Interpreting signals consists of recognizing these alterations before the change becomes evident to everyone.


In extreme environments, this capability stops being a competitive advantage and becomes a condition for survival. Above eight thousand meters, the available oxygen is roughly one third of normal levels, fatigue affects judgment, and the margin for error disappears. There are no perfect indicators or clear warnings. There are small perceptions that do not fully make sense. Waiting for the problem to become obvious may be too late. The difference does not lie in having more information, but in knowing how to interpret what is already happening, even if it has not yet taken full shape.


Ed Viesturs formulated a simple rule that captures this logic. Reaching the summit is optional. Returning is mandatory. Its value does not lie in superficial caution, but in the cognitive discipline it requires. When the summit is close, when the effort invested has been immense, and when success feels within reach, turning back is not intuitive. Yet there are moments when the right decision is not based on an obvious signal, but on an accumulation of weak indications. The signal at that point is not dramatic. It is sufficient only for those who have learned to distinguish between apparent normality and emerging deterioration.


This pattern repeats across contexts. Deep transformations do not begin with visible announcements or conclusive data. They start at the margins, in scattered behaviors, in changes that do not yet affect the main indicators. The common mistake is to wait for the system to confirm the change before acting. By that moment, much of the transformation is already underway. The future tends to appear first as a weak signal and only later as robust evidence.


In the late 1970s, Akio Morita made a decision that did not respond to any explicit demand. While the audio industry was moving toward increasingly sophisticated and feature-rich devices, he proposed something that seemed limited. A lightweight, portable device, without recording capability or speakers, designed solely for listening to music on the move. From a conventional perspective, it seemed like a mistake. However, Morita was observing an emerging shift in everyday life. Mobility was beginning to reorganize personal time, and music was no longer a static experience. He did not respond to an existing demand, but to a weak signal correctly interpreted. The Walkman was not an incremental improvement, but the materialization of a change that had not yet been fully articulated.


This type of decision forces a reconsideration of a common assumption. In complex environments, the problem is rarely the lack of information, but the difficulty of giving it meaning. Karl Weick explained this through the concept of sensemaking, showing that individuals and organizations must construct interpretive frameworks to understand what is happening. Reality does not arrive organized. It comes fragmented, ambiguous, and sometimes contradictory. Interpreting signals involves connecting dispersed elements before a coherent narrative exists.


The difficulty increases when previous frameworks have worked well for a long time. The more successful a model has been, the harder it becomes to abandon it. Organizations continue to measure what they have always measured and validate what fits existing categories. Meanwhile, many relevant signals remain outside because they cannot easily be translated into those indicators. Important transformations often appear first at the periphery before becoming visible at the center of the system.


Nassim Nicholas Taleb argued that the most significant events are often those that do not fit existing models and are therefore ignored. This does not mean everything is unpredictable, but that our frameworks are inherently incomplete. As a result, the most important signals are not always the most visible. Noise is immediate and amplified, while signals are subtle, incipient, and often counterintuitive. Detecting them requires sustained attention and a degree of independence from dominant expectations.


The history of science offers clear examples. Rachel Carson identified the cumulative effects of pesticides at a time when environmental awareness was not yet established. There was no recognized crisis, only scattered indications that could easily be dismissed in isolation. Her contribution was not only scientific, but interpretive. She connected these signals and turned them into a coherent narrative. She transformed anomalies into patterns, and patterns into warnings. That is the essence of interpreting signals.


However, perceiving a signal does not guarantee impact. There is a second difficulty that often goes unnoticed. A correctly interpreted signal may fail to generate action if it cannot be translated into a language others can understand. The myth of Cassandra illustrates this limitation. It is not enough to anticipate. That anticipation must be heard. Between individual perception and collective action lies a space where interpretation must become communicable and legitimate.


This problem intensifies in organizational contexts. Irving Janis described groupthink as a dynamic that reduces the ability to question collective decisions, even when contrary signals exist. Barry Staw analyzed escalation of commitment, showing how individuals persist in a course of action despite opposing evidence. In both cases, signals do not disappear. They lose their ability to influence decisions. The system does not stop seeing, but it stops reacting.


For this reason, interpreting signals is not only an individual capability. It also depends on the environment. It requires contexts where assumptions can be questioned, doubts can be introduced, and interpretations can be revised without immediate penalty. A system that prioritizes confirmation over exploration gradually loses sensitivity. The ability to anticipate does not depend only on talent, but on the conditions that allow weak signals to be recognized and discussed.


At an individual level, this process requires a different relationship with uncertainty. Weak signals do not appear as clear evidence, but as partial perceptions or inconsistencies. The challenge is not to accept every intuition, but to develop the judgment to decide which ones deserve attention. This involves tolerating ambiguity without resolving it too quickly. The urgency to close an interpretation can prematurely eliminate a relevant signal.


At this point, the connection with Minority Report becomes particularly meaningful. In that universe, crimes are anticipated before they occur through fragmented visions of the future. The premise is extreme, but it raises a critical question for decision-making. What happens when a signal is incomplete, ambiguous, or open to multiple interpretations? The film introduces the concept of the minority report, an alternative version that does not align with the dominant interpretation. The key is not the existence of the signal, but how it is interpreted and the level of certainty assigned to it.


This idea connects directly with reality. Weak signals are rarely unambiguous. They allow for multiple readings and may point in different directions. The risk is not only ignoring them, but interpreting them too quickly or assuming a single interpretation as definitive. In Minority Report, the system fails not because of a lack of information, but because of the rigidity of its interpretive framework. The problem is not the signal, but the inability to coexist with its ambiguity.


Interpreting signals does not mean predicting the future, but recognizing when the present is beginning to transform. It means paying attention to what does not yet dominate the system but is already altering its structure. It implies accepting that meaningful change rarely begins as consensus or majority. It begins as deviation, as anomaly, as an emerging pattern.


The future does not appear suddenly. It is built progressively through small modifications that, in their early stages, are barely perceptible. Most people ignore them. Some perceive them but do not act. Only a few interpret them in time. In that difference lies the capacity to navigate uncertainty.



When change becomes fully evident, what matters most has already happened. At that point, interpretation is no longer enough. Only reaction remains.

