
FANUC, the Japanese industrial robot manufacturer, has since the early 2000s operated factories that produce robots in complete darkness. No light, no heating, no humans present during production. The machines manufacture other machines for weeks without intervention. What the industry calls a dark factory, or lights-out factory, is not a metaphor. It is a decision about what the workspace must contain.
The constraint of visibility disappears with the worker. Light was infrastructure for human eyes; without eyes, it becomes a cost without function. But light was not merely light. It made it possible to see smoke, an oil leak, a mispositioned part, a fraying belt. These observations were never formalized in procedures. They had no name in safety manuals. They occurred in passing, made by workers whose presence in the space had never been assigned a detection function.
Reliability engineering distinguishes between detectable failures and latent failures. A latent failure exists without manifesting itself; it awaits the conditions that will make it visible. In a factory with workers, some latent failures became detectable before becoming catastrophic, not because a protocol prescribed it, but because a human body was there and perceived something abnormal. That body was a measuring instrument: uncalibrated, undocumented, and unrecognized as such.
Modern predictive maintenance systems seem to address this problem. Accelerometers detect the vibratory signatures characteristic of imminent failures, long before any sign is perceptible to a human. Machine-learning algorithms identify correlations that no one would have seen. But these systems detect what they were designed for: failures whose signature has been previously characterized, whose pertinent parameters have been identified, whose data someone decided to collect. Predictive maintenance is the formalized detection of the known unknown. It does not cover the unknown unknown, the failure whose signature no one has yet observed because it has not yet occurred in a sufficiently instrumented system.
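The limit described above can be made concrete in a minimal sketch. The fault names, thresholds, and sensor windows below are illustrative assumptions, not any real system's parameters; the point is structural: diagnosis is a lookup against signatures characterized in advance, so an abnormal pattern with no entry in the library goes undetected.

```python
# A toy predictive-maintenance diagnoser. Each known failure mode is a
# pre-characterized test over a window of vibration readings (in g).
# Thresholds and fault names are hypothetical.
KNOWN_SIGNATURES = {
    "bearing_wear": lambda window: max(window) > 4.0,          # sharp peak
    "imbalance":    lambda window: sum(window) / len(window) > 1.5,  # high mean
}

def diagnose(window):
    """Return the known failure modes whose signature matches the window.
    A failure mode absent from KNOWN_SIGNATURES cannot be returned."""
    return [name for name, test in KNOWN_SIGNATURES.items() if test(window)]

print(diagnose([0.8, 0.9, 1.0, 0.9]))  # normal operation: []
print(diagnose([0.2, 4.5, 0.3, 0.2]))  # matches a known signature: ['bearing_wear']
print(diagnose([0.1, 2.9, 0.1, 2.9]))  # abnormal alternation, no known signature: []
```

The third window is visibly wrong to anyone looking at it, yet the diagnoser returns nothing, because no one characterized that pattern in advance.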
Redundancy by automatic shutdown follows the same logic. A parameter crosses a pre-established threshold; the system stops. This presupposes that the failure produces a measurable deviation in a monitored parameter, that this deviation precedes the tipping point by a sufficient interval, and that the threshold has been correctly calibrated. What the worker captured in passing had no threshold. It had no parameter. It had a presence in space.
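The presupposition can be stated as code. In this sketch (parameter names and limits are assumptions for illustration), the shutdown logic is blind by construction to anything outside its monitored set; a reading it was never configured to watch simply does not exist for it.

```python
# Toy automatic-shutdown check: halt only when a *monitored* parameter
# exceeds its pre-calibrated threshold. Names and limits are hypothetical.
MONITORED = {"temperature_c": 90.0, "vibration_g": 4.0}

def should_shut_down(readings):
    """True if any monitored parameter exceeds its threshold.
    Readings for parameters not in MONITORED are ignored entirely."""
    return any(readings.get(param, 0.0) > limit
               for param, limit in MONITORED.items())

print(should_shut_down({"temperature_c": 95.0, "vibration_g": 1.0}))   # True
print(should_shut_down({"temperature_c": 60.0, "smoke_density": 9.9})) # False
```

The second call reports a severe smoke reading, but since "smoke_density" has no threshold, the system keeps running: detection coverage was fixed at design time.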
Reason (1990) distinguishes active errors from latent errors in complex systems. Active errors are committed by operators in direct contact with the system. Latent errors are deficiencies buried in the organization, invisible until the moment they combine with other factors to produce an accident. The dark factory eliminates active errors by eliminating human operators. It does not touch latent errors, and it adds a third category: failures that only a non-specialized presence would have detected.
Doctrine
The worker was not only a producer. He was a sensor. His withdrawal does not merely automate production. It blinds a part of the detection system that no one had formalized as such, because no one had imagined his absence.
Open vector
Predictive maintenance and redundancy cover failures whose form is known. They displace the boundary between the known unknown and the unknown unknown without eliminating it. If human presence in an industrial space constitutes an undocumented detection instrument, then any space from which humans are progressively withdrawn loses a detection capacity whose extent no one knows precisely before it disappears. Hospitals that reduce their night staff, control rooms that switch to remote supervision, electrical grids piloted by algorithm: in each case, something ceases to be perceived. What remains on the other side of this boundary does not necessarily diminish. It changes form. The question is whether this transformation is measurable before an accident reveals its extent.
