By Élise Moreau, European Affairs Analyst at Phoenix24
Brussels, July 2025 —
At Europe's external borders, where Frontex operates, algorithms work tirelessly. Faces are scanned, documents cross-checked, movement patterns evaluated in milliseconds by systems promising security and efficiency. But what these systems fail to detect, what they cannot translate into data, are the silences, fears, and fragile acts of resistance embodied by migrant women who cross Europe with no guarantees, yet everything to lose.
In today's Europe, where Schengen's promise of free movement is being reshaped by securitarian logics and the rapid expansion of biometric control, a new kind of border has emerged: invisible, digital, and profoundly unequal. It is no longer just a physical fence that blocks access. It is also the code, unseen and unaccountable, that decides whether a woman can move, be protected, or even be believed.
In my reporting across Calais, Lampedusa, and the tense crossings between Poland and Belarus, I have met dozens of displaced women. Some flee wars, others escape domestic violence or regimes that criminalize their existence. But when they arrive in Europe, they do not find refuge—they enter a system. They are processed, categorized, and digitized. Often, their gender becomes a liability rather than a consideration. Are they traveling with children? Do they bear physical signs of abuse? Does their testimony fit the “verifiable profile” recognized by risk-assessment software?
Systems like ETIAS (European Travel Information and Authorization System), alongside databases such as EURODAC and VIS, are increasingly governed by algorithmic logic that reproduces structural bias. According to watchdogs like AlgorithmWatch and Access Now, these technologies frequently lack strong transparency and gender accountability frameworks. The intersection of migration, surveillance, and institutional sexism remains a blind spot in European policy.
As digital infrastructure grows stronger, human support structures grow weaker. There are fewer interpreters for African languages, fewer trauma-informed psychologists, fewer women-led community shelters with adequate resources. In their place: more cameras, more sensors, more automated forms. Digitization is not the enemy, but deploying it without social awareness becomes a new form of exclusion.
Europe, which champions itself as a beacon of human rights, cannot ignore this contradiction. A border that scans faces but does not hear stories is not neutral—it is an instrument of power. And when that power is exercised over racialized, feminized, and precarious bodies, the harm is not merely institutional—it is moral.
It is also economic and political. Much of Europe’s border surveillance software is developed by private corporations with multimillion-euro contracts, limited parliamentary oversight, and opaque ownership structures. Public accountability is weak, while technological expansion is exponential. Who is programming the future of mobility? Under what values?
What happens at the border is no longer just about asylum policy—it is about data politics. Decisions are based on algorithmic risk scores, using machine learning systems trained on past cases but stripped of context. In this statistical learning, women fall through the cracks—not for what they’ve done, but for what they cannot prove, for what does not fit, for what the machine cannot read.
If things continue as they are, Europe will perfect a new type of border: efficient, automatic, and blind to gender. But if disruption occurs, whether through a legal scandal, a ruling by the Court of Justice of the European Union, or a transnational alliance of women who document and resist, this model may begin to crack. In that bifurcated future, Europe will have to choose between a data-driven architecture of control and a human-centered governance model grounded in dignity.
Because behind every biometric scan, there is a story. And behind every invisible border, there is a political responsibility that no algorithm can carry.
Élise Moreau, French investigative journalist and international correspondent at Phoenix24. Specialist in European affairs, gender equity & digital democracy.