In 2015 and 2016, then-candidate Trump made a variety of opaque statements suggesting support for a so-called “Muslim registry” that would presumably track both U.S. citizens and non-citizen visitors based on their religion. Much of the ensuing analysis about the likelihood and constitutionality of such an endeavor has focused on two possibilities: (1) Self-registration by (or registration of) Muslims and (2) Revival of a post-9/11 Bush-era database known as NSEERS that stored data on individuals visiting from a set of majority-Muslim countries.
As for self-registration, most commentators have concluded that the Trump Administration is unlikely to make such an overt move. Many have also argued that doing so would be unconstitutional under the Free Exercise and Establishment Clauses, as well as the Equal Protection Clause and the Fifth Amendment's equal protection guarantee.
And as for revival of NSEERS, opinion is divided. A number of law professors have argued that a revived program would pass constitutional muster because of judicial deference to the Executive on matters of foreign policy and immigration enforcement. Other scholars, however, have argued that evidence of anti-Muslim animus behind such a facially neutral database could render it unconstitutional. These debates are related to the ongoing legal battle over the travel ban.
But the Trump Administration need not actively register a single Muslim, nor reboot the NSEERS database, to build a Muslim registry: it likely already has the tools it would need to predict, with high accuracy, the religious identity of a significant percentage of U.S. citizens and visitors.
The key insight here is that the combination of “big data” and predictive algorithms enables a small set of software engineers—whether working directly with the government or indirectly through an unscrupulous vendor—to use databases constructed for other valid purposes, such as deportation prioritization and national security, to predict who is Muslim. Such engineers could create one aggregate “Muslim registry” or, less crudely and less detectably, build religious identity into the targeting approaches of individual programs. For example, Immigration and Customs Enforcement (ICE) could add deportation-prioritization points for those who are predicted or known to be Muslim.
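To make that mechanism concrete, here is a minimal sketch of how a predicted religious identity could be quietly folded into an existing prioritization score. Everything in it—the field names, the weights, the very existence of such a scoring function—is hypothetical and invented for illustration; nothing here describes any actual ICE system.

```python
# Hypothetical sketch only: field names, weights, and the scoring function
# itself are invented for illustration, not drawn from any real system.

def deportation_priority_score(record: dict) -> float:
    """Score a (hypothetical) case record for enforcement prioritization."""
    score = 0.0
    if record.get("criminal_record"):
        score += 50.0
    if record.get("prior_removal_order"):
        score += 30.0
    # The quiet, problematic step: extra points keyed to a predicted
    # religious identity (here, a probability from some upstream classifier).
    score += 20.0 * record.get("predicted_muslim_probability", 0.0)
    return score

# Two otherwise-identical records diverge solely on the predicted attribute.
a = {"criminal_record": False, "prior_removal_order": False,
     "predicted_muslim_probability": 0.9}
b = {"criminal_record": False, "prior_removal_order": False,
     "predicted_muslim_probability": 0.0}
print(deportation_priority_score(a))  # 18.0
print(deportation_priority_score(b))  # 0.0
```

Because the religion signal would be just one weight among many inside a larger program, it could shape outcomes without ever appearing as a standalone “registry.”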
How does this work?
Building a predictive database of Muslims depends on: (1) Access to databases tracking individuals; (2) A variety of contextual information about those individuals that could be used to predict religion; and (3) Software engineers with experience writing algorithms to make such predictions.
Happily for a government wishing to accomplish these goals, Palantir, the “big data” company founded by Peter Thiel, advertised such capabilities nearly a decade ago: in a 2008 pitch to GCHQ (the United Kingdom’s equivalent of the NSA), “Palantir engineers showed how their software could be used to identify [members of a] religious sect and graph their social relationships.”
And since that time—as reported by the media and revealed through FOIA requests—Palantir and others have constructed a number of systems for the U.S. government that demonstrate it has each of the three capabilities described above. While it could be argued that in some circumstances each of these systems is perfectly legal, morally defensible, and perhaps even life-saving, the Trump Administration might misuse them in illegal and unethical ways.
According to The Verge, one such system is the Analytical Framework for Intelligence (AFI). This system “tracks and assesses immigrants and other travelers, according to public records.”
It uses “top-secret algorithms that process personal data to assess travelers and would-be immigrants. This helps federal authorities determine a person’s eligibility to travel into — or even within — the United States . . . . CBP lends out access credentials for [AFI] to other law enforcement agencies, including [ICE] . . . .”
The Verge has also described another system, called FALCON, that Palantir created for ICE.
Like AFI, FALCON “stores and analyzes information it receives from databases kept by various government agencies,” and is “used by agents within ICE’s Office of Homeland Security Investigations (HSI).” According to The Intercept, FALCON “does not segregate data contained within individual data sets when searches are performed.” Instead, “if a user searches on a particular Person, Event, or Object, all records connected to that Person, Event, or Object which are accessible to FALCON are called up.” The system allows HSI agents to “use FALCON . . . to pull data from offices within the Department of Homeland Security, the FBI, and other sources that include information on foreign students, family relationships, employment information, immigration history, criminal records, and home and work addresses.”
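The practical meaning of “does not segregate data” is easy to see in code. The sketch below is purely illustrative—the dataset names, fields, and records are invented, and FALCON’s actual internals are not public—but it shows the basic pattern: a single query on a person identifier pulls every linked record from every accessible dataset at once, rather than requiring separate, siloed searches.

```python
# Illustrative sketch of an "unsegregated" search across linked datasets.
# Dataset names, fields, and records are invented; FALCON's actual
# internals are not public.

DATASETS = {
    "foreign_students":   [{"person_id": "P-1001", "school": "Example University"}],
    "employment_records": [{"person_id": "P-1001", "employer": "Acme Corp"}],
    "addresses":          [{"person_id": "P-1001", "home": "123 Main St"},
                           {"person_id": "P-2002", "home": "9 Elm Ave"}],
}

def search_person(person_id: str) -> dict:
    """Return every record tied to person_id from all accessible datasets."""
    return {
        name: [r for r in records if r["person_id"] == person_id]
        for name, records in DATASETS.items()
    }

# One search surfaces schooling, employment, and address data together.
print(search_person("P-1001"))
```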
And according to The Verge, both AFI and FALCON may also have access to old NSEERS data.
Additionally, as reported by The Intercept, a new system called ICM (Investigative Case Management), to be completed this fall, will allow ICE agents to “access a vast ‘ecosystem’ of data to facilitate immigration officials in both discovering targets and then creating and administering cases against them.” The system gives its users access to a variety of other government “intelligence platforms maintained by the Drug Enforcement Administration, the Bureau of Alcohol, Tobacco, Firearms and Explosives, the Federal Bureau of Investigation, and an array of other federal and private law enforcement entities. It can provide ICE agents access to information on a subject’s schooling, family relationships, employment information, phone records, immigration history, foreign exchange program status, personal connections, biometric traits, criminal records, and home and work addresses.”
Thus, the Trump Administration already has at its disposal: (1) a number of databases containing myriad personal information; (2) the ability to interconnect at least some of these databases; (3) the ability to assess, profile, and make predictions about individuals (potentially including their religion); and (4) the ability to do so for a significant number of U.S. citizens and visitors.
For example, a predictive algorithm might start with a set of individuals whose Muslim religion is already known. Using reams of information about those individuals—public records, social media behavior and profiles, residential location, country of origin, social connections, and so on—the algorithm could look for other individuals with similar patterns and “predict” that those individuals are Muslim. And given sufficiently advanced techniques, algorithms might also predict—at a lower confidence level—who is Muslim without any “starter set” of known Muslims.
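In machine-learning terms, this is ordinary supervised classification. The sketch below uses synthetic data and off-the-shelf scikit-learn tooling purely to illustrate the pattern—train on individuals whose religion is known, then flag high-confidence predictions among everyone else. It does not reflect any actual government system or data.

```python
# Illustration of the "starter set" approach as plain supervised learning.
# All data here is synthetic; a real system would derive features from the
# kinds of records described above (location, origin, social ties, etc.).
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# 1,000 individuals with known labels, each described by 5 numeric features
# (standing in for encoded country of origin, residence, graph statistics).
X_known = rng.normal(size=(1000, 5))
y_known = (X_known[:, 0] + 0.5 * X_known[:, 1] > 0).astype(int)  # synthetic labels

model = LogisticRegression().fit(X_known, y_known)

# Apply the model to individuals whose religion is unknown, keeping only
# high-confidence predictions for the hypothetical registry.
X_unknown = rng.normal(size=(20, 5))
probs = model.predict_proba(X_unknown)[:, 1]
flagged = np.flatnonzero(probs > 0.9)
print(flagged, probs.round(2))
```

Nothing in this pattern is exotic: it is the same basic machinery used for ad targeting and credit scoring, simply pointed at a different attribute.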
Software engineers who create such algorithms may not even know exactly why certain individuals end up predicted to be Muslim; such is the nature of machine learning. And the fact that the predicted attribute here is religion—rather than, say, predicted propensity for criminal conduct—is not particularly remarkable from a technological standpoint: big data algorithms could work in the same fundamental way to predict nearly any attribute of interest.
Importantly, both Peter Thiel and Palantir’s CEO have publicly declared that Palantir will not help to build a Muslim registry. But this does not end the story. First, to the extent that interconnected databases with large amounts of information tied to individuals already exist, the government does not necessarily need Palantir to write the algorithms that use those databases to predict who is Muslim. A small number of in-house software engineers could do this, as could less scrupulous (and less scrutinized) vendors.
Additionally, secondment programs, such as the Presidential Innovation Fellows program, could be used to temporarily move employees out of the official employ of private vendors and into the federal government to help build a database, thereby providing formalistic deniability to private vendors. Companies would want this deniability not only to avoid a public backlash, but also to avoid blowback from their employees—who, like much of Silicon Valley, would likely recoil at being associated with such a project.
Potential Constitutional Issues
Building a database based on predicted or known religion—or using religion to supplement existing databases (I will henceforth use “building a database” as shorthand for both)—would arguably violate equal protection guarantees. (Though outside the scope of this post, others have suggested that building a registry would also violate both Religion Clauses.)
In interpreting the Equal Protection Clause, at least one federal Court of Appeals has determined that, like governmental race-based classifications, religion-based classifications are subject to strict scrutiny.[1] This demanding test provides that a religious classification may be upheld only if it is narrowly tailored to further a “compelling” governmental interest. Several other courts have held that religion-based classifications are subject to “heightened” scrutiny, while reserving judgment on whether this means strict or intermediate scrutiny.[2] (Under intermediate scrutiny, the government interest need only be “important” and the means need only be “substantially related” to that interest.) Whichever rule is held to apply, the government must make a powerful showing to justify any religion-based classification.
Building a database based on predicted or known religious identity would be a paradigmatic example of an impermissible religion-based classification: individuals would be included in the database based solely on the presumption that their Muslim religious identity is somehow inferior, dangerous, or otherwise stigmatizing. Put differently, a person’s predicted religion would be the sole basis for how they are treated in a government program (and potentially, as a result, in many other government programs). As noted elsewhere, in the recent case Hassan v. City of New York, the Third Circuit recognized that an NYPD program that made individuals eligible for surveillance based on their Muslim religion was a facially discriminatory classification.[3]
Given this impermissible classification, the question then becomes whether the government has adequate justification under intermediate or strict scrutiny. The government would almost certainly claim that its program is justified by an interest in national security.
However, a court likely would hold that such a broad religious classification is neither narrowly tailored nor substantially related to national security. The litany of databases and surveillance programs the government has already designed to combat terrorism demonstrates that it has far narrower means of protecting the nation than crudely classifying individuals based on their religion. The Third Circuit recognized as much in Hassan, while expressing serious doubt as to whether a Muslim-targeted surveillance program would be even substantially related to the government’s stated security interests.[4]
And although at least one representative of then-candidate Donald J. Trump cited the Japanese Internment Case—Korematsu v. United States (1944)—as precedent supporting a Muslim registry, that case is widely regarded as part of the constitutional “anti-canon.” As Ian Samuel and Leah Litman have noted on Take Care, Korematsu is properly deemed “one of the greatest blunders in American legal history,” not a precedent on which to base new discrimination.
Accordingly, any effort to build a religion-based database would almost certainly be struck down as unconstitutional—either under the principles just discussed, or because the government may not act on the basis of animus toward a religious minority. As discussed next, perhaps the bigger problem is whether we will know if it’s happening.
Adjudication May Effectively Rest in the Hands of Software Engineers
In contrast to President Trump’s very public travel bans, if the government were to do what I’ve suggested is possible, we might not know about it in real time, if ever. Unlike visa policy, which must be publicly declared and then implemented by thousands of government employees, the undertaking described above could be executed in secret by just a few people.
Thus, responsibility for stopping unconstitutional behavior may—in the first instance, at least—rest with software engineers. People tasked with any projects similar to those described here should take it upon themselves to think through whether what they are doing might be driven by an impermissible religious purpose, or might be used for an impermissible religious end. If they conclude that they are being put to work for such problematic reasons, it might be prudent to raise concerns with superiors or in-house lawyers, refuse to execute the work, or perhaps even blow the whistle (but only if doing so would be lawful). (To be clear: I am not a lawyer, I am not a member of any bar, and this statement is not legal advice and should not be relied upon as such.)
* * * * *
A web of already-existing databases may give the Trump Administration the ability to construct an imperfect, yet serviceable (and unconstitutional) “Muslim registry.” The Administration likely won’t be so foolish as to call it that, and any such technology may not stand alone but rather be used, for example, to help prioritize deportation targets.
And although some vendors may claim today that they won’t build it, that doesn’t settle the question. The fears of a particular moment, the wiggle room for plausible deniability, and the government’s ability to do this itself or with other, less scrupulous, vendors all mean that a hard-to-discover violation of fundamental rights is all too possible.
[1] United States v. Brown, 352 F.3d 654, 668 (2d Cir. 2003) (Calabresi, J.) (holding that the exercise of a peremptory strike due to a venire member’s religious affiliation would violate Batson v. Kentucky, 476 U.S. 79 (1986), because “religious classifications . . . trigger strict scrutiny”).
[2] See Hassan v. City of N.Y., 804 F.3d 277, 300–01 (3d Cir. 2015) (surveying these cases).
[3] Id. at 29–34.
[4] Id. at 305–08.