In the wake of the Silicon Valley Bank collapse, another shutdown went unnoticed: that of SpankPay, one of the few payment platforms catering to the adult industry
Last week, countless media outlets dissected the highly publicized collapse of Silicon Valley Bank, which left tech startups and VC funds scrambling to access their money. Falling by the wayside, however, was the shutdown of SpankPay, the crypto-based payment platform built by SpankChain—one of the few payment processors specifically geared toward serving the adult industry. Founded in 2017, the company intended to provide a “safe haven” for sex workers, a community that often faces discrimination from mainstream banks. But now, it’s subject to the same type of targeted shutdown it hoped to help sex workers avoid.
“Operating SpankPay in a hostile banking environment has always been challenging, but the escalating attacks have become untenable for our small team and the niche market we serve,” the company tweeted upon learning that Wyre, the upstream payment processor they’d been using to process non-crypto payments, had terminated their account—allegedly because their own payment processor, Checkout.com, doesn’t work with adult businesses. “After a long and difficult consideration, we have decided to close down SpankPay.”
For nearly a decade, payment processors and credit card companies have systematically closed the accounts of people whose work happens to involve sex, even if the kind of sex work they do is legal. They’ve also coerced websites into banning sexual content, threatening to withdraw their services based on “high-risk” assessments—a pattern of systemic discrimination that was underway long before Visa and Mastercard cut ties with Pornhub in 2020, in the face of massive pressure from conservative anti-sex groups like Exodus Cry and the National Center on Sexual Exploitation.
According to a 2021 survey, nearly 50 percent of sex workers report that they’ve had a negative experience with national banks or online payment processors as a result of their profession. “I’ve lost several accounts—from Wells Fargo, from Capital One, from a credit union,” sex worker, author, and activist Liara Roux tells Document, explaining that, at this point, SpankPay’s collapse isn’t surprising—just disappointing. After all, “wasn’t their whole business model, like, not having this happen to them?”
“The discrimination is not about actual payment processing activity, it’s about individuals. The shutdown happens because the algorithm has determined that they are involved in some sort of sex work.”
That major companies struggle to secure financial services simply because they support sex workers speaks to the myriad challenges facing individuals in the industry—many of whom are forced to find smaller banks, which often lack the technical support of mainstream options. “Companies are always like, Oh, we’re gonna swoop in and solve this thing, because they think they’re gonna be able to make a bunch of money off sex workers. And we’re like, This is an extremely hard issue to solve,” Roux says. “If you wanna use modern banking technology and be a sex worker, it’s like, Good luck.”
In one of the more highly publicized instances of banking discrimination, OnlyFans abruptly announced in 2021 that it would ban adult content. Why? “The short answer is banks,” founder and chief executive Tim Stokely stated, claiming the company had no choice but to cut ties with the thousands of sex workers on whom it had built its business. In the face of massive backlash, OnlyFans reversed the ban, instead announcing compliance with a range of restrictions on the sexual content it hosts—prohibiting everything from depictions of natural bodily processes like menstruation to content showing people whose hands and feet are bound during sexual penetration. OnlyFans also rolled out a policy requiring content creators to upload proof of age and identification in order to use the site—which, by tying their legal identity to their sex work, only serves to earmark them as part of the same group these banks discriminate against.
This attack on the ability of sex workers to do business online poses a particular challenge for those who are chronically ill or disabled. For instance, Danielle Blunt—a dominatrix and researcher with Hacking//Hustling—transitioned to remote work during the pandemic, finding that it was the best way to continue her practice without risking exposure. This allowed her to keep making money without endangering her health—at least until she lost access to her accounts. Some sites she relied on for income stopped processing payments altogether—and, as a disabled person, resolving these halted payments and seeking new platforms took energy and time she didn’t have to spare.
“Financial discrimination and online censorship make online work less stable options for many people who rely on it,” says Blunt, likening the real-life impact of these policies to that of FOSTA-SESTA, the Senate and House bills that—though framed as an effort to stop sexual exploitation—actually made it harder to catch and prosecute child traffickers, and endangered countless sex workers by restricting their ability to vet clients online. To make matters worse, “the removal of resources disproportionately impacts those already at the most risk of violence—including survivors and people already working in exploitative labor conditions,” says Blunt.
“The insidious part of this is that it’s part of a legacy of financial discrimination that’s been going on since before the internet. It’s the modern equivalent of redlining.”
FOSTA-SESTA chips away at Section 230, a law that prevents websites from being held accountable for the content users post—meaning that, in instances where sexually exploitative content slips through the cracks, platforms themselves can be held criminally liable. Because it’s near-impossible for platforms to moderate content at the scale and speed at which it’s posted, many have chosen to ban sexual content entirely, resulting in the mass deplatforming of sex workers from online spaces. “If legislators were actually concerned with increasing the safety of vulnerable communities, you would see them investing in universal healthcare, housing, and access to food,” says Blunt. “Instead, they’re only making it more difficult to make money or build community.”
According to Aaron Mackey, a senior attorney working with the Electronic Frontier Foundation, it shouldn’t be the role of banks to determine what people are allowed to say online—but that’s exactly what’s happening, especially in light of increasing risk that they might be held accountable for illegal content. “Banks are not in the content moderation business, and they’re not in the content distribution business—they’re in the business of processing payments. So what they should do is adopt a core value that, to the broadest extent possible, they’re not going to cut off service from customers just because they disagree with the content of the media that they create,” he says. In the meantime, however, Mackey says that these companies are allowed to decide who they want to do business with, and under what terms.
This means there are few options for sex workers whose accounts are shut down—even if they weren’t used for matters related to adult content in the first place. Part of the issue is a lack of clarity: “You can’t call Venmo and be like, What exactly did I do to get banned?” says Gabriella Garcia—co-founder and director of Decoding Stigma, an organization dedicated to prioritizing sexual ethics in technology. She also notes that people have been banned from other online services like Airbnb, simply because an algorithm determined that they might be a risky customer. “At this point, biases are deeply embedded in the technology we use. So even as we progress socially, racist, sexist, and homophobic discrimination is part of the algorithmic matrix that’s used to determine risk and police peoples’ rights to financial services,” Garcia says. “The discrimination is not about actual payment processing activity, it’s about individuals. The shutdown happens because the algorithm has determined that they are involved in some sort of sex work, whether it’s digital or in-person, and whether it’s legal or quasi-legal.”
Garcia describes this decision-making mechanism as “completely blackboxed,” noting that she and her peers attempted to contact several financial institutions on the subject when researching a CUNY Law Review article on the topic, to which Blunt also contributed. “The insidious part of this is that it’s part of a legacy of financial discrimination that’s been going on since before the internet. It’s the modern equivalent of redlining,” she continues, referencing the way that, in categorizing sex workers as “risky” people to do business with, these companies only increase the stigma associated with their professions and push sex workers further underground.
“We are continuing to see moral panics being stoked to pass legislation to ‘save the children’ and ‘stop sex trafficking,’ without meaningful analysis of how these bills actually do nothing to protect vulnerable communities. In many instances, they actually make communities more vulnerable by taking away resources, limiting our access to community, and erasing our stories from the internet,” Blunt says. “When people are denied access to resources, it increases their exposure to harm. It’s really simple: Financial discrimination is an act of violence.”