by Charlotte Ariel Finn
The first thing you did was throw up.
The internal pharmaceutical implant did what it could to mix up antacids and anti-anxiety medication out of internal stores and stray hormones in your body. But a miracle maker, it was not. And so you knelt by that ancient cold porcelain toilet that you kept around because tearing out the plumbing would do too much damage to the fixtures and paneling of your home.
After last night's meal and something that looked like blood, after your stomach gave up everything it could no longer keep down, you check your notifications again, to see if there was a mistake. It's as familiar as swallowing by now, given the years you've had the neocortex. You hesitate a little, like a kid who hasn't learned how peek-a-boo works yet; the deeply irrational part of you thinks, maybe if you don't check, it won't be real.
But you check, and it's real. The message from HaveIBeenPwned is real.
One of your neural backups was just uploaded to the Internet.
* * *
After the shock wears off, you put your neocortex in safe mode, leaving yourself outside the touch of wireless neural backup for the first time in eighteen months. You expect to feel fear, but you don't. It'd be like being scared of dehydration while you were drowning. A lack of backups is not your problem right now.
With your brain in safe mode, you go digging through your old banker's box full of electronics for something you can fashion into a crude means of accessing the Internet. An old wireless card, a keyboard from that brief flirtation with DVORAK, that old external holograph you use when you need to diagnose a problem with a display.
You boot up and the first thing you do is change your local wireless password; thankfully, no one who knows it is anywhere close to your house. What you had before was seventeen extremely rare words long, but that doesn't matter; your remembering it means that now they all know about it.
That solved, you then log onto the site for uSync Neocortexes, your hardware's provider. You use their hardware but not their firmware, but you check anyway to see if there's a known issue with the hardware. Tens of thousands of words of legalese stream by your eyeballs; your own fluency with the language isn't good enough to know if they have covered their asses enough.
You have no leads, and so, with some heaviness in your heart, you go to the software forum for Jucci, the programming collective that makes the software in your neocortex. It's a lot more easily modified than uSync's default backups and regulatory behavior; it also has a steeper learning curve.
You wonder for the first time if maybe, despite the degree on your wall, it was too steep.
They are already talking about the hack. There are a lot of theories as to how it could have happened and no one's taking responsibility. There's some idle speculation as to how to invalidate the data, but everyone knows the central problem: trying to take data off the Internet is like trying to take water out of the ocean. And translating the human psyche into data meant it was inevitable that this would happen.
There are people there who proclaim that well, yes, this is a serious breach and all, but be honest, if it had to happen to anyone, it might as well have happened to you, right?
They're banned. Eventually.
You log off and try not to cry.
* * *
You spend your morning changing every single password that you have.
All your old backup phrases are useless; everyone now knows your favorite author, your favorite movie, your first crush. You contemplate going back to carrying around a tabula recta, because yes, carrying a piece of paper with random characters on it is a pain, but is it that much worse than this?
As you rebuild your digital security, you get a message from Cleo, who tells you that she knows where the hack is hosted, and where people are discussing it. With morbid curiosity, you open a parallel session and see for yourself.
Without the actual neocortex and the very specific configuration of your neurons, they can't actually simulate your brain that easily. (Yet.) But they have already begun trawling through the memories they can decode. You read reaction posts full of that corrupted image of a soda can that looks like it's laughing painfully, as people go over all your private shames and failures. That time you wet the bed when you were sleeping over at a friend's. That time you plagiarized a joke. That time you lied when you said your favorite movie was the four-hour cut of In Paradise, Nobody Laughs when it's actually a superhero movie starring an actress you had a crush on.
They laugh at the time someone slipped something into your drink at a party, and they denigrate the hazy memory of Sophie as she gets you out of there before anything can get worse.
The neocortex alerts the pharmaceutical implant, and it does what it can, but you know.
There's no fixing this.
* * *
You no longer have a job when lunch arrives.
Some anonymous citizen sent a thorough list of all the sites you visit when you're in the mood for a certain kind of erotica; your employer reads them off to you and takes your silence as confirmation. Of course, they are very open-minded; of course they know it's all pretend. But it's a messed-up kind of pretend, isn't it, and in contrast to everyone else's pretend, it's now a public pretend instead of a private pretend, and your access to the work servers has been revoked.
At the least, they have no grounds to sue, since you were very diligent about booting into safe mode during work hours and not thinking about work outside of work hours. Still: better safe than sorry. They are going to be spending their afternoon revamping their security as well, which, of course, is just what they don't need. Why did you have to do that, they ask. Why did you have to get one of those things.
In the middle of this conversation, a stranger pings you on the social network you do all that pretending on, and asks if you would be into that very specific kind of pretend.
You're meaner to them than they really deserve.
* * *
That afternoon, more anonymous messages roll in.
There are the ones you expect: the ones that say you deserved it because of all the shit you tried to stir within the community, with your writing and your advocacy and the criticisms you've had for how the community handles things. None of them admit that your memories exonerate you in the public dispute you had with Dr. Feine, of course.
You don't cry. Not yet.
Some of them are messages saying that they knew you were always faking it, sending you a compiled list of every time you doubted the gender identity you arrived at. You remember, freshly, each and every time you did exactly that.
You don't cry. Not yet.
You check the forum and someone is already working on You v1.1—bullshit posturing, of course, hacking a brain image on that scale would take tens of thousands of hours of work. But he seems pretty dedicated to his new hobby. He says that he has friends who are pretty into it too. Says he can't wait for simulation tech to get better so that he can have fun with the copy of you from six months ago. He has a lot of ideas.
He says that version 1.1 of you will smile more.
You don't cry. Not yet.
Someone posts a crude deepfake of a memory you had, where you were naked and looking in the mirror and admiring the progress you were making; they've inserted themselves behind you, wearing a tactical balaclava, their gloved hands reaching for your throat.
You don't cry. Not yet.
As you're finishing up the last of your changed passwords, as you're finally eating something you cooked on the stove because right now you don't trust anything with a wireless connection to the Internet, you get a message from someone telling you that they really enjoyed how you felt at your mother's funeral. How the speech you gave was heartwarming and that they know—they have confirmed—that the emotion you felt was genuine.
And that is what does it.
Then, you finally cry.
* * *
The local mutual aid network's head organizer says that this will all be solved once we've achieved full socialism.
He talks at length about how once everyone's basic needs are met, there will no longer be any incentives to steal someone's brain images, because the secrets within won't be valuable anymore. That once what he, and you, have advocated for becomes the way people live, what you're going through won't happen to anyone else.
As he talks, you idly look him up. He seems sincere, is the thing; he seems to honestly believe this. He wants a better way of life for everyone, and so do you. But he did defend Dr. Feine that one time, though he later changed his mind. And he does have an account on the forum currently tearing your life apart in a very literal sense. You don't have anything concrete. But you wonder.
You wonder, considering how well off the forum in question's userbase is, if seeing to the basic needs of everyone will really solve the problem. Capitalism sure isn't solving it, of course. But that doesn't mean anything else can.
Then you remember that your bills don't care about your feelings and that your implant needs a restock, and so, you swallow your pride and nod along as he goes into the speech he's practiced.
Then you wonder if he'll ever find out what you really think of him. But thankfully, you haven't done a backup since booting into safe mode.
You think about toggling it off, but . . . not yet.
* * *
That evening, Sophie calls you in tears.
It's not your fault, she says. She knows it's not your fault. You didn't tell a soul about it, just like you promised. But you remembered, and your neocortex backed up the memory, and now, everyone knows, so whether or not it's your fault, her life is still in tatters.
You can still remember that day, the day she cried into your arms as she told you something she never told anyone before, or anyone since. About why she refused to testify. You remember holding her, and you remember not agreeing. But you remember saying "I understand," and you remember saying it enough that you began to actually understand.
But as you hold onto that memory, you know that it's not just yours and hers anymore. Now it belongs to anyone with an Internet connection. There will be stories written about you soon. And well, they'll have to talk about some of your memories, won't they. Otherwise they'd just be dancing around what a violation this is. And in talking about them, they'll be making sure that they spread that much further and that much wider.
And meanwhile, Sophie confesses that she no longer feels like she can trust you with anything while that thing is in your head, that she can't share anything in confidence because she no longer has any. It's not you, it's the implant. She swears, it's the implant.
You say you understand, and you cry, as your oldest friendship dissolves into dust and you can't do a thing about it.
* * *
By the time midnight hits, you're looking up transhuman collectives.
A few people out there have anticipated this problem, that neural uploads could never be fully secure and would necessitate a change in social order. So they built a collective where there is an explicit expectation that you cannot trust all secrets to stay secret forever, and they have built their community around that.
You do the research. They seem okay. No major incidents. They are understandably resistant to outsiders; who wouldn't be, given that they've given up on even the pretense of privacy over their own memories and thoughts? The few videos of them make them seem charming and affable, and you wonder if it's just because everyone in their community is pointing a camera at everyone else all day, every day. Who wouldn't learn to be photogenic?
The big catch is, of course, that you couldn't be in contact with anyone outside the collective; everyone in the collective has agreed to be an open book, and without the consent of those outside, it's unethical to form any kind of meaningful relationship with those whose privacy is at risk.
You think about turn-of-the-century literature about the digital divide, about a world divided neatly in two, between the digital haves and the digital have-nots, and you wonder if that's just the nature of the beast.
You don't find an answer. You just wonder.
* * *
You finish this journal, and you're still in safe mode.
While you're in safe mode, all uploads are restricted; in the event of catastrophic failure, you'll be restored from the backup taken the day before you wrote this.
You think about that, and as you write, you think about trauma.
You ask yourself if it's really ethical to subject a future version of yourself to the memories of this day, considering how often throughout the course of the day you've fantasized about taking a drill to the part of your skull that feels a little colder than the rest. You can spare them ever having to remember all of this. Just stay in safe mode forever.
You also ask if it's ethical for that other version of you to wake up one day with no idea of how, or why, their life has so radically changed. If the version of this that they'd go through, all over again, would be any better. You can prevent that from happening. Simply turn off safe mode, even once, just long enough for an upload.
You ask if it's better to forget the trauma, or let it live on forever. And you contemplate the old fear of death, awoken by seeing your mother's body wither away, and you ask yourself if she'd have gotten one of these things. And you'll never be sure, because she's gone forever. Meanwhile, you—once the technology of backup bodies catches up with the state of backup minds—could choose to live forever.
The toggle for safe mode is easy: press the toggle under the skin while reciting the passphrase. Impossible to turn on, or off, by accident. Turn it back on, and you return to being immortal, and that means you remember today for all your infinite days. Leave it off, and let yourself go through this for the first time, again and again forever. Or wipe your backups and make peace with knowing that one day, you'll never experience anything ever again.
You think about Hell, back when you believed in it, and how what scared you the most was how it never ended. And you think about the first time you stopped believing in Hell, and contemplated death—contemplated ceasing to be, forever.
And you still haven't decided which one's worse.
And you stop writing this journal, and prepare to find out.
Charlotte Finn has been self-publishing her webcomic, BRAND ECHO, for the past few years, with artist ING. Her short comics have appeared in the STRANGE ROMANCE anthology series, with art by Ing and Change Brown and letters by Josh Krach. She is a former blogger at ComicsAlliance and her work has been published in spaces such as HeroCollector and Shelfdust. She can be found on her personal website, as well as Twitter. She actually is a skunk lady in real life.