Editor-in-chief Su Ertekin-Taner sat down with poet, editor, and economist Zoë Hitzig. Hitzig is the author of Mezzanine and the Changes Book Prize-winning Not Us Now. Her poems appear in The Paris Review, The New Yorker, Harper’s, Granta, and The Drift, where she serves as poetry editor. She received a PhD in economics from Harvard in 2023.
This interview has been edited for brevity and clarity.
1. A Kind of Productive Dislocation
Su Ertekin-Taner: I’m wondering if you could give me an overview of the buckets of things you do. I’m thinking about your role as The Drift’s poetry editor, your economics research, and your independent authorship.
Zoë Hitzig: Hmm, do we need to get it all out? I prefer to pretend that at every moment I am just one person.
SET: That’s fair.
ZH: I do a lot of things, but I think in some sense they are just different modes of working through the same set of questions. I’ve refused to choose between these different modes. I wouldn’t necessarily recommend that to others, but I’m having a lot of fun!
SET: Let’s talk one bucket at a time then, starting with your poetry. I’m really interested in the synergy you find between sexuality and technology in your poetry. I feel like that nexus has often been handled badly, prioritizing fembot-esque narratives. How do you see these two themes interacting without becoming distorted or tending toward oversexualization?
ZH: Well, technology offers us a profound way of thinking about otherness. There’s a long and rich conversation—especially within queer theory and certain strands of consciousness studies—that uses technology as a metaphor, a method, even a mode of being, to interrogate the boundaries of self and other. Sexuality is often framed through the question of what is “natural.” How has the idea of the natural been constructed, policed, and destabilized—what gets framed as normative, what’s seen as deviant, and how do those definitions shift?
Personally, I find that technology introduces a kind of productive dislocation. It lets us step outside the body’s usual constraints—its anatomical, hormonal, or even material givenness—and enter a space that feels speculative, generative, and estranged in just the right way. When we think through machines, code, networks—through the inhuman—we’re sometimes able to articulate desires, identities, or modes of embodiment that wouldn’t otherwise find language. It’s less about abandoning the body and more about imagining other ways of inhabiting or relating to it. Does that make sense?
SET: It absolutely does. I read your recent Drift article about surveillance, location sharing, and our growing attraction to both. The body as a node of data, and especially as a widely accessible node of data, can be as scary as it is generative.
ZH: Yeah, totally—something I come back to a lot is a kind of axis between two ways of rendering the body. On one side, there’s the idea of the body as an organic, chemical system—hormones, molecules, all of that. And then on the other, there’s a way of thinking about a life as data—like, an accumulation of bits, behavior patterns, metadata, scattered across servers we don’t really see or understand.
Both of those frameworks kind of scare me, honestly. They’re both ways of turning the body, or a life, into a system—something measurable, knowable, maybe even controllable.
And I think that’s where poetry, or literature, or art in general, can push back. There’s something deeply resistant in those forms—something that refuses that kind of reduction. A poem doesn’t resolve cleanly into information. It holds ambiguity, contradiction, presence. A body or a self is always more than just its chemical makeup or its data trail.
2. “We’re On the Mezzanine”
SET: I want to talk a bit about the way that you render data in your poetry through punctuation: dashes, parentheses, even commas. How do you see poetic and algorithmic language operating on one another in your poetry? Do they clarify one another? Parallel? Something else entirely?
ZH: Yeah, totally. I think a lot of my poems are trying to draw attention to the fact that nothing can really be stored or recorded in a purely objective way—every record of experience is mediated somehow. And a lot of Not Us Now is thinking through that: who’s doing the mediating, and what does it mean to live in a world where our inner lives are increasingly filtered through systems—technological, linguistic, cultural—that are mostly outside our control?
Probably the weirdest poem in that book is the long one at the end, “Exit Museum.” It feels relevant here because, in some ways, it was an attempt to document experience—a kind of stream-of-consciousness tracking of thoughts across a pretty unremarkable day. There’s nothing especially dramatic happening plot-wise. But what felt meaningful to me was the form: how that experience gets rendered, or broken down. The words in that poem are split into syllables, and then the syllables fracture in ways that are kind of awkward—hard to read out loud, hard to process. I was trying to mimic that feeling of distortion, of being archived or encoded in ways that feel alien or uncomfortable. It’s like asking: what happens to consciousness when it’s stored? What gets lost? What gets misrepresented or corrupted?
SET: Even the breaking of the syllables feels like facets of ourselves made infinitesimally smaller, reduced to data points.
ZH: Yeah, exactly. I actually had a friend—a fiction writer—read that poem and be like, “This is boring. There’s no plot.” And I was like, of course there’s no plot! It’s not prose. It’s experience. It’s meant to be slow, maybe even frustrating at times. The form asks you to engage differently, to feel those strange breaks, and to sit with what it means to be held inside a storage system that doesn’t fully recognize you. Or maybe does—but only in fragments.
SET: Mmm. I’m also wondering about the title of Mezzanine. Objects and people are positioned at different places on the “mezzanine” throughout the book: looking down, upward, or horizontally. From what position are you viewing the world and its technological developments now? On the mezzanine looking down? Ascending to a mezzanine?
ZH: That’s a great question, and I will say that some poems in Mezzanine were written when I was an undergraduate. So, there’s some continuity.
SET: Can I know which ones?
ZH: A bunch from the opening sequence—the dramatic monologues—were written back then. And a lot of the book came together while I was studying history and philosophy of science at Cambridge, which feels like a lifetime ago now. 2016, 2017. I have some distance from those poems at this point.
That said, I still feel like we’re on the mezzanine. It’s a kind of liminal platform, right? You’re not on the ground floor, you’re not at the top—you’re suspended, looking out. And that’s how the present moment feels to me. We’re standing at this strange in-between, trying to decide how much of ourselves—our society, our bodies, our relationships—we want to integrate with technology.
When it comes to society, in many ways we’re already deep in it. Social media, for instance, doesn’t just reflect our desires—it shapes them. It tells us what to want, how to behave, what to buy. But the next phase, I think, is about something even more intimate: how we coexist with technologies that simulate thought, or emotion, or choice. Whether that’s machine learning systems being called “AI,” or developments in bioengineering—these aren’t abstract questions anymore. They’re personal, ethical, existential.
That’s the mezzanine to me. It’s the moment before the drop—or before the climb. A threshold where we still have some agency, but the choices we make now are going to determine how much agency we have in the future. What’s unsettling is that many of these technologies originated from the desire to expand human potential, right? To let us do what wasn’t possible before. But now they’re also enabling a kind of passive automation, or even control.
Like, something that looks like freedom—choosing an embryo, for instance—can so easily flip into coercion, or conformity, or a narrowing of possibility. That’s the contradiction I keep circling. The mezzanine is where all of that is still unresolved.
SET: Speaking of integrating ourselves with technology, this interview is going to be a data point in the history of Zoë Hitzig. How do you feel about that?
ZH: I know. This interview will be part of my permanent record!
But really, I have a kind of mystical argument for why we should be more careful about what we release into the world. I think we all need some protected zone of oblivion—something that’s just ours, something unrecorded, unindexed, unsearchable. It’s what allows for transformation, for becoming someone new, someone untraceable to who you used to be. Without that space, your past calcifies around you. You lose your right to reinvent yourself.
That’s the mystical version. But there’s also a very practical argument: you have no idea who will be in power in ten years, or what kind of access they’ll have to the information you’re leaving behind now. I’ve had so many conversations where people say, “Sure, Alexa’s recording everything, but who’s going to bother listening?” And that just seems naïve at this point.
Because already, we’re seeing that assumption—about the cost or effort it takes to synthesize huge amounts of information—being proven wrong. These new AI tools might not be great at everything, but one thing they’re remarkably good at is distilling, summarizing, pattern-finding. They can scan hours of conversation and give you a neatly packaged narrative, which might not always be accurate, but can still be used against you—especially where I live now, in a political system where democratic norms are thinning out and legal protections aren’t keeping pace with the technologies reshaping our lives.
3. Art is Where We Stay Messy
SET: Do you think creating a mythology or chronicling the self online should be more or less important in our algorithmic society?
ZH: It’s a cool question—and one I’m honestly deeply conflicted about. Because on the one hand, mythology—the self chronicled, performed, archived—is such a human instinct. But in an algorithmic society, it becomes… complicated.
If you really sit with the implications of what we’ve been talking about—how data is stored, how it moves, how it can be repurposed without your consent—it starts to feel almost paralyzing. There’s no guarantee that anything you say today will stay in the context in which you said it. And without context, meaning destabilizes. If you don’t know who you’re speaking to, or how what you said might resurface, or what system might eventually interpret it—you start to lose the ability to choose your words at all.
One thing I’ve written about is how the rise of human-like machine learning systems adds another strange layer to all this. Not only are your interactions being recorded—archived, processed, stripped of nuance—but soon, you won’t even be sure who—or what—you’re talking to.
I don’t think that always matters. There are plenty of exchanges where it makes no difference whether the other person is a bot or a human. But in more intimate or consequential contexts, the uncertainty starts to erode trust. Like, imagine hopping on a Zoom call and not knowing whether the person on the other end is real—or just convincingly real enough. Like, how do I know you are who you say you are, and not some kind of… I don’t know… foreign adversary?
That’s where we’re headed. And again, maybe it doesn’t always matter. But sometimes it will.
SET: Don’t you think it matters regardless, even outside of the question of foreign adversaries?
ZH: Yeah. I mean, the argument I was starting to make is that if we really follow through on some of the trends we’re seeing now—if we reach a point where you can’t trust that your information will stay in the context in which it was shared, or that people will interpret it as it was meant—then something breaks down. And it’s not just about privacy or secrecy anymore—it’s about communication itself.
Right now, I can feel reasonably confident that you’re a real person. You’re on a Columbia Zoom, your face moves naturally, you seem like a thoughtful undergrad. Great. I feel like I’m talking to the person you say you are. But in a year? Those signals might not be trustworthy. They might be completely forgeable. So what happens when the basic cues we use to establish reality—identity, tone, intention, infrastructure—become unstable or meaningless?
And then you’re stuck in this double bind: you can’t trust your information will be safe, and you also can’t trust who—or what—you’re sharing it with. So how do you choose what to say at all? How do you avoid spiraling into total silence, or babble, or paranoia?
I mean, sorry—I can get really nerdy about this. I actually wrote a paper that lays out some of this logic, which is why I sound like I’m mid-lecture.
SET: Please go on!
ZH: Yeah, part of what’s happening is that the bar for establishing reality keeps rising. Every interaction starts to come with this extra layer of doubt, of verification. Is this person real? Is this event real? Do I seem real to them? There’s a mutual skepticism that builds—like, a dyadic paranoia.
We already see the early version of this with bots on social media. You don’t know if an account is real, or if the photo is real, or if the statement was generated by a human or a machine. And so you stop taking things at face value. Which is one thing—but it goes further. Because once you assume nothing can be taken at face value, then nobody takes you at face value either. And that feedback loop, a kind of equilibrium effect, makes meaningful communication harder. You start to wonder: why am I even speaking into this space? Who’s listening, and what can they actually hear?
SET: That’s interesting. It’s dyadic in the sense that you’re considering whether the person on the other side is real and whether the event is real, but also whether you yourself are being interpreted as real.
Earlier, you brought up the question of choice. I want to get at the relation between choice-making in an algorithmic context and choice-making in poetry. When institutions choose one algorithm over another, they make a normative judgment about which algorithm is most valuable, most ethical, or best by some other standard. I see poetry as a sort of precise choice-making as well. Do you feel like there’s a link or seam there?
ZH: I mean, you write, right? So you know that once you’re deep into a piece—especially something creative—a lot of the decisions you make start to feel kind of arbitrary. And there’s that thing everyone says in writing workshops: the piece is never finished, only abandoned.
So if you take that seriously, there’s a kind of built-in arbitrariness to any final form. The poem could’ve ended differently. The novel could’ve taken another turn. And we’ll never know whether that other version would’ve been better—or what “better” would even mean. Some people might see early drafts and think one version was stronger, but you can never explore the full space of possibilities. You can’t test every version of the piece. And even if you could, by what metric would you decide? Aesthetic pleasure? Originality? Emotional impact? It all starts to unravel.
Which, now that I say it out loud, I think is kind of interesting—I hadn’t really thought about it in those terms before. And it seems relevant in the context of the (slightly overplayed right now) conversations around machine-generated art. Like, can a machine produce something with intention? Or what do we even mean by intention in art? One way to think about it is to imagine this latent space of every poem that could ever be written—by a person, by a machine, by some hybrid of the two. And when I publish a poem, or a book, what I’m really doing is pulling one version from that vast field and declaring: this is the one.
Which is a strange kind of authority. It makes authorship feel almost dictatorial—like, I get to choose what gets seen, but there’s this infinite swarm of other possible poems that never make it into view. Some written by other people, sure, but also all the things that could be generated algorithmically, recombined, remixed.
So then the question becomes: what do we elevate? Why this one? Why now? What makes something recognizably human—or does that even matter anymore?
SET: Last question: you said that we’re on a mezzanine now. I’m wondering what you imagine we’re looking at when we look up and what we’re looking at when we look down.
ZH: Let me think about it for a minute. I’m enjoying reflecting; you’ve got me turning over a lot of different things. So, okay—one way I’d answer that is: when I think about how we might eventually step off the mezzanine, and do it in a good way…
SET: Parachute off?
ZH: Exactly. A soft landing, maybe even a joyful one.
I guess when I try to imagine how we get there, I think a lot about how we’ve shifted our sense of what technology is for.
It used to be about expanding what was possible. You know, opening doors, creating new ways of being, seeing, connecting. And that’s still true, to some extent. But more and more, it feels like the tools we’ve built are being used to limit possibility—to make people more predictable, to optimize them into patterns, to strip away ambiguity and unpredictability.
And that’s the paradox: the very systems that promised more freedom are now being used to manage us, to flatten us, to make everything legible and controllable.
So I guess what I hope for isn’t just better tools, but a better imagination about what to do with them. One that doesn’t treat unpredictability as a bug in the system—but as the thing that keeps us human.
SET: That’s a good way to think about writing. If we are very deterministic with it, then we lose all the entropy in between.
ZH: Yeah, absolutely. I think art is where we stay messy – just blurry enough to be real.