
The conversion of human beings into tradable commodities did not begin with algorithms. It did not begin with credit scores, engagement metrics, or creator tokens. It began the moment one person claimed ownership over another and assigned them a price.
For centuries, humans have been bought and sold in literal markets. The transatlantic slave trade operated pricing systems as sophisticated as any modern exchange—age, health, gender, skill, projected labor output all factored into valuations. Insurance companies wrote policies on enslaved people as property assets. Auctions treated human beings as inventory subject to supply and demand. This was not metaphor. This was accounting.
Indentured servitude turned years of human life into contractual debt instruments. Colonial census records converted entire populations into administrative units for taxation and resource extraction. By the 19th century, actuarial tables reduced lives to mortality probabilities so insurance companies could calculate risk and price policies accordingly. People became statistical instruments before they ever became data points.
When Frederick Taylor introduced "scientific management" in the early 1900s, he merely formalized what had always been implicit: that human labor could be studied, timed, broken into component motions, and optimized like any other industrial process. Workers were not people with interior lives—they were efficiency variables to be measured and improved. The stopwatch replaced the auction block, but the logic remained identical: extract maximum value from the human unit.
Credit scoring systems, launched in their modern form in 1989, converted complex lives into three-digit numbers that determined access to housing, employment, and mobility. These scores claimed neutrality while encoding centuries of structural inequality—research shows that U.S. counties with higher enslaved populations in 1860 still have significantly elevated subprime credit rates today, even controlling for current economic factors. The algorithm doesn't escape history. It inherits it.
Social media platforms accelerated the pace but not the principle.

BitClout scraped thousands of profiles and created tradable "creator coins" in people's names—you could buy and sell shares in human beings based on speculation about their future reputation. The SEC eventually charged the founder with fraud, but the damage was done: tokenization proved it could be imposed without consent.
Now we have AI-driven replacement of human workers—not because the AI performs better, but because it costs less and doesn't require health insurance, breaks, dignity, or the inconvenient complexity of having needs.
Corporations frame this as "efficiency" and "innovation" while investment firms that prioritize shareholder returns above all else push for adoption regardless of whether it actually works. The point is not whether AI journalism, customer service, or content creation is good. The point is whether it's cheaper. And if it's cheaper, humans become the line item to cut.
This is not a new logic.
This is the same logic that has always governed systems built on extraction: if value can be pulled from something, it will be. If a human can be replaced by something that costs less to maintain, they will be. If personhood gets in the way of profit, personhood will be redefined as a problem to solve.
HumanStock is an art installation that makes this machinery visible by building it.
Four interconnected pieces—You, Inc., Recall Notice, Limited Lifetime Warranty, and Terms & Conditions of the Self—each use the aesthetic language of corporate systems to demonstrate how identity has become something managed, evaluated, optimized, and occasionally discarded like any other product.
This is not speculative fiction about a dystopian future. This is documentary work about a condition that has existed for centuries, now operating at machine speed with algorithmic precision.
The only thing that has changed is how explicit the interface has become.
If the history of commodification is the backdrop, You, Inc. is where the machinery steps into the light. It presents a single human being—one artist, one biography, one finite life—as if it were a publicly traded security whose value updates in real time.
The interface borrows the visual grammar of financial markets on purpose. It looks like a financial terminal because finance has become the dominant language for talking about value. The colors, the layout, the dense clusters of metrics: all borrowed from screens designed to track commodities, equities, volatility. Only here, the underlying asset is not a company or a fund. It is a person.
The data is not fictional. It's mine. That matters. The system ingests a real CV, actual press mentions, search trends, social sentiment.
The piece surfaces key events from my life as if they were regulatory filings and market-moving disclosures: founding a company becomes a material event; publishing a book looks like a strategic announcement; changing direction appears as heightened risk. I'm not gesturing vaguely at how this might feel; I'm showing how I'm actually being processed.
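To make that processing concrete, here is a minimal TypeScript sketch of the kind of translation the interface performs. The types, labels, and risk weights are invented for illustration; they are not the actual code behind humanstock.art.

```typescript
// Illustrative sketch only: how a biographical event might be recast
// as a market-style disclosure. Invented types, not the piece's code.

type LifeEvent = {
  date: string; // ISO date
  kind: "founding" | "publication" | "pivot";
  description: string;
};

type Disclosure = {
  date: string;
  label: string;     // filing-style headline shown on the ticker
  riskDelta: number; // positive values read as "heightened risk"
};

function toDisclosure(event: LifeEvent): Disclosure {
  switch (event.kind) {
    case "founding":
      return { date: event.date, label: `MATERIAL EVENT: ${event.description}`, riskDelta: 0.2 };
    case "publication":
      return { date: event.date, label: `STRATEGIC ANNOUNCEMENT: ${event.description}`, riskDelta: -0.1 };
    case "pivot":
      // The most human move, changing direction, scores as the riskiest.
      return { date: event.date, label: `RISK FACTOR UPDATE: ${event.description}`, riskDelta: 0.4 };
  }
}

console.log(toDisclosure({ date: "2021-03-01", kind: "pivot", description: "Changed direction" }));
```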
That exposure is part of the work. It refuses the safety of abstraction and stock photography. A fake persona would let viewers hold the piece at arm’s length—“this is about them.” A real person forces a different recognition: this is already about us.
The system’s treatment of namesakes makes this clearer. Other people who share my name begin to bleed into the interface: a coach here, a student there, a professional in another field. Their activities briefly register as signals—possible press, possible risk, possible “sentiment drift.” The machine tries to decide who belongs to the ticker and who doesn’t. Identity collapses into probability.
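One way to picture that collapse is a toy disambiguation score, sketched below in TypeScript. The weights and signals are invented; real entity resolution uses far richer features, but the output has the same character: a probability, never a certainty.

```typescript
// Illustrative sketch only: deciding whether a mention belongs to the
// tracked subject. Invented weights, not the piece's actual code.

type Mention = { text: string; sourceDomain: string };

function belongsToTicker(m: Mention, knownTopics: string[]): number {
  let score = 0.5; // prior: could be anyone with the name
  const text = m.text.toLowerCase();
  for (const topic of knownTopics) {
    if (text.includes(topic)) score += 0.15; // topical overlap raises the odds
  }
  if (m.sourceDomain.includes("sports")) score -= 0.25; // probably the coach
  return Math.min(Math.max(score, 0), 1); // clamp: identity as probability
}

const topics = ["blockchain", "installation", "web3"];
console.log(belongsToTicker({ text: "Aaron Vick wins coaching award", sourceDomain: "sports.example.com" }, topics));      // ~0.25
console.log(belongsToTicker({ text: "Aaron Vick launches blockchain installation", sourceDomain: "art.example.com" }, topics)); // ~0.80
```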
To be “visible” in such a world is to live with a permanent, invisible investor-relations function humming in the background. Every choice becomes legible as potential signal: will this raise or lower the line? The piece doesn’t need to tell you to feel that pressure. Most people already do. You, Inc. just gives the pressure a dashboard.

If You, Inc. is about being listed, Recall Notice is about being deemed unfit for continued circulation.
The piece borrows the form of a product recall—official, bureaucratic, and oddly clinical in its concern for “user safety.” But instead of defective batteries or contaminated food, the recall applies to episodes, roles, and eras from a human life.
Childhood, career moves, relationships, entire phases of becoming are reframed as “units” with potential defects.
The language tracks how institutions already talk: incident reports, known issues, remediation steps. Only here the “failure modes” are not technical. They are emotional and existential—burnout, ambivalence, grief, contradiction, change.
In this frame, the most human parts of a life are recoded as hazards. Changing direction mid-career becomes instability. Grieving a loss becomes a prolonged outage. Wanting something that can’t be measured becomes nonconformance.
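A minimal sketch of that recoding (invented fields, not the actual Recall Notice code) shows how little translation it takes:

```typescript
// Illustrative sketch only: life phases rendered in recall fields.

type RecallRecord = {
  unit: string;       // the "affected product": a phase of a life
  knownIssue: string; // the human reality, restated as a defect
  hazardClass: "instability" | "prolonged outage" | "nonconformance";
  remediation: string;
};

const records: RecallRecord[] = [
  {
    unit: "Career, 2019-2021",
    knownIssue: "Subject changed direction mid-career",
    hazardClass: "instability",
    remediation: "Re-certify against original trajectory",
  },
  {
    unit: "Personal life, 2022",
    knownIssue: "Subject grieved a loss",
    hazardClass: "prolonged outage",
    remediation: "Restore expected output levels",
  },
];

// Printed out, the records read like a safety bulletin about a person.
for (const r of records) {
  console.log(`RECALL ${r.unit}: ${r.hazardClass.toUpperCase()} / ${r.remediation}`);
}
```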
The recall format makes visible how quickly support systems shift into risk language when confronted with anything they cannot standardize.
This is where the historical thread of commodification shows up as quality control. Once a person is treated as a product, inspection becomes continuous. Are you still performing as expected? Are you still safe for others to rely on, invest in, or extract from? Are there signs of “deviation”? The corrective instrument is rarely a conversation; it is more often a policy, a form, a decision about continued eligibility.
Recall Notice doesn’t resolve this. It holds up the absurdity of what happens when you apply corporate remediation logic to lived experience—and then asks you to notice how often that already happens.

Where Recall Notice reveals how systems handle “defects,” Limited Lifetime Warranty looks at how they define care in the first place.
The format is instantly recognizable: a warranty document that promises protection, within limits. The headline language offers reassurance—coverage, support, repair. The structure is built to calm. It suggests that someone has your back.
The exclusions tell a different story. All the conditions most likely to require care—aging, fatigue, uncertainty, grief, divergence from the planned script—slide into the categories that are “not covered.” The document grants support in precisely the zones where it is rarely needed and withdraws it at the edges of real life.
This is less exaggeration than translation. Institutions routinely express compassion in their public language while their underlying agreements are written to minimize liability and cost.
Health benefits that don’t cover the therapy someone actually needs.
Safety nets that only work for those who fall in precisely the right way.
Support programs that evaporate the moment someone no longer fits clean criteria.
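The underlying pattern can be sketched in a few lines of TypeScript (invented logic, not the warranty document itself): a coverage check that approves claims only where care is least needed.

```typescript
// Illustrative sketch only: coverage logic that excludes exactly the
// conditions most likely to require care.

const EXCLUSIONS = new Set(["aging", "fatigue", "uncertainty", "grief", "divergence"]);

type Claim = { condition: string; severity: number };

function isCovered(claim: Claim): boolean {
  // Low-severity, on-script conditions pass; real life does not.
  return !EXCLUSIONS.has(claim.condition) && claim.severity < 3;
}

console.log(isCovered({ condition: "cosmetic wear", severity: 1 })); // true
console.log(isCovered({ condition: "grief", severity: 8 }));         // false
```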
The phrase “limited lifetime” captures the contradiction. The warranty invokes the entire span of a life while reserving the right to define which parts of that life count. Protection becomes something like a marketing promise—true in some technical sense, but structurally incapable of meeting the situations where it’s most needed.
The piece doesn’t claim that care never exists. People take care of each other all the time, often against the grain of the systems around them.
What it does show is that when care is routed primarily through corporate or institutional frameworks, it is often built on the same logic as any other contract: preserve optionality, cap exposure, avoid obligations that are hard to quantify.
In that light, the warranty doesn’t just describe a policy. It describes a culture in which even empathy comes with terms.

The last piece shifts from support to consent. If You, Inc. shows how you are seen, and Recall Notice and Limited Lifetime Warranty show how you are managed, then Terms & Conditions of the Self shows how you are bound.
The format is the archetype of contemporary non-choice: the wall of text you scroll past to access anything from an app to a bank account. No one reads it. Everyone accepts it. The apparatus knows this and is designed accordingly.
Here, the agreement applies to identity itself. The clauses don’t just govern use of a service. They govern how you are expected to exist.
You agree, by participating, to continuous measurement.
You agree that your behavior can be turned into metrics, shared, sold, and used to train future systems.
You agree to maintain a legible narrative, to be coherent enough for algorithms to model, to present a version of yourself that “makes sense” in data form.
You agree that changing your mind will be treated as inconsistency, that going quiet will be read as signal, that refusing to participate will be interpreted as deviance more than as choice.
None of this is presented as coercion. It is presented as access. If you want to work, bank, communicate, move, you accept. Decline is technically possible and practically unavailable.
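The structure of that non-choice fits in a few lines (a TypeScript sketch, not the piece's code): a gate where declining simply re-presents the terms.

```typescript
// Illustrative sketch only: a consent gate whose only exit is agreement.

type Choice = "agree" | "decline";

function consentGate(choose: () => Choice): string {
  while (true) {
    if (choose() === "agree") return "access granted";
    // Declining changes nothing: the same terms come back, unchanged.
  }
}

// A simulated user declines twice, then gives in, because rent is due.
const responses: Choice[] = ["decline", "decline", "agree"];
console.log(consentGate(() => responses.shift() ?? "agree"));
```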
This is where AI threads in quietly. As workplaces, platforms, and services increasingly delegate decisions to models—who to hire, who to serve, who to flag, who to ignore—the cost of being incomprehensible to those models rises. Identities that don’t flatten cleanly into training data become friction.
Friction, in the logic of optimization, is a problem to remove.
Terms & Conditions of the Self doesn’t try to litigate that. It does something more uncomfortable: it makes the non-negotiability of the current arrangement explicit. You are already inside the contract. The “I Agree” button is less a choice than a confession of dependence.

Taken together, the four pieces describe a single architecture viewed from different corporate angles.
You, Inc. shows what happens when a life is listed: it acquires a price, a chart, and an implied obligation to justify itself through performance.
Recall Notice shows what happens when that life fails to comply with expectations: its deviations are framed as defects requiring remediation or removal.
Limited Lifetime Warranty shows what happens when that life asks for care: support exists in theory but is structured to avoid the very forms of vulnerability that make us human.
Terms & Conditions of the Self shows what happens before any of this begins: the quiet assumption that participation equals consent, that using the system means accepting its terms—even when there is no viable outside.
Threaded through all of it is the question of what cannot be captured.
The unrecorded conversation.
The decision that makes no sense on a spreadsheet and absolute sense to the people involved.
The act of care that generates no content, produces no proof, appears in no feed.
These are not romantic exceptions. They are reminders that meaning and value do not collapse entirely into metrics, no matter how aggressively systems behave as if they do.
HumanStock doesn’t offer an escape route. It doesn’t promise that if we just change our apps or build better platforms, the deeper logic will reverse itself. The history that opens the essay makes that clear: this logic predates our current tools. It has moved from slave markets to credit markets to creator markets without losing its core shape.
What the installation does offer is recognition. Not in the abstract, but at the level of interface: this is what it looks like to have a life treated as a product, a risk, a warranty claim, a set of contractual obligations. This is how it feels when the culture around you starts to talk as if the only way to justify your existence is to show your chart.
HumanStock
Aaron Vick, 2025-2026
Four-part web-based installation
You, Inc.: The Human Ticker
Real-time financial interface tracking biographical data as market instrument
humanstock.art
Recall Notice
Interactive product recall system for lived experience
recallnotice.humanstock.art
Limited Lifetime Warranty
Warranty document for human life with standard exclusions
warranty.humanstock.art
Terms & Conditions of the Self
End-user license agreement for identity
toc.humanstock.art
Medium: HTML, CSS, JavaScript, archival documents
Aaron Vick is a founder, technical builder, and conceptual artist working at the intersection of Web3 infrastructure, autonomous systems, and institutional critique. His work examines how emerging technologies encode—and accelerate—historical patterns of commodification, surveillance, and control.
Living Arcade (2025), his ongoing installation on the Base blockchain, explores permanence and autonomy through three fully onchain pieces: Vibe Pools (self-trading liquidity systems that turn arbitrage bots into unwitting performers), LEXI (a conversational agent implemented entirely in smart contract code), and Paint Sudoku (gameplay as permanent monument). Each piece is not displayed via the blockchain but made of it: code as material, execution as exhibition.
HumanStock inverts this logic: where Living Arcade gives machines immortality, HumanStock shows humans processed as products. Both build functional systems that reveal the violence embedded in "neutral" interfaces.