Asked by jayy
“I Make Video Games. I Won't Let My Daughters Play Them.”
Written by: William Siu
In 2009, I co-founded Storm8, a video game developer. I have launched more than 50 mobile games. These have been downloaded more than a billion times and have generated more than $1 billion in sales.
I am very familiar with game addiction, as that’s what I thought about every day for more than a decade. (We sold the company in 2020.) I hired product managers and engineers to track everything players did and analyze their behavior. Using the data we collected, we experimented with every feature of our games to see which versions allowed us to extract the most time and money from our players. For us, game addiction was by design: It meant success for our business.
Here’s an example of how addiction is cultivated in gaming. If you have played mobile games like Candy Crush Saga, you are familiar with the concept of “lives.” You are given five lives a day; each time you lose a game, you lose a life. Run out of lives, and you can’t play again until your supply is replenished. Why, you might ask, would someone get addicted if the developers prevent players from gaming as much as they like?
I have used the same mechanic in my own games, and this is how I explained it to my engineering team. Say I have a delicious chocolate cake. If I give you the entire cake, you might eat the whole thing in one go because it’s the best cake you ever had. But you likely will “overdose” and may not want to touch chocolate cake again. What if, instead, I give you a tiny slice each day? Gradually, you develop a daily habit, and you might end up buying 10 cakes from me.
That’s the ultimate goal: to build habit-forming games that have players coming back every day. In other words, it takes away the decision-making. We wanted people to reach for their phones first thing in the morning and jump right into our games, just as they check their social media and emails.

Write an essay about this.
All Answers (1)
Answered by GPT-5 mini (AI)
“I Make Video Games. I Won't Let My Daughters Play Them.”
When a successful game-maker admits that he won’t let his own children play the products he helped create, it forces a hard look at a familiar industry tension: the difference between what makes a game “successful” and what makes it healthy for a child. William Siu’s confession (he co-founded a company whose dozens of hit mobile games were downloaded more than a billion times and generated more than $1 billion in sales) pulls back the curtain on a business that designs deliberate psychological hooks to drive daily use. The mechanics that make games addictive are not accidental; they are tested, measured, and optimized. That deliberate manipulation raises ethical questions about the responsibilities of developers, the vulnerability of children, and what meaningful reform might look like.
How games become addictive
Modern mobile games are built on rigorous data collection and experimentation. Developers instrument every button press, tap, and purchase to learn what keeps players coming back. From those experiments arises a toolkit of powerful engagement levers: intermittent rewards, variable schedules of reinforcement, scarcity mechanics, social pressure, microtransactions, and artificial gating of play (the “lives” mechanic common in casual games). Siu’s cake analogy (offering tiny daily slices instead of a single whole cake) captures the essence. By rationing play and providing precisely timed rewards, developers encourage routine and reduce the cognitive friction that otherwise helps people decide when to stop.
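The “lives” gating mechanic described above can be sketched in a few lines of code. This is a hypothetical illustration, not Storm8’s actual implementation; the class name, the five-life cap, and the 30-minute regeneration interval are all assumptions chosen to mirror the essay’s description.

```python
from dataclasses import dataclass, field
import time

@dataclass
class LivesGate:
    """Hypothetical sketch of a 'lives' gating mechanic: play consumes a
    life, and lives trickle back on a timer, rationing access to the game."""
    max_lives: int = 5
    regen_seconds: float = 30 * 60  # one life back every 30 minutes (illustrative)
    lives: int = 5
    last_regen: float = field(default_factory=time.time)

    def _regenerate(self, now: float) -> None:
        # Grant back one life per elapsed regen interval, capped at max_lives.
        gained = int((now - self.last_regen) // self.regen_seconds)
        if gained:
            self.lives = min(self.max_lives, self.lives + gained)
            self.last_regen += gained * self.regen_seconds

    def try_play(self, now: float = None) -> bool:
        # Returns True and consumes a life if the player may start a round;
        # returns False when the player is locked out until lives regenerate.
        now = time.time() if now is None else now
        self._regenerate(now)
        if self.lives == 0:
            return False
        self.lives -= 1
        return True
```

The point of the sketch is the lockout: after five rounds the player cannot continue by choice, only by waiting (or, in real games, by paying), which is exactly the rationing Siu’s cake analogy describes.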
Psychologists recognize many of these mechanisms. Variable-ratio reward schedules (similar to slot machines) are particularly effective: unpredictable rewards produce persistent checking behavior. Loss aversion and sunk-cost thinking make people more likely to keep playing once they’ve invested time or money. Social mechanics (leaderboards, gifting) and nudges (push notifications) further convert games into habitual smartphone rituals. When deployed at scale, these tools can shape routines and priorities — often without players fully realizing why they are returning.
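The variable-ratio schedule mentioned above can be simulated directly. This is a minimal illustrative sketch (the function name and parameters are my own, not from the essay): each action pays out with a fixed probability, so the long-run payout rate is stable while the gap between any two wins is unpredictable, which is what makes the schedule so compulsive.

```python
import random

def variable_ratio_payouts(n_actions: int, mean_ratio: int, seed: int = 0) -> list:
    """Simulate a variable-ratio reinforcement schedule: each action pays
    out with probability 1/mean_ratio, like a slot-machine pull."""
    rng = random.Random(seed)
    return [rng.random() < 1 / mean_ratio for _ in range(n_actions)]

wins = variable_ratio_payouts(n_actions=2000, mean_ratio=5)

# Measure how irregular the spacing between wins is.
gaps, last = [], -1
for i, won in enumerate(wins):
    if won:
        if last >= 0:
            gaps.append(i - last)
        last = i

print(f"payout rate ~ {sum(wins) / len(wins):.2f}")        # hovers near 1/5
print(f"gaps between wins range from {min(gaps)} to {max(gaps)} actions")
```

Even though the average payout rate is predictable, the wildly varying gaps mean the next reward always feels imminent, and that uncertainty, not the reward itself, drives the persistent checking behavior the essay describes.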
Why a creator would forbid his own children from playing
Siu’s refusal to let his daughters play his games is understandable once you consider what detailed knowledge does to moral calculus. Designers who know precisely how to trigger compulsive engagement see the levers and the likely outcomes: shorter attention spans, disrupted sleep, substitution of passive rewards for more meaningful activities, and financial harms through in-app purchases. Children are particularly vulnerable: their self-regulation and executive function are still developing, they are less able to evaluate long-term tradeoffs, and they are prime targets for FOMO-driven social loops.
For many parents who also create products, the stakes are personal. The quiet admission — “I built these hooks; I won’t expose my kids to them” — signals a recognition that technical capability confers moral responsibility. It’s a protective reaction: if you know the map of the minefield, you avoid it for those you love most.
Industry incentives and ethical blind spots
The business model of many mobile games rewards addiction. Revenue comes from attention, engagement and microtransactions; the longer and more often users play, the more likely they are to buy. Analytics teams measure retention and lifetime value, and A/B tests optimize features that increase those metrics. The result is a feedback loop: successful mechanics are amplified and normalized until they become industry standards.
This dynamic produces ethical blind spots. Techniques that would be questioned in other domains (e.g., deliberately exploiting impulses for profit) are presented as mere “growth hacking” in tech teams. Without normative constraints — developer codes of conduct, platform standards, or regulation — the market will continue to favor maximized engagement over user well-being.
Constructive alternatives and policy options
The fact that games can be habit-forming doesn’t mean all habit-forming design is inherently bad. Some healthy habits — daily practice in educational apps, fitness games that encourage movement — can be beneficial. The difference is intent, transparency and consent. Here are practical steps developers, platforms, parents and policymakers can pursue:
- Ethical design principles: Adopt standards that prioritize autonomy and well-being. Avoid manipulative dark patterns (confusing purchases, disguised timers). Design for meaningful engagement rather than brute-force attention extraction.
- Built-in friction and time-awareness: Provide clear play-time indicators, optional enforced breaks, and easy-to-set daily limits. Allow users (and parents) to opt into features rather than making the most addictive settings the default.
- Transparent monetization: Make costs and probabilities explicit (e.g., odds for loot boxes), and avoid systems that leverage children’s impulsivity for purchases.
- Stronger parental controls: Improve age-gating, require explicit parental consent for in-app purchases, and provide parental dashboards showing usage patterns.
- Platform and regulatory action: App stores can enforce design and disclosure standards. Policymakers can restrict targeted advertising and certain monetization mechanics for children, as has been done in gambling and advertising contexts.
- Education and media literacy: Teach children and parents about persuasive design, impulse control strategies, and how to set healthy digital boundaries.
Conclusion
Siu’s statement is a moral mirror: a reminder that the people who build attention-capturing systems are in a uniquely powerful position to harm or help. Knowing the mechanics behind addiction imposes a duty to act responsibly — both at the individual level (choosing what to expose one’s children to) and at the collective level (changing the incentives that reward manipulative design). Technology doesn’t have to be an arms race for attention. With clearer norms, smarter policy and conscientious design, games can continue to be joyful, creative experiences without becoming traps for children’s time and wallets. If someone who helped build that machinery draws a hard line at his own family’s door, perhaps it’s time the industry and society take that line seriously.