If this feels like an interview, that’s because it kind of is.
— Let's chat about Opinionated
What surprised you about how people actually used Opinionated once it took off?
What surprised me most was how quickly people formed real connections. We expected fast reactions and casual feedback, but users started recognizing each other through repeated questions and responses.

Over time, those interactions turned into familiarity and even friendships. People came back not just to ask questions, but to see how the same group would respond. That shift from anonymous feedback to a sense of community was unexpected and ended up being one of the most meaningful parts of the product.
How did you think about moderation and safety in a live, opinion-driven environment?
Moderation was one of the hardest parts of the product. A live, opinion-driven environment moves fast, and fully manual moderation was expensive and simply didn't scale.

Image sharing made the problem significantly harder, so we made the call to move to text-only. That shift reduced risk and allowed us to rely more on AI for real-time moderation. It was not perfect, but it was the right trade-off to keep the space safe and usable.
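To make that concrete, here is a minimal sketch of what text-only, real-time moderation can look like, assuming an OpenAI-style moderation endpoint. The provider, model name, and publish/hold handling are illustrative assumptions, not a description of Opinionated's actual stack.

```python
# A minimal sketch of real-time text moderation, assuming an
# OpenAI-style moderation endpoint. Provider and model are
# illustrative; Opinionated's actual pipeline isn't public.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment


def safe_to_publish(text: str) -> bool:
    """Return True if the moderation model does not flag the text."""
    resp = client.moderations.create(
        model="omni-moderation-latest",
        input=text,
    )
    # Flagged content is held back instead of going live.
    return not resp.results[0].flagged


if __name__ == "__main__":
    question = "Which of these two taglines sounds more trustworthy?"
    print("publish" if safe_to_publish(question) else "hold for review")
```

The appeal of this shape is that dropping images collapses the problem to a single text classification call per question, cheap enough to run on every submission before it reaches the live feed.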
When did you realize Opinionated was becoming social, not just interactive?
We realized it when people started coming back for each other, not just for answers. Users recognized familiar names, followed certain voices, and responded to patterns in how others asked and answered questions.

At that point, it was no longer just about quick feedback. It had become a shared space where people felt seen and connected, even though the interactions were simple and lightweight.
What did you not expect people to care about as much as they did?
I did not expect recognition to matter as much as it did. Even though the interactions were simple, people cared about being seen and remembered.

Noticing familiar names and repeated interactions turned quick feedback into something personal, and that recognition became a big reason people came back.
What almost broke the product, and how did you catch it in time?
Moderation at scale almost broke the product. As usage grew, the cost and complexity of keeping the space safe grew even faster, and low-quality or bad-faith questions amplified the problem.

We caught it by realizing that safety and question quality were tied together. Tightening moderation, moving away from images, leaning on AI, and setting clearer expectations for what made a good question helped keep the product usable and the community healthy.
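Since safety and question quality were tied together, the natural shape is a cheap quality pre-filter that runs before the safety check. The heuristics and thresholds below are hypothetical guesses for illustration, not the rules Opinionated actually shipped.

```python
# A hypothetical quality pre-filter, run before safety moderation.
# Thresholds and patterns are illustrative assumptions only.
import re

MIN_LENGTH = 15  # assumed: very short prompts rarely draw useful answers
SPAM_PATTERNS = [  # assumed examples of bad-faith submissions
    re.compile(r"\b(follow|sub(scribe)?)\s+(me|back)\b", re.IGNORECASE),
    re.compile(r"https?://", re.IGNORECASE),
]


def looks_like_good_question(text: str) -> bool:
    """Cheap heuristics to set a quality bar before a question goes live."""
    text = text.strip()
    if len(text) < MIN_LENGTH:
        return False
    if not text.endswith("?"):
        return False  # nudge people toward asking actual questions
    if any(p.search(text) for p in SPAM_PATTERNS):
        return False
    return True
```

Filtering for quality first means the paid moderation call only runs on questions worth publishing, which is one way the cost curve can be kept below the usage curve.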