I’m trying to get perspective on this particular beauty standard and how I want to approach it. Do people whiten their teeth where you live? Is it seen as expected to do so? Do you live in a city?

I have healthy teeth that have nevertheless seen a lot of tea and coffee. I have generally thought of this as similar to wrinkles, i.e. a natural thing bodies do that I don’t want to pay money to fix since it isn’t broken. I still think this. But I have been feeling lately like there might be more actual social stigma to my teeth being discolored. I am wondering if this is at all real? Has whitening teeth become an expected thing for all adults to do now? I thought I’d ask how other people feel and think about this and what the general norm is in your social circle.

Edit: thanks for the responses everybody.

  • 😈MedicPig🐷BabySaver😈
    2 years ago

    I use the strips twice a year. Also, whitening toothpaste.

    I’d say it’s popular/normal in my area. Southern New England.

    • ShepherdPie@midwest.social
      2 years ago

      I think you’re generally supposed to avoid whitening toothpaste, as it essentially works by sanding down your enamel, which can cause issues long-term.