
Some foods taste better when eaten with your hands. Ancestrally, our people prepared and ate their dishes by hand, until colonizers forced utensils down our throats and deemed the practice “uncivilized.”
But this couldn’t be further from the truth.
Recent research suggests that eating with your hands makes food taste better and the experience feel more satisfying. It may also improve digestion, prevent mouth burn, and encourage mindful eating.
Food has always been a communal thing for our people: a reason to gather, talk, share, and laugh. And many believe that eating with your hands only adds to the connection we feel to our food and the people we’re breaking bread with.
Many African cultures today still eat with their hands, from Ghana to Ethiopia. So why have some of us been made to think that practice is wrong?
Dehumanization was a huge component of the enslavement and colonization processes. Colonizers intentionally shamed us out of our cultures and traditions, and the effects have lasted for generations.
Whether we decide to ditch the fork or not, we should question the things we were forcefully taught to see as wrong. We deserve to experience the world for ourselves, not through a filtered white supremacist lens hellbent on erasing our Blackness.