You know what I don't get? Okay, lemme start over. I just saw a commercial for Macy's with lots of celebrities saying things like, "There is a Santa Claus" and "Without Santa Claus, we have no hope in the world."
That sucks, because I don't believe in Santa Claus, and in fact I'm willing to bet 98% or more of all adults don't believe in ol' Father Christmas either.
So this makes me wonder: why do parents go to such lengths to make their kids believe in some fat jolly guy who brings presents to good kids for no good reason? Why must we force children into believing bullshit, knowing full well that eventually they'll find out the truth? Kids usually only believe in Santa Claus for two or three years anyway. Why in the world do parents do this?
Well, let me tell you something: if I had kids, I would NEVER tell them that Santa Claus is real. It's a bald-faced lie, period. We all know the kid will eventually accept that there is no Santa, so why tell them there is one to begin with? It's utter bullshit.
Can't we teach kids that they're getting gifts from their parents because they love them and want them to be happy, not because some fat guy who doesn't know them is gonna break into their home and leave questionably wrapped packages under a killed tree (or a fake one, depending)?
When I was a kid, my parents honestly wrapped presents "From: Santa Claus." I remember at an early age saying something to the effect of, "Really? Isn't it better for me to appreciate the gifts that my parents purchased and wrapped for me rather than some made-up guy?"
I say end the Father Christmas delusion! Teach kids about family, togetherness, goodwill towards men or, I dunno, maybe even the story of Christmas. Fuck the bullshit lies that parents force down their kids' throats. Can't we be honest with our kids FOR ONCE?