Seeing all the negative propaganda that guns have received over the past 30 years from the liberal media and gun grabbers, I've wondered why no one has decided to tell the truth.
Seeing that they rely on lies, I feel like a single solid documentary could blow the truth wide open and change millions of opinions. We have so many anti-establishment documentaries that tell the truth about health care, insurance, and so on...
All this without even being biased, just telling the damn truth... I feel like it could make some up-and-coming reporter's career.
Why hasn't anyone tried it yet?

