Oh man, you are committing some classic mistakes. You think every publication is good? Get a better grasp of how research actually works, and then go read the studies. There is a lot of dogshit out there, buddy, and a lot of excited people, and people tooting their own horn for funding.
Now, I understand that many of you may be lost, especially with the advent of the commercialized web, which has a tendency to turn otherwise great publications into a free-for-all advertising cash grab.
Which publications can you trust? Right?
So, with that in mind, never ever fucking accept anything someone says on the web, unless they link to the authoritative source where they got the information from.
It doesn't have to be a de facto authority like the Smithsonian or nature.com, but you must know where the information originally came from; otherwise you will never understand how issues such as economics and religious politics have distorted the source information.
If someone is preaching about cancer cures or new health fads, and they are not linking to the scientific journals that actually test those claims, then don't bother reading it.
Hyperlinks are the most basic mechanism by which the web works. Scratch that: they are the very basis of how information and modern science work.
http://en.wikipedia.org/wiki/Hyperlink#History
The term "hyperlink" was coined in 1965 (or possibly 1964) by Ted Nelson at the start of Project Xanadu. Nelson had been inspired by "As We May Think", a popular 1945 essay by Vannevar Bush. In the essay, Bush described a microfilm-based machine (the Memex) in which one could link any two pages of information into a "trail" of related information, and then scroll back and forth among pages in a trail as if they were on a single microfilm reel.
I said it didn't need to come from a de facto authority; I said you need to know where the information came from so you can understand economic, religious, or other biases.
Sometimes a bias is right, but sometimes, it's not.
I don't know how to tell you this in a nice way, so I'll tell you directly: don't let envy of what you see as the secret behind other people's success blind you and twist your ego into a knot of jealousy.
If it's envy, don't worry, you can probably find social mobility in a different country, right? Or do what I do: understand the nightmare those people have dug themselves so deeply into.
Seriously, it is easier to measure politics than it is to wade through the papers for potential flaws in methods.
People beat the crap out of each other over the accuracy of methods.
If it's a one-time study, you need to focus on the methods. If it's a study repeated by several different institutions with varying political alignments, well, it's not 100%, but it's sometimes more substantial.
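To put a rough number on why replication is more substantial, here is a minimal sketch of a fixed-effect (inverse-variance) meta-analysis in Python. The lab names, effect sizes, and standard errors are invented purely for illustration, not taken from any real studies; the point is that pooling independent results shrinks the uncertainty below that of any single study.

    import math

    # Hypothetical effect estimates (log relative risk) and standard errors
    # from three invented labs -- illustration only, not real data.
    studies = [
        {"lab": "Lab A", "effect": 0.35, "se": 0.20},
        {"lab": "Lab B", "effect": 0.28, "se": 0.25},
        {"lab": "Lab C", "effect": 0.40, "se": 0.22},
    ]

    # Fixed-effect meta-analysis: weight each study by 1 / se^2.
    weights = [1.0 / s["se"] ** 2 for s in studies]
    pooled_effect = sum(w * s["effect"] for w, s in zip(weights, studies)) / sum(weights)
    pooled_se = math.sqrt(1.0 / sum(weights))

    for s in studies:
        print(f"{s['lab']}: effect={s['effect']:.2f}, se={s['se']:.2f}")
    print(f"pooled: effect={pooled_effect:.2f}, se={pooled_se:.2f}")
    # The pooled standard error ends up smaller than any single study's,
    # which is the statistical sense in which replication is "more substantial".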
Don't get me started on rubbing rats with pure nicotine to discover where the tumors grow.
LD50!
http://whyquit.com/pr/041812.html
"We demonstrated for the first time that chronic treatment of A/J mice with an LD50 dose of nicotine cause carcinogenic transformation of both smooth and striated muscles as well as transient hair loss," wrote University of California Ervine researchers in their new study published online at Life Sciences.[1]
http://www.sciencebasedmedicine.org/everything-causes-cancer/
It is helpful to have published evidence and statistics to back up my casual observation – that nearly all foods are touted as having health benefits or risks. Earlier this year Schoenfeld and Ioannidis did just that. They selected 50 common ingredients at random out of cookbooks, then scoured the literature looking for studies showing an association (positive or negative) with cancer. They found that 80% of the ingredients had such published studies:
Forty ingredients (80%) had articles reporting on their cancer risk. Of 264 single-study assessments, 191 (72%) concluded that the tested food was associated with an increased (n = 103) or a decreased (n = 88) risk; 75% of the risk estimates had weak (0.05 > P ≥ 0.001) or no statistical (P > 0.05) significance. Statistically significant results were more likely than nonsignificant findings to be published in the study abstract than in only the full text (P < 0.0001). Meta-analyses (n = 36) presented more conservative results; only 13 (26%) reported an increased (n = 4) or a decreased (n = 9) risk (6 had more than weak statistical support). The median RRs (IQRs) for studies that concluded an increased or a decreased risk were 2.20 (1.60, 3.44) and 0.52 (0.39, 0.66), respectively. The RRs from the meta-analyses were on average null (median: 0.96; IQR: 0.85, 1.10).
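Just to sanity-check the arithmetic in that abstract, here is the back-of-the-envelope in Python. Every number below is taken straight from the quoted text; nothing is re-analyzed.

    # Numbers quoted from Schoenfeld & Ioannidis above -- arithmetic check only.
    total_assessments = 264
    increased = 103
    decreased = 88

    concluded_risk = increased + decreased               # 191 assessments
    share = 100 * concluded_risk / total_assessments     # ~72%
    print(concluded_risk, round(share))                  # 191 72

    # Single studies claiming an increased risk had a median relative risk of
    # 2.20 (roughly "double the risk"), while the meta-analyses came out at a
    # median RR of 0.96, which is essentially no association (RR = 1.0 would
    # mean exactly no effect).
    print(2.20, 0.96)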
It's your lymphatics. Go watch a biology course.
Personally, I think it's more like the longer you are on the internet, the less of an idiot you become.
So there was a recent influx of new people to the web, and I mean massive waves back around the Facebook party. Some dug a little deeper; some went back to TV dreamland.