by kublikhan » Fri 22 Jun 2018, 16:06:59
Outcast_Searcher wrote:
onlooker wrote: No, it's a question of who you wish to believe. You refuse to concede that a lot of the info out there is intentionally false and/or misleading or deceptive.
Look in the mirror. The kind of misleading financial blogs you love to quote random doomer "news" from, like Zerohedge, are about as misleading and deceptive as it gets.
Especially if you do a sanity check over time re their accuracy. But I know, the fast crash doomer community isn't interested in accuracy as much as FUD.
Hint: all "info" isn't of equal value in objective terms.
Exactly. Onlooker, there is plenty of misleading information out there. However, that doesn't mean you can latch onto anyone spewing random BS and claim it has just as much validity as credible sources. There are ways to wade through the BS without throwing your arms in the air and saying: "There is so much BS out there! From now on, I am just going to parrot those who say what I want to hear." A good first step would be
The CRAP Test. Another would be to review any counterarguments and use your own brain to decide which argument is more sound. Admittedly, this is more work than reading a headline from a blog and trumpeting "See! The end really is nigh!" But it sure can be helpful in stopping you from being drawn into BS.
Why do people so easily believe false things? There are probably as many answers to this question as there are people who have ever believed falsehoods. Nonetheless, psychologists have shown that a relatively small set of cognitive biases or mental shortcuts can explain a lot about how false notions take root. One of the most agreed-upon ideas in the field of psychology is that people routinely use mental shortcuts to understand what happens around them. All kinds of things occur in the world around us, and we don't always have the time or energy to sit down and carefully examine all of them. So, we tend to use quick and largely unconscious rules-of-thumb to determine what we should believe—and these shortcuts sometimes steer us in the wrong direction. Here are some of the culprits:
The Availability Heuristic
Which job is more dangerous—working as a police officer or a fisherman? If you guessed police officer, you’re wrong. According to figures from the United States Bureau of Labor Statistics, fishing workers are ten times more likely than police to be killed on the job. This doesn't make police work any less important, of course, though it does mean that many of us have underestimated how dangerous other jobs are in comparison. The reason most of us believe that police officers are more likely to die at work is because of the availability heuristic, a mental shortcut that can lead us to overestimate the frequency of an event when that event is more “available” or vivid in our memory. When a police officer is killed in the line of duty, it’s rightly widely reported in the news and sticks with us in memory, so we tend to believe it must be more common than deaths in other professions. The availability heuristic is also the reason why doctors sometimes believe that diseases are more widespread than they really are—their jobs naturally fill their memories with vivid examples. In fact, when any of us read or watch a news story about an instance of terrorism, voter fraud, or other crime, we’re likely to overestimate how common such events are. Unless we’re careful, the vivid nature of the news story in our memory can unconsciously bias our estimate of how often these events actually happen. So, how common are things like voter fraud and crime? We can’t necessarily trust our hunches. It’s best to consult the statistics.
Emotional Reasoning
Whether we like it or not, all of us can be powerfully swayed by emotions. We'd like to think that our feelings are driven by logic and reason, particularly when it comes to our political beliefs. Unfortunately, this relationship is often reversed. Sometimes we end up using our reasoning ability to justify or defend a conclusion that we’ve already drawn based on our emotions. This phenomenon, called emotional reasoning, can lead us astray without our ever knowing. Psychiatrist Aaron T. Beck first noticed this in depressed patients. He observed that many patients drew obviously untrue conclusions about themselves based on how they felt, rather than the actual facts. "If I feel depressed,” one of his patients might say, "then there must be something objectively wrong with my job, my marriage, my children, or other parts of my life." But feelings are just feelings, even when they're powerful, and they can sometimes lie to us. Even in those of us who aren’t depressed, this tendency can affect our beliefs about virtually any emotionally charged topic, whether we’re talking about sexuality, religion, money, crime, or war. When we feel scared, angry, anxious, or even just uneasy about a topic, we can easily jump to the conclusion that the topic is somehow objectively bad or dangerous. Next time a topic makes you feel uncomfortable, that’s probably reason to keep an open mind, not to draw a conclusion.
Confirmation Bias
Once we have a belief, we tend to cling to it, even when it’s untrue. The confirmation bias is the tendency to seek out information that supports what we already believe. We do this in two important ways. First, we tend to surround ourselves with messages that confirm our pre-existing opinions. This is why, in the United States, conservatives tend to get their news from sources like Fox, whereas liberals tune into MSNBC. Second, we tend to ignore or discount messages that disprove our beliefs. If we’re sure that climate change is a hoax and someone shows us a research study disputing this belief, we might dismiss the study’s findings by saying that the researcher is obviously biased or corrupt. This protects us from having to change our beliefs. When our ideas are true, this probably isn’t such a bad thing. Unfortunately, it also can keep us firmly believing things that are false.
While it’s clear that some people lie out of expedience or spite, most of us value the truth. We genuinely desire to accurately understand the facts and help others to do the same. As flawed human beings, however, none of us is a perfect barometer of the truth. Despite our best intentions, it’s easy to unconsciously buy into beliefs that feel right, even though they’re not. But it’s precisely when we’re sure that we’ve cornered the truth that we should take a step back, breathe deeply, and open our minds as far as we can. If we were all able to take this basic truth about human nature to heart, perhaps this would allow us to more effectively come together during times of political strife.
Why Do People Believe Things that Aren’t True?
This guy fell into the trap of believing what he wanted to believe too. However, just to try and be balanced, he looked at dissenting voices as well. What he found shocked him. He did a complete 180 on his position after he found out that the sweet words he was hearing at first were all BS.
Dear Editor,
I run Macrotrends, a financial newsletter that's doing quite well here in Belgium and the Netherlands, and I am always curious about the next big thing.
I discovered the story about Andrea Rossi's Energy Catalyzer about a year ago, and I thought this might just be the thing for the energy sector. I wrote a small article about it in August 2011, and I promised to give an update after the E-Cat tests that were to be announced in late October 2011. The tests weren't convincing, so I waited, all the while following the information published on ecatnews.com and some other sources.
Then, finally, last month I decided to do a follow-up because there seemed to have been some important developments: claims of the involvement of NASA, SIEMENS, National Instruments, production of the units, the Defkalion story, etc. It all made me very excited about the E-Cat.
After making preparations for an article about Rossi's apparatus for a day and a half, I decided to look for some skeptical voices to try and cross-check my story and make it more balanced. That is how I found the New Energy Times Web site. Your writing style struck me as very reasonable yet critical, and the thoroughness of your arguments and the level of documentation entirely swayed my position.
It hurt, because I had to let go of a dream I'd fallen in love with. I had to admit that I had not found the extraordinary evidence that is required to back up the extraordinary claims put forward by Rossi and his team. Quite the contrary: There are many things I’ve discovered that seem to suggest the E-Cat is, in reality, a fraudulent scheme.