>humans fundamentally don't really have those either
A well-reproduced finding in social science is that slight variations in the phrasing of a poll or survey question can cause dramatic changes in the results.
This is just an empirical observation, and there are probably a lot of underlying reasons for it, but I do think one of them is that most humans don't have consistent well-considered opinions on most subjects. When we're asked our opinion on a subject that we haven't thought about a lot, we default to what Daniel Kahneman calls "System 1" (fast/intuitive) thinking, which is highly susceptible to priming effects and other distortions.
That said, it's not true that none of us have consistent beliefs on any subject. It's just that forming and applying a coherent belief system requires effort, and we only put in that effort on topics we care about and in contexts where we think it matters.
Here's a great book on decision making that may give you some insight. I think there's a real difference between taking risks and being foolhardy.
There are lots of risks in life where you are an absolute underdog, but the reward justifies the investment. For example, calling a $20 bet into a pot where you stand to win $160, when your odds of winning are about 5/1 against, is a calculated risk, but a good one. The pot is paying you 8/1, while 5/1 odds would only require a $100 pot to break even, so you're getting a $60 edge over break-even.
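If it helps, here's a quick back-of-the-envelope way to check that kind of bet. This is just a minimal Python sketch I put together for this comment (the function `call_ev` and its parameter names are illustrative, not from any poker library), using the numbers from the example above:

```python
# Expected-value check for the pot-odds example above:
# calling a $20 bet to win a $160 pot at roughly 5-to-1 odds against.

def call_ev(bet: float, pot: float, odds_against: float) -> float:
    """Expected value of calling `bet` to win `pot` at odds_against-to-1 odds."""
    p_win = 1.0 / (odds_against + 1.0)        # 5-to-1 against -> 1/6 chance to win
    return p_win * pot - (1.0 - p_win) * bet  # expected winnings minus expected loss

break_even_pot = 5 * 20                       # at 5/1 odds you need $100 back to break even
print(f"Edge over break-even pot: ${160 - break_even_pot}")   # $60
print(f"EV of calling: ${call_ev(20, 160, 5):.2f} per hand")  # about $10 per hand
```

A positive EV is exactly the "repeated investment yields more than you lose in the long run" condition described below.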
Another example: you have the correct information and make a sound decision, but the outcome is still bad. It was a good risk, but outside factors changed your result. Say you move to a great neighborhood because the schools are fantastic, but the state redistricts it out of the blue and now your kids are going to a lesser school. Good decision, bad outcome.
The key with all risk is to understand the possible outcomes and to size your investment so that, in the long run, repeated bets of that kind yield more than you lose.
It's from Thinking Fast and Slow by Daniel Kahneman. Really great book!
No. OP is asking for "red flags," which are objective indications of something deeper.
What I'm saying is that this is inherently judgmental, and that we, as humans, are poor at making judgments. It's much better to simply ask someone directly. That way, things are less likely to be lost in translation.
Don't take my word for it. There's plenty of great research out there, some of which has been collected in these books:
Steven Weinberg is using "religion" when he actually means something more like "ideology", and even "ideology" is too narrow. The only ingredient you need for a person to do evil is for them to believe that something they value is threatened by someone who differs in some way.
Buddhists are committing atrocities against Muslims in Myanmar. Communists committed atrocities (against Christians and Jews, among others) in the Soviet Union. Russians are committing atrocities against civilians (regardless of faith) in Ukraine. Romans committed atrocities against Carthage. And Christians have committed their fair share of atrocities, like the Albigensian Crusade. Nobody ever claimed that Christians are good at being Christian.
In fact, another of the core tenets of Christianity is that people, in general, will fail to do the right thing at least some of the time. That particular tenet does happen to be empirically verifiable.
Books like Thinking Fast and Slow and The Worm at the Core are popular-level science books about why humans fail to make ethical, rational decisions, even when given all of the information they would need to make them.
I do not recommend this book. Funny how the Nobel Prize winner, who in his own book warns about research based on small samples, made that very mistake himself.
> readers of his [Kahneman’s] book “Thinking Fast and Slow” should not consider the presented studies as scientific evidence that subtle cues in their environment can have strong effects on their behavior outside their awareness.
of course "thinking fast and slow" about how the corporations make you buy https://www.amazon.com/Thinking-Fast-Slow-Daniel-Kahneman-ebook/dp/B00555X8OA/ref=sr_1_1?ie=UTF8&qid=1529391517&sr=8-1&keywords=thinking+fast+and+slow
> I've come to believe that the majority of what you think you think is the result of unconscious thought, and only the tip of the thought iceberg is conscious, and it is one of the great blind spots of well over 90% of the scientific establishment (and philosophic!)
I have no idea how you came to that view of the scientific/philosophical establishment:
> Logic and facts
Something else you may like in this journey: Thinking Fast and Slow
My impression is that as much as we idealize rationality, "reasoning" in practice always takes place within some emotional context; there is always a reason or objective for why we are trying to make a rational argument, and such context will always influence the shape of our rationality in some way.
That may just sound like "bias", but the idea is presented in a more thought-provoking and humanist way.
Let me write this one more time, since you read it and didn't understand: there is no way to determine whether this development would have been superior or inferior to the current one without speculating. You are doing exactly what you ironically try to criticize by copying my sentence: speculating in favor of your own political position. That's natural, but it's still an error you should recognize and correct yourself.
Historically, human beings are terrible at making predictions about the future from their limited knowledge of the world, because of our biases and our deficiency at thinking statistically. Whether you like it or not, that's a fact. If you want to learn more, I suggest reading Thinking, Fast and Slow by Daniel Kahneman, a psychologist who won the Nobel Prize in economics.
Awaiting the downvotes of selective ignorance.
Yeah man, I am. Myself included. It doesn't need to be disproven, and it's not just my personal opinion; the science behind it supports it too.* And it's not limited to race: everyone has biases based on sex, ethnicity, orientation, etc.
I think the part you're misunderstanding is this: you think I'm saying that, because of that, everyone with any form of bias should be drawn and quartered. Instead, what I'm saying is that we should all do what we can to understand our biases better and actively try to limit them, in order to improve society for all of us. Every single one of us, including you and me (regardless of our race and gender), is being harmed by this too. It's just that certain groups have been harmed by it much more than others, so we're more sensitive to those instances and call them out more.
"*" If you're interested in reading more on the science behind it, I really recommend the book "Thinking, Fast and Slow". I'll openly admit that I was far more in line with your views before I read it, but it really changed my mind and over time I started to notice more and more instances of this kind of bias. Once you start noticing it, it's really hard to unsee. It's difficult, because with myself it made me notice all the times that I'd let my biases cause me to mistreat people, but it was worth it in the end.
This sub is very hostile to opinions and information it disagrees with, which is generally a sign of low education, low intelligence, or strong System 1 control.
I know you're kind of joking/light-hearted, which is why I've chosen to respond to you.
Winning a Super Bowl in your first year as a head coach is obviously a strong indication that you're going to be a good coach. However, there are coaches who won Super Bowls and were then proven to be less than great coaches in the NFL. Barry Switzer, Don McCafferty, Jon Gruden, Pete Carroll (?), Mike Ditka... all of them won a single Super Bowl, and none of them has an amazing coaching legacy. Pete Carroll's is still undecided. I guess Gruden's technically is too.
Doug Pederson seems like a really good coach to me. However, there's no way there's enough information on him yet to say that he's a net positive reflection on Andy Reid's coaching tree.