This is the syllogism they are using.
A) Black people use this social program.
B) Republicans want to cut this social program.
Therefore, Republicans want to cut social programs because Black people use them.
So if you want to cut welfare, you hate black people. If you point out that the large majority of people who use welfare in America are not black, you hear, "DISPROPORTIONATELY BLACK." These people have no idea what a proportion is or what it should mean. Ask someone, "What is the proper proportion that something should be of another thing?" They have no idea and can't unpack it.
Bonus points: if instead of talking about disproportionate numbers of black people they say "people of color," you can ask them how they got so racist that they would use an old-timey racist phrase like "colored people" AND lump all races into one group, as if Cubans, Italians, Arabs, Indians, African Americans, and Koreans are all the same, have no special identity that matters to them, and share only the feature of not being white. Sounds like a white supremacist talking point, doesn't it?
Edit: People should actually just read this book instead of launching misinformed and unrelated arguments against what I said. It turns out the professional academic goes into a little more detail and cites extensive sources, unlike my exceedingly brief reddit comment. Literally, go read a book. It's very short, because half of it is citations.
Feku & Co: Exclusive purveyors of mis-truths in India, and masters in How to Lie with Statistics
1) When did optical-fibre technology start picking up pace in India?
2) Can Feku & Co, unmatched experts in pakoda economics, speak about the S-curve in technology adoption?
3) Here is a relevant news piece from October 2012: **Super optic highway to connect 2.5 lakh villages**: 3 pilot projects with 61 gram panchayats across Ajmer, Vizag and North Tripura by October.
> make his claims but not provide source code showing how a bias could be hidden in an algorithm without it being immediately obvious to many coders at Google
Because with machine learning and AI, even the developers don't understand how the decisions are made.
You should read Weapons of Math Destruction by Cathy O'Neil, which goes into how biased training data, biased programmers, and so on can result in biased algorithms. It's pretty fascinating.
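To make that concrete, here's a rough sketch of how a model can pick up bias without anyone writing biased code. Everything below is synthetic, and the feature names (`zip_code_group`, `income`) are hypothetical, not anything from the book:

```python
# Minimal sketch: a model can absorb bias from its training labels even when
# the "protected" attribute is never given to it, because a correlated proxy
# (here, zip_code_group) leaks the same information. Synthetic data only.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 10_000

group = rng.integers(0, 2, n)                          # protected attribute (never a model input)
zip_code_group = (group + (rng.random(n) < 0.1)) % 2   # proxy strongly correlated with group
income = rng.normal(55, 10, n)                         # legitimate signal

# Historical approval decisions: driven by income, but with a penalty applied
# to group 1 -- this is the bias baked into the training labels.
approved = (income - 15 * group + rng.normal(0, 5, n)) > 40

# Train only on "neutral" features: income and the proxy.
X = np.column_stack([income, zip_code_group])
model = LogisticRegression().fit(X, approved)

# Predicted approval rates by group show the bias was reproduced, not removed.
pred = model.predict(X)
for g in (0, 1):
    print(f"group {g}: predicted approval rate = {pred[group == g].mean():.2f}")
```

The bias never shows up in the code itself; it lives in the historical labels and leaks through the proxy, which is why a code review alone wouldn't catch it.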
I had a high school math teacher who always told us to be wary of statistics, and he even had a book he shared:
How to Lie with Statistics https://www.amazon.com/dp/0393310728/ref=cm_sw_r_cp_api_i_mfP4DbK80WD65
The Book of Why by Judea Pearl
"Correlation is not causation." This mantra, chanted by scientists for more than a century, has led to a virtual prohibition on causal talk. Today, that taboo is dead. The causal revolution, instigated by Judea Pearl and his colleagues, has cut through a century of confusion and established causality--the study of cause and effect--on a firm scientific basis. His work explains how we can know easy things, like whether it was rain or a sprinkler that made a sidewalk wet; and how to answer hard questions, like whether a drug cured an illness. Pearl's work enables us to know not just whether one thing causes another: it lets us explore the world that is and the worlds that could have been. It shows us the essence of human thought and key to artificial intelligence. Anyone who wants to understand either needs The Book of Why.
> My boss is an understanding person and knows that we're stressed, but the larger organization seems uninterested in reorganizing to lessen our burden.
That's all you really need to know. You expressed a concern about the health of the team(s), and the broader org said, "No, this is fine." They can live with all the benefits and consequences that come with that decision. The only question left is whether you can live with those benefits and consequences too.
> Are most jobs like this?
I would say no, but practices that promote burnout aren't exactly uncommon -- toil is one example.
It's not uncommon for organizational practices/structures to foster high levels of burnout, but most orgs who give a shit will tend to fix those problems because turnover tends to be more expensive than simply fixing the problems that cause the turnover. Kinda sorta depends on the business's priorities, though. Showing the value of strategic investment in technical resources is ... difficult at times. I like the approach taken by Accelerate -- numbers and figures are what your manager needs to be focusing on, though it is hard to do when you're drowning already and engagement from leadership is low to non-existent anyway.
If every Republican read How to Lie with Statistics by Darrell Huff (1954), Fox viewership would drop. Hell, Democrats, or anyone for that matter, should read this. It makes trusting a news source a lot harder when you immediately pick out the devious tricks used to engineer partial truths.
One of the best books I ever read was "How to Lie With Statistics". Although the examples are dated, the basics are still very valid.
I agree with the overall point of learning and continuous improvement, but I think a lot of common sense and research indicates that Mean Time to Restore is a very important metric to measure and improve. And if I had to choose, I would definitely pick Mean Time to Restore over Mean Time to Retrospective. If you can measure both, great.
As an example, Time to Restore is one of the four metrics that make up Software Delivery and Operational Performance, which predicts organizational performance, as shown in the State of DevOps reports and the related Accelerate book.
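For what it's worth, the metric itself is trivial to compute once you have incident timestamps. Here's a minimal sketch with made-up data, just to show the arithmetic rather than anything from the book:

```python
# Minimal sketch: Mean Time to Restore from incident records.
# The incidents and timestamps below are made up.
from datetime import datetime
from statistics import mean

incidents = [
    # (service went down,   service restored)
    ("2023-01-05 14:02", "2023-01-05 14:40"),
    ("2023-02-11 09:15", "2023-02-11 12:03"),
    ("2023-03-20 22:47", "2023-03-21 00:10"),
]

fmt = "%Y-%m-%d %H:%M"
durations_min = [
    (datetime.strptime(end, fmt) - datetime.strptime(start, fmt)).total_seconds() / 60
    for start, end in incidents
]

print(f"Mean Time to Restore: {mean(durations_min):.0f} minutes")
```

The hard part isn't the math; it's collecting honest incident data and actually acting on the trend.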
It's not an oxymoron. You can have two sets of true facts about a situation which, depending on which things are emphasized, give you entirely different narratives of how things are going.
One could say, for instance, "Murders in Chicago are up 1100% this year! Highest number of deaths on record!" And that would be one set of facts.
One could also say, "There were 12 murders in Chicago, compared to 1 last year. The 10-year average is about 10 murders per year, so this isn't outside the expected range. Also, it's a city of 3 million, so this is in fact a fairly low number of murders on a per capita basis."
Those are two sets of facts reporting on the same situation, both true, but they paint entirely different narratives about how to feel about it.
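To spell out the arithmetic behind both framings (same made-up numbers as above):

```python
# Two framings of the same made-up numbers.
murders_this_year = 12
murders_last_year = 1
ten_year_average = 10
population = 3_000_000

# Framing 1: year-over-year percent increase.
pct_increase = (murders_this_year - murders_last_year) / murders_last_year * 100
print(f"Murders up {pct_increase:.0f}% over last year!")        # 1100%

# Framing 2: rate per 100,000 residents vs. the long-run average.
rate = murders_this_year / population * 100_000
avg_rate = ten_year_average / population * 100_000
print(f"{rate:.2f} murders per 100k this year vs. {avg_rate:.2f} per 100k on average")
```

Both outputs come from the exact same inputs; only the choice of denominator and baseline changes.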
What this shows is that you cannot simply say, "Well, just report the facts, and everything else will work itself out." There is a meta element to this: you choose which things to emphasize and which to de-emphasize, and by that choice of emphasis you inevitably shape the way things are understood, because humans understand events not as a bulleted list of facts but as a narrative, a story.
You have to take these things into consideration when you're talking about media and the spread of information, especially political information.
For more fun like this, check out How to Lie with Statistics.
It's not a big deal to be lazy about getting directions to the bank. Algorithms are genuinely dangerous right now, already, because people feed poorly curated data into an equation that's almost entirely unrelated to what the powerful people pretend it's doing, and then nobody bothers to check whether it's getting good results until a lot of lives have been ruined: people denied loans, people accidentally put on the no-fly list, people constantly tailed by police with a grudge because the math says red cars in their zip code are all criminals.
You can absolutely go faster. Not by saying "go faster," but there are practices and organizational techniques that make quantifiable differences in how fast you can deliver software. A lot of it's stuff we've already heard of: good version control practices, CI/CD, test automation, rapid feedback cycles, limiting work in progress, good communication, keeping processes lightweight, etc. There's even a research-backed book that delves into this.
Possibly the worst set of graphs I've ever seen
https://www.amazon.com/How-Lie-Statistics-Darrell-Huff/dp/0393310728
THIS! Here's the link on Amazon
with you. He’s great, and “underachieving” in the tournament is really this:
edit: also, thanks for McKoy. Most UNC fans don’t realize what a contributor he’s going to be.
Here, you obviously need to read this https://www.amazon.com/How-Lie-Statistics-Darrell-Huff/dp/0393310728?ref_=d6k_applink_bb_dls_failed
The fact is that masks were made political and there are politically motivated people funding and doing research. u/Kryloxy is correct that there are studies that show masks are somewhat useful and a bunch that show they aren't. I think the best summary of this whole situation comes from the University of Oxford's Centre for Evidence-Based Medicine.
"The increasing polarised and politicised views on whether to wear masks in public during the current COVID-19 crisis hides a bitter truth on the state of contemporary research and the value we pose on clinical evidence to guide our decisions...There is considerable uncertainty as to the value of wearing masks. For instance, high rates of infection with cloth masks could be due to harms caused by cloth masks...The numerous systematic reviews that have been recently published all include the same evidence base so unsurprisingly, broadly reach the same conclusions. However, recent reviews using lower quality evidence found masks to be effective...This abandonment of the scientific modus operandi and lack of foresight has left the field wide open for the play of opinions, radical views and political influence."
Thanks; and you are welcome. I've read (and highly recommend) Taleb's *Fooled By Randomness*, however, and recognize that a repeated Monte Carlo simulation of the past 20 years of my life would produce vastly different results. In truth, I have very little idea how it all happened, but I can at least - thus far - confidently recommend the four things I wrote above in the post :)
> The actual shame here is Newsweek reporting on that tiny slope at the end of the graph.
My business statistics teacher in college was garbage, but the best thing he ever did was require us to buy How To Lie With Statistics as one of the textbooks.
It's a short book, written in 1954, and yet it is remarkably applicable today. It goes through the most common ways advertisers, politicians, salespeople, etc. use misleading graphs, charts, or cherry-picked statistics to lie. And once you've read it, you notice it EVERYWHERE.
But what is your response to my comments?
Also, who are the "experts" who published the diagram? Finally, go read this book:
https://www.amazon.com/How-Lie-Statistics-Darrell-Huff/dp/0393310728
That's not how causality works.
Causal modelling is a whole rabbit hole. Check out this book for an intro: Judea Pearl - The Book of Why
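For a quick taste of why plain correlation misleads, here's a minimal simulated confounder example -- the classic setup, not anything taken from Pearl's book specifically:

```python
# Minimal sketch: a confounder z drives both x and y, so x and y are strongly
# correlated even though x has no causal effect on y. Regressing y on both
# x and z ("adjusting for z") recovers the true effect of ~0.
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

z = rng.normal(size=n)            # confounder (think: hot weather)
x = 2 * z + rng.normal(size=n)    # e.g., ice cream sales -- caused by z
y = 3 * z + rng.normal(size=n)    # e.g., drownings -- also caused by z, not by x

print("corr(x, y):", round(np.corrcoef(x, y)[0, 1], 2))            # large, ~0.85

coef_naive, *_ = np.linalg.lstsq(np.column_stack([x, np.ones(n)]), y, rcond=None)
coef_adj, *_ = np.linalg.lstsq(np.column_stack([x, z, np.ones(n)]), y, rcond=None)

print("effect of x on y, ignoring z:   ", round(coef_naive[0], 2))  # ~1.2 (spurious)
print("effect of x on y, adjusting for z:", round(coef_adj[0], 2))  # ~0.0
```

Knowing *when* adjusting for a variable is the right move (and when it makes things worse) is exactly the kind of question the book walks through.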
How to Lie With Statistics by Darrell Huff
It's a pretty quick read about how true information can be used in misleading ways.
Don't walk, run to... the Cartoon Guide to Statistics. No joke! It saved my bacon in grad school. Not sure how it will look as a cited work, though.
I recommend this book and this book.
They won't make you money, but they will help you not lose it.
Helpful and interesting, but I'm still left wondering what I'm supposed to do with the information, you know? I mean, yes, I'm consistently seeing evidence of leptokurtosis (now that I'm looking) as well as skew (which was less surprising to me), but I feel like a monkey that just got its hands on a wrench. Clearly I'm supposed to do something with this, but bashing it against a rock doesn't appear to be doing anything.
I've also been seeing a number of backtests (namely by u/spintwig), showing that selling 30 delta options appears to be profitable overall, which would line up with the leptokurtosis and skew I'm seeing. But other than seeing that both appear to be pointing in the same direction, I feel too dumb to know what else I'm supposed to do, like holding two pieces of a jigsaw puzzle but not knowing what picture I'm putting together. Like maybe it's indicating a more profitable strategy is a 38 delta strategy, or a 25 delta. Or maybe a 38 delta short and a 1 delta long spread (to try and capture the lower volume of intermediate days, but higher volume of large days, that the leptokurtosis is showing). IDK.
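For reference, the measurement part is only a few lines. This is the minimal sketch I'd use to quantify what I'm eyeballing, assuming a hypothetical `daily_closes.csv` with a `close` column (scipy's `kurtosis` already reports excess kurtosis, so anything well above 0 is the fat tails I'm describing):

```python
# Minimal sketch: skew and excess kurtosis of daily log returns.
# "daily_closes.csv" with a "close" column is a hypothetical input file.
import numpy as np
import pandas as pd
from scipy.stats import skew, kurtosis

closes = pd.read_csv("daily_closes.csv")["close"].to_numpy()
log_returns = np.diff(np.log(closes))

print(f"skew:            {skew(log_returns):.2f}")      # < 0 -> left tail heavier
print(f"excess kurtosis: {kurtosis(log_returns):.2f}")  # > 0 -> fatter tails than normal
```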
So I know someone else has put this together before. I just don't know who.
And I'm in the middle of reading Fooled by Randomness. It's interesting, but I can't say I'm the biggest fan of his writing style. I was planning on finishing this book before deciding if I wanted to read any of his other works.
It's an interesting subject.
One factor is that the black population tends to be younger, which has an impact.
Thomas Sowell went into this in more depth in his book, "Discrimination and Disparities."
The book changed my perspective from thinking that our institutions were riddled with so much racial bias and corruption as to be almost useless to something more tempered.
There's definitely discrimination at play, but for a real, more permanent, more effective, and faster solution, we will need to look deeper. There are other factors that may make things appear worse than they are, and those factors should inform the solution.
For instance, black-owned banks are actually less likely to make loans to black borrowers, and that holds in both single-factor and multi-factor analyses.
The point is to win the fight against racism, not to fight people or companies that simply happen to look racist when, in fact, they may be applying policies that have nothing to do with racism.
Because the situation is not always as it seems at first glance.
Send me a message if you'd like to talk about this, because my family and I have been discussing it, back and forth, for some time.
Discrimination and Disparities https://www.amazon.com/dp/1541645634/ref=cm_sw_r_cp_api_i_4kCuFb699QVY1
Her entire argument is trashed by a much more intelligent man than her. Thomas Sowell is the boss when it comes to these topics.
Quote him and watch them call him a racist. Then drop a picture of him. 😂
It's a short book that does a good job of showing how people manipulate the presentation of data to achieve their goals. If nothing else, it will raise your awareness of the issue and perhaps make you more skeptical of what you see and read.
I went looking for the book you recommended, as I will be doing a lot of statistics as a research student starting this fall. I found How to Lie With Statistics by Darrell Huff and Statistics Done Wrong by Alex Reinhart.
Could you clarify which one you're referring to? I would be interested in reading a copy while we're still in the summer months.
I think every nuggets fan should read this book