axbom’s Twitter Archive—№ 28,635

                                              1. "What does the research say?" When people repeatedly ask this it seems they are asking "What are the facts?". But shouldn't they also ask: * Who paid for the research? * Who performed the research? * Who was part of the research? * Has it been replicated? 🤔
                                            1. …in reply to @axbom
                                              In 2010 the term "replication crisis" was coined when scientists kept finding that the results of many scientific studies are difficult or impossible to replicate/reproduce on subsequent investigation, either by independent researchers or by the original researchers themselves.
                                          1. …in reply to @axbom
                                            Here's the Wikipedia article on that term, replication crisis. en.wikipedia.org/wiki/Replication_crisis
                                        1. …in reply to @axbom
                                          Here's an interesting article from 2015 regarding this phenomenon: Scientists Replicated 100 Psychology Studies, and Fewer Than Half Got the Same Results smithsonianmag.com/science-nature/scientists-replicated-100-psychology-studies-and-fewer-half-got-same-results-180956426/
                                      1. …in reply to @axbom
                                        Although all scientific fields are under scrutiny, psychology is at the center of the controversy, and social psychology in particular.
                                    1. …in reply to @axbom
                                      I find this interesting because I attend conferences where these popularised social psychology experiments are at the heart of the message from many speakers offering advice on how to design for humans.
                                  1. …in reply to @axbom
                                    My fear is that designers place too much trust in the work of others, essentially drawing conclusions about how to design based on studies that cannot be replicated - thus turning the design itself into a huge [unethical] social experiment... That's all for now. :)
                                1. …in reply to @axbom
                                  The oft-quoted “nudge” example of making hotel plates smaller to make people consume less food is dangerous because: 1) It assumes all people should consume less. 2) It does not follow R_Thaler’s rule of transparency. 3) It is also wrong. theconversation.com/do-smaller-plates-make-you-eat-less-no-74181
                              1. …in reply to @axbom
                                It turns out all the studies claiming small plates lead to less consumption come from the same research group, one that happens to be under scrutiny for self-plagiarism and data misrepresentation. theguardian.com/science/head-quarters/2017/mar/02/fresh-concerns-raised-over-academic-conduct-of-major-us-nutrition-and-behaviour-lab
                            1. …in reply to @axbom
                              I am, however, confident that for many years to come I will witness speakers on stage claiming that smaller plates cause lower consumption, and then using that as evidence that some unrelated design solution is the way to go.
                          1. …in reply to @axbom
                            Another important article, this time on medical science: Evidence-Based Lies. medium.com/@BlakeGossard/evidence-based-lies-1ec8db16cc8a?source=linkShare-77d3f63ebe80-1521827685
                        1. …in reply to @axbom
                          Noteworthy quote from this article. “News headlines abound proclaiming that new “data” support this or that position. In many cases, these “data” are derived from surveys, which are highly susceptible to basically every bias known to science.”
                      1. …in reply to @axbom
                        Another one for the mix, on the lack of insights about cross-cultural psychology and on the dangers of assuming human traits are global and ageless. ht letterpress_se theconversation.com/how-knowledge-about-different-cultures-is-shaking-the-foundations-of-psychology-92696?utm_campaign=Echobox&utm_medium=Social&utm_source=Facebook#link_time=1520615158
                    1. …in reply to @axbom
                      The problem with Science. Good podcast from guardianscience weekly. overcast.fm/+F8mLIo2DM
                  1. …in reply to @axbom
                    Another must-listen podcast: "A Neuroscientist Explains: psychology’s replication crisis" overcast.fm/+MFGeMgJbg
                1. …in reply to @axbom
                  It’s safe to say that the field of psychology has some cleaning up to do: “Professor Philip Zimbardo used the media to turn his Stanford Prison Experiment into the best-known psychology study of all time. It was a sham.” medium.com/@benzblum/the-lifespan-of-a-lie-d869212b1f62?source=linkShare-77d3f63ebe80-1529466849
              1. …in reply to @axbom
                Another listening recommendation. “The marshmallow test is one of the most well-known studies in all of psychology, but a new replication suggests we've been learning the wrong lesson from its findings for decades.” overcast.fm/+CuhumMpu4
            1. …in reply to @axbom
              Please listen to this episode of Radiolab with a walkthrough of the famous Milgram experiments. In this case it's not about poorly performed experiments but about many years of misinterpretation (over-simplification). overcast.fm/+J5QRFk ht jocke
          1. …in reply to @axbom
            Boom. 800 scientists call for scrapping the concept of 'statistical significance'. nature.com/articles/d41586-019-00857-9 via cjforms
        1. …in reply to @axbom
          It appears in sociology, criminology and psychology textbooks, and in the media as a truism – but what if we take a closer look at the research around "the bystander effect"? aeon.co/essays/it-looks-like-human-beings-might-be-good-samaritans-after-all?utm_source=Aeon+Newsletter&utm_campaign=9cf68716e1-EMAIL_CAMPAIGN_2019_03_28_12_17&utm_medium=email&utm_term=0_411a82e59d-9cf68716e1-70597045
      1. …in reply to @axbom
        The replication crisis is good for science. We just need to acknowledge it and talk about it more. theconversation.com/the-replication-crisis-is-good-for-science-103736
    1. …in reply to @axbom
      Bystander Effect debunked lisaehlin/1147818230687420416?s=21
  1. …in reply to @axbom
    More on the weaknesses of the bystander effect (paywall): wsj.com/articles/bystanders-who-intervene-in-an-attack-11563464806
        1. …in reply to @axbom
          Backfire effect. That's when people's opinions are contradicted by facts, but the opinion doesn't change - instead it only grows stronger. But this review suggests the backfire effect is in fact quite rare, and not the norm. fullfact.org/blog/2019/mar/does-backfire-effect-exist/
          1. …in reply to @axbom
            If you clench a pencil between your teeth, this will force a smile that makes you feel more positive emotions. Hmmm. But will it really? A registered replication report (RRR) appears to debunk this common notion from social psychology. theeconomyofmeaning.com/2016/08/20/famous-psychology-study-killed-by-replication-does-a-pencil-in-your-mouth-make-you-feel-happy/
              1. …in reply to @axbom
                People today set goals to walk 10,000 steps because a Japanese watch company in the 1960s made the wearable step-counter manpo-kei, which translates as “10,000-step meter”. The number 10,000? It just “felt” good. And now it feels good to Fitbit et al. theguardian.com/lifeandstyle/2018/sep/03/watch-your-step-why-the-10000-daily-goal-is-built-on-bad-science?CMP=Share_iOSApp_Other
                1. …in reply to @axbom
                  More debunked behavioral truths. Under scrutiny: Loss Aversion. "The popular idea that avoiding losses is a bigger motivator than achieving gains is not supported by the evidence" via JesperBylund blogs.scientificamerican.com/observations/why-the-most-important-idea-in-behavioral-decision-making-is-a-fallacy/
                  1. …in reply to @axbom
                    A paper mill is a company that fabricates scientific papers on demand. They sell these to people who need to have a scientific paper published in an international journal. MicrobiomDigest and team found >400 papers seemingly part of a mill. scienceintegritydigest.com/2020/02/21/the-tadpole-paper-mill/ ht beantin
                    1. …in reply to @axbom
                      'Stockholm syndrome' is a misogynistic invention by a psychiatrist who found his authority questioned, which was then popularised by the media. As explained by jessradio in her book about domestic abuse: 'See What You Made Me Do'. sezmohammed/1252500993972948992?s=20 t.co/DKyYrXdG1g
                      1. …in reply to @axbom
                        This tweet was detached from the thread. Adding it here so you don't miss it. axbom/1237099412515684352?s=21
                        1. …in reply to @axbom
                          Interesting and relevant recent interview with rcbregman, author of Humankind - on mehdirhasan's Deconstructed. They talk about some of the problematic experiments mentioned earlier in this thread (Stanford Prison, Milgram) and more. overcast.fm/+dcvDvcqYc
                          1. …in reply to @axbom
                            Also mentioned in the Deconstructed podcast is how the fictional(!) Golding book Lord of the Flies has contributed to ideas of inevitable human destructive behavior. This story shows another, real-life, outcome: theguardian.com/books/2020/may/09/the-real-lord-of-the-flies-what-happened-when-six-boys-were-shipwrecked-for-15-months
                            1. …in reply to @axbom
                              “Studies in top science, psychology and economics journals that fail to hold up when others repeat them are cited, on average, more than 100 times as often in follow-up papers than work that stands the test of time.” theguardian.com/science/2021/may/21/research-findings-that-are-probably-wrong-cited-far-more-than-robust-ones-study-finds?CMP=Share_iOSApp_Other
                              1. …in reply to @axbom
                                Keeping an eye on machine learning papers. random_walker/1422222355296829441
                                1. …in reply to @axbom
                                  Sometimes replication is impossible because the original study is based on fake data. This debunking of Dan Ariely studies (yes, plural - read the full article) is worth your time. buzzfeednews.com/article/stephaniemlee/dan-ariely-honesty-study-retraction
                                  1. …in reply to @axbom
                                    "Something is wrong with science that’s causing established results to fail. One proposed and long overdue remedy has been an overhaul of the use of statistics." m.nautil.us/issue/74/networks/the-flawed-reasoning-behind-the-replication-crisis
                                    1. …in reply to @axbom
                                      This long-read by aubreyclayton on how statistics and eugenics are intertwined is the perfect addition to this multi-year thread on misleading science. Highly recommended. via mer__edith m.nautil.us/issue/92/frontiers/how-eugenics-shaped-statistics
                                      1. …in reply to @axbom
                                        "An eight-year-long study reveals that only about half of early-stage cancer experiments are able to produce the same results as the initial experiment." smithsonianmag.com/smart-news/its-difficult-to-replicate-cancer-research-a-new-report-details-180979187/