There's a well-known phenomenon whereby people who have taken a position on an issue will, when questioned, entrench their views and interpret evidence in such a way as to favour that position. They can also be gradually led into supporting more and more extreme views, a familiar effect of some recommendation engines. But what if we misled them about the position they actually took, nudging them in a more moderate direction? In this study, "By making people believe that they wrote down different responses moments earlier, we were able to make them endorse and express less polarized political views." That sounds great, but is it ethical? Via Futurity.