> [!NOTE] Nassim Taleb
> There are limitations of human knowledge when working with the unknown. Because our minds need to reduce information, we are more likely to try to eliminate the unknown. Due to our detection of false patterns, what is random will appear less random and more certain - our overactive brains are more likely to impose the wrong, simplistic narrative than no narrative at all. We scorn the abstract and invisible in favor of the tangible and vivid. The mind was not designed to deal with complexity and nonlinear uncertainties. Neither is science capable of dealing effectively with interdependent systems (climate, economic life, the human body), in spite of its hyped-up successes in the linear domain (physics and engineering), which give it a prestige that has endangered us. Counter to the common discourse, more information means more delusions: our detection of false patterns grows faster as a side effect of modernity and the information age. Our mental architecture is at an increasing mismatch with the world in which we live. This leads to trouble: when the map does not correspond to the territory, there is a certain category of fool who, with an ability to discount what they did not see - the unobserved - imagines the territory as fitting their map.
>
> We are robust when errors in the representation of the unknown and understanding of random effects do not lead to adverse outcomes - fragile otherwise.