By Dmitri Bondarenko and Ken Baskin in History and Cultural History

When, in the late 1970s, desktop computers suddenly made it simple to solve non-linear equations, scientists in fields from fluid dynamics to ecosystem studies began modelling their subjects with them. This was an earth-shaking development. Earlier, scientists could only model their subjects with relatively simpler, quicker-to-solve linear equations. With linear equations, 'the sum of two solutions is again a solution'.1 A small cause will create a small effect, and a large cause, a large effect. They were the kinds of equations that Isaac Newton used to perfect his physics. Combined with René Descartes' philosophy, Newton's physics created a world view in which independent objects interacted according to a set of 'Universal Laws of Nature' in linear processes of cause and effect. Johannes Kepler had called the resulting world a 'Clockwork Universe'.2

However successful the resulting scientific paradigm proved to be, most of life is non-linear. After all, it took only a few small shifts in the genes of a virus in Chinese fowl sometime around 1917 to cause the influenza pandemic that ravaged America and Europe at the end of World War I. Small causes can have enormous effects. But non-linear equations were much more time-consuming to solve. With the desktop computer revolution of the late 1970s, it suddenly became practical for scientists to model their subjects with non-linear equations, in which small causes can have large effects. The scientists using them quickly made two discoveries. First, non-linear equations created a more accurate picture of how things in the world behaved.

Big History emerged as part of this non-linear way of understanding the world. What we discovered in working together is that the insights of complexity theory, which studies the
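A toy illustration, not taken from the article: the logistic map is perhaps the simplest non-linear equation showing how a tiny difference in starting conditions grows into a large difference in outcome. The starting value, growth rate, and step count below are arbitrary choices for demonstration.

```python
# Sensitive dependence in the logistic map x_{n+1} = r * x_n * (1 - x_n):
# two trajectories that start a billionth apart end up macroscopically far.

def trajectory(x0, r=4.0, steps=50):
    """Return the full orbit of the logistic map starting from x0."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

orbit_a = trajectory(0.2)
orbit_b = trajectory(0.2 + 1e-9)  # the "small cause": a perturbation of one part in a billion
max_gap = max(abs(a - b) for a, b in zip(orbit_a, orbit_b))
print(f"largest gap between the two orbits: {max_gap:.3f}")
```

With a linear map, a perturbation of 1e-9 would stay proportionally small forever; here the gap between the orbits grows roughly exponentially until it is of order one, which is the "small cause, enormous effect" behaviour the article describes.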
Author Archive for Alessandro Cerboni
In cognitive science, the rational analysis framework allows modelling of how physical and social environments impose information-processing demands onto cognitive systems. In humans, for example, past social contact among individuals predicts their future contact with linear and power functions. These features of the human environment constrain the optimal way to remember information and probably shape how memory records are retained and retrieved. We offer a primer on how biologists can apply rational analysis to study animal behaviour. Using chimpanzees (Pan troglodytes) as a case study, we modelled 19 years of observational data on their social contact patterns. Much like humans, the frequency of past encounters in chimpanzees linearly predicted future encounters, and the recency of past encounters predicted future encounters with a power function. Consistent with the rational analyses carried out for human memory, these findings suggest that chimpanzee memory performance should reflect those environmental regularities. In re-analysing existing chimpanzee memory data, we found that chimpanzee memory patterns mirrored their social contact patterns. Our findings hint that human and chimpanzee memory systems may have evolved to solve similar information-processing problems. Overall, rational analysis offers novel theoretical and methodological avenues for the comparative study of cognition.
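A minimal sketch of the kind of analysis the abstract describes, on invented data: a power-function relationship between recency and re-encounter probability is linear in log-log space, so its exponent can be recovered by ordinary least squares on the logs. The numbers and the exponent 0.8 are assumptions for illustration, not values from the study.

```python
import math

# Hypothetical recency data: probability of re-encountering a partner
# last seen `d` days ago, following a power function p = c * d^(-alpha).
days = [1, 2, 4, 8, 16, 32]
probs = [0.50 * d ** -0.8 for d in days]  # synthetic, true alpha = 0.8

# log p = log c - alpha * log d, so a least-squares line through the
# log-log points has slope -alpha.
xs = [math.log(d) for d in days]
ys = [math.log(p) for p in probs]
n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
alpha_hat = -slope
print(f"estimated power-function exponent: {alpha_hat:.2f}")
```

On real observational data the points would scatter around the line, and the fitted exponent would summarize how steeply contact probability decays with recency.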
Abstract Despite the clear fitness consequences of animal decisions, the science of animal decision making in evolutionary biology is underdeveloped compared to decision science in human psychology. Specifically, the field lacks a conceptual framework that defines and describes the relevant components of a decision, leading to imprecise language and concepts. The judgment and decision making (
What do predictive analytics and behavioral economics have in common? Quite a bit, as it turns out.
Two overdue sciences

Near the end of Thinking, Fast and Slow, Daniel Kahneman wrote, “Whatever else it produces, an organization is a factory that produces judgments and decisions.”2 Judgments and decisions are at the core of two of the most significant intellectual trends of our time, and the one-word titles of their most successful popularizations have become their taglines. “Moneyball” is shorthand for applying data analytics to make more economically efficient decisions in business, health care, the public sector, and beyond. “Nudge” connotes the application of findings from psychology and behavioral economics to prompt people to make decisions that are consistent with their long-term goals. Other than the connection with decisions, the two domains might seem to have little in common. After all, analytics is typically discussed in terms of computer technology, machine learning algorithms, and (of course) big data. Behavioral nudges, on the other hand, concern human psychology. What do they have in common? When the ultimate goal is behavior change, predictive analytics and the science of behavioral nudges can serve as two parts of a greater, more effective whole.
Executive cognitive functions like working memory determine the success or failure of a wide variety of different cognitive tasks. Estimation of constructs like working memory load or memory capacity from neurophysiological or psychophysiological signals would enable adaptive systems to respond to cognitive states experienced by an operator and trigger responses designed to support task performance (e.g. by simplifying the exercises of a tutor system, or by shutting down distractions from the mobile phone). The determination of cognitive states like working memory load is also useful for automated testing/assessment, for usability evaluation and for tutoring applications. While there exists a huge body of research work on neural and physiological correlates of cognitive functions like working memory activity, fewer publications deal with the application of this research with respect to single-trial detection and real-time estimation of cognitive functions in complex, realistic scenarios. Single-trial classifiers based on brain activity measurements such as EEG, fNIRS or physiological signals such as EDA, ECG, BVP or Eyetracking have the potential to classify affective or cognitive states based upon short segments of data. For this purpose, signal processing and machine learning techniques need to be developed and transferred to real-world user interfaces. In this research topic, we are looking for: (1) studies in complex, realistic scenarios that specifically deal with
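To make the single-trial classification idea concrete, here is an illustrative sketch on synthetic data. The "bandpower" features (frontal theta rising and parietal alpha falling under high workload) are common findings in the workload literature but the effect sizes here are invented, and the nearest-centroid classifier stands in for whatever signal-processing and learning pipeline a real system would use.

```python
import numpy as np

# Synthetic stand-in for preprocessed EEG features: one row per trial,
# columns = [frontal theta power, parietal alpha power]. All distribution
# parameters below are assumptions chosen purely for illustration.
rng = np.random.default_rng(0)

def simulate_trials(n, high_load):
    theta = rng.normal(3.0 if high_load else 2.0, 0.4, size=n)  # theta up under load
    alpha = rng.normal(1.5 if high_load else 2.5, 0.4, size=n)  # alpha down under load
    return np.column_stack([theta, alpha])

X_train = np.vstack([simulate_trials(100, False), simulate_trials(100, True)])
y_train = np.array([0] * 100 + [1] * 100)
X_test = np.vstack([simulate_trials(50, False), simulate_trials(50, True)])
y_test = np.array([0] * 50 + [1] * 50)

# Nearest-centroid classifier: label each test trial with the class whose
# training-set mean feature vector is closest.
centroids = np.stack([X_train[y_train == k].mean(axis=0) for k in (0, 1)])
dists = np.linalg.norm(X_test[:, None, :] - centroids[None, :, :], axis=2)
pred = dists.argmin(axis=1)
accuracy = (pred == y_test).mean()
print(f"single-trial accuracy on synthetic data: {accuracy:.2f}")
```

A deployed system would replace the simulated features with band-power estimates from short EEG segments, and the classifier's per-trial output is what would drive the adaptive responses described above (simplifying a tutor's exercises, suppressing notifications).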
It might sound strange, but unusual cues that actually distract you can make top reminders for that to-do list. When trying to save more money or pay off a debt, a carefully thought-out plan can really improve personal finances. But it all takes time – and it can be so easy to forget the detail, or otherwise not follow through with a strategy. There’s no foolproof way to remember everything, even when it’s to do with money; memories fade and present bias keeps the focus on the here and now. However, research by Todd Rogers at Harvard and Katherine Milkman of the University of Pennsylvania reveals that certain types of reminder work better than others – sometimes improving the ability to remember tasks on schedule by as much as 32%. Matching the right cues Rogers and Milkman found that making up unrelated, striking cues and matching them up with tasks we need to be reminded about can help. In a first small experiment, the team aimed to get people to remember a promise to donate to charity.
Think of Halloween and scares are probably front of mind. Trick or treat. Ghouls and carved pumpkins. In share markets, it appears investor behaviour can also be skewed by scares that accompany dark nights and downbeat emotions associated with the end of summer in the Northern Hemisphere. The phrases “sell in May” (ideally, when prices are high) and “buy on Halloween” (if prices are in a dip) are well known. This “Halloween indicator” will not work every year but research suggests there is some evidence to back it up, on average. It is one of a range of calendar effects that some believe exist in financial markets.
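The bookkeeping behind the Halloween indicator is just a seasonal split of average returns. A sketch on invented monthly figures (real tests would use long histories of index data, which these numbers are not):

```python
# Hypothetical average monthly returns, keyed by month number.
# Values are invented solely to show the Nov-Apr vs May-Oct comparison.
monthly_returns = {
    1: 0.012, 2: 0.008, 3: 0.011, 4: 0.009,
    5: -0.002, 6: 0.001, 7: 0.003, 8: -0.004,
    9: -0.006, 10: 0.002, 11: 0.013, 12: 0.010,
}

winter = [monthly_returns[m] for m in (11, 12, 1, 2, 3, 4)]  # "buy on Halloween" window
summer = [monthly_returns[m] for m in (5, 6, 7, 8, 9, 10)]   # "sell in May" window

winter_avg = sum(winter) / len(winter)
summer_avg = sum(summer) / len(summer)
print(f"Nov-Apr avg: {winter_avg:.4f}, May-Oct avg: {summer_avg:.4f}")
```

The indicator's claim is simply that, averaged over many years, the first number tends to exceed the second; any single year can easily go the other way.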