Obtaining Tsallis entropy at the onset of chaos


Tsallis entropy aims to extend traditional statistical mechanics, but some physicists believe the theory is incompatible with the fundamental principles of thermodynamics.

Dr Alberto Robledo, however, shows for the first time how Tsallis entropy can explain natural phenomena linked, surprisingly, to the transitions from regular to chaotic behaviours – a result that had eluded researchers until now. His discovery could lead to a deeper understanding of how thermodynamic systems behave.

Read more in Research Outreach

Read the original research: doi.org/10.3390/e24121761


Image Source: DepositPhotos / Issaro.now2




Hello and welcome to Research Pod! Thank you for listening and joining us today.


In this episode, we look at the work of Dr Alberto Robledo of Instituto de Física, Universidad Nacional Autónoma de México, who aims to extend traditional statistical mechanics approaches to hitherto unexplained natural phenomena. Robledo reveals for the first time how a concept in physics, called Tsallis entropy, can explain many important phenomena in nature.


Statistical mechanics is a branch of physics that accomplishes the feat of understanding and predicting how thermodynamic systems with large numbers of particles evolve over time. Instead of tracking the motion of every atom and molecule, it treats the system as a collection of possible microstates or configurations, each with a certain probability of occurring. By analysing these probabilities, physicists can make predictions about different properties of the system, including its temperature, pressure, and entropy – a measure of its randomness and disorder.


Statistical mechanics has been used extensively and with success within physics for well over a century. But in more recent decades, scientists have started to employ its tools and techniques in various fields outside physics, such as biology, ecology, economics, and the social sciences. This is nowadays called the science of complex systems, where the role of particles is played by other basic entities or individuals. Parallel to these developments, researchers began to critically examine the foundations of this branch of physics, first laid out in the 19th century and largely unchanged since. Uncovering a limit of validity for traditional statistical mechanics, together with modifications that hold beyond that limit, might open new fields of applicability.


In the late 1980s, Brazilian physicist Constantino Tsallis presented a modified form of the mathematical expression used to calculate a system’s entropy from the probabilities of its possible states. The modified expression contains an additional adjustable parameter, and when this parameter is given the value of unity, the traditional formula is recovered. Many features of the ordinary theory survive in the extended version, but its range of validity could only be guessed at – candidates include highly correlated systems, systems with long-lived memory, and long-range interactions.
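For readers who want to see this concretely, here is a minimal Python sketch of the Tsallis entropy of a discrete probability distribution, S_q = (1 − Σᵢ pᵢ^q)/(q − 1); the function name and the example probabilities are our own illustrative choices, not part of the original research. As the parameter q approaches unity, the formula reduces to the familiar Boltzmann–Gibbs–Shannon entropy S = −Σᵢ pᵢ ln pᵢ.

```python
import math

def tsallis_entropy(probs, q):
    """Tsallis entropy S_q = (1 - sum(p_i**q)) / (q - 1).
    At q = 1 the expression reduces to the Boltzmann-Gibbs-Shannon
    entropy S = -sum(p_i * ln(p_i)), handled here as a special case."""
    if q == 1:
        return -sum(p * math.log(p) for p in probs if p > 0)
    return (1.0 - sum(p ** q for p in probs)) / (q - 1.0)

# Four equally likely states: the traditional entropy is ln(4)
probs = [0.25, 0.25, 0.25, 0.25]
print(tsallis_entropy(probs, 1.0))     # ln(4), about 1.3863
print(tsallis_entropy(probs, 1.0001))  # approaches ln(4) as q -> 1
print(tsallis_entropy(probs, 2.0))     # 1 - sum(p**2) = 0.75
```

The q = 1.0001 call shows the smooth recovery of the traditional formula as the extra parameter is tuned towards unity.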


Despite this, exploration of the properties derived from Tsallis entropy led to some difficulties and controversies, with some physicists questioning its validity altogether. This disagreement has sparked decades of debate, and some researchers even argue that the theory is incompatible with the fundamental principles of thermodynamics.


One particularly intriguing aspect of this debate is the question of how systems transition into a state of chaos. When systems become chaotic, even minuscule changes to their starting conditions can lead to drastically different outcomes over time, making it incredibly difficult to predict their behaviours in the long term. Nonetheless, and precisely because of this, fully chaotic regimes are compatible with traditional statistical mechanics – but the exact boundary at which a system transitions from regular to chaotic behaviour is not.
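This sensitivity to initial conditions is easy to demonstrate numerically. The sketch below – an illustrative toy, not part of Robledo’s study – iterates the fully chaotic logistic map x_{t+1} = 4·x_t·(1 − x_t) from two starting points that differ only in the tenth decimal place, and tracks how far apart the two orbits drift:

```python
def logistic_orbit(x0, r=4.0, n=60):
    """Iterate the logistic map x_{t+1} = r * x_t * (1 - x_t)
    and return the whole orbit as a list."""
    xs = [x0]
    for _ in range(n):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

a = logistic_orbit(0.2)
b = logistic_orbit(0.2 + 1e-10)  # perturb the tenth decimal place
gap = [abs(x - y) for x, y in zip(a, b)]
# The initially tiny separation is amplified at every step until the
# two orbits bear no resemblance to one another.
print(gap[0], max(gap))
```

At r = 4 the separation roughly doubles on average at each step, so after a few dozen iterations the two trajectories are completely decorrelated.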


Transitions into chaos underlie many important phenomena in nature, resulting from nonlinearity, where the response of a system isn’t directly proportional to its inputs. As the system’s components interact with each other, complex patterns – including spirals, fractals, and even social structures – can emerge spontaneously. Yet despite their relevance, these transitions had never been faithfully described by the Tsallis entropy expression.


Through their research, Robledo and his colleagues explore these transitions to chaos in systems named ‘low-dimensional nonlinear iterated maps’. These maps describe the evolution of systems governed by relatively small numbers of variables – typically only one, in the iconic logistic and circle maps – over discrete steps in time. In doing so, the researchers ultimately aim to prove that Tsallis entropy is indeed the correct expression to replace that of traditional statistical mechanics at these transitions.
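To show what ‘iterating a map’ means in practice, here is a minimal Python sketch (the function name is our own) that steps the logistic map x_{t+1} = r·x_t·(1 − x_t) forward in discrete time. At the parameter value r = 2 the behaviour is perfectly regular: the orbit settles onto the fixed point x* = 1 − 1/r = 0.5.

```python
def logistic_map(x0, r, n_steps):
    """Iterate the logistic map x_{t+1} = r * x_t * (1 - x_t)
    for n_steps discrete time steps, starting from x0."""
    xs = [x0]
    for _ in range(n_steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

# Regular (non-chaotic) regime: the orbit converges to x* = 0.5
orbit = logistic_map(0.2, 2.0, 50)
print(orbit[-1])  # approximately 0.5
```

Varying the single control parameter r takes this one-variable system all the way from such regular behaviour to fully developed chaos, which is what makes these maps so useful as a testing ground.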


Transitions into – or, indeed, out of – chaos in nonlinear maps are particularly interesting as they involve the breakdown of two key properties of statistical mechanics that hold when the behaviour is fully chaotic. The first of these is called ‘ergodicity’: the property whereby a system eventually visits all of its accessible states over long periods of time, with the amount of time spent in each state proportional to its probability. In nonlinear maps, ergodicity can break down in two possible ways.


The first is through ‘periodic orbits’, where a system revisits a specific set of states in a repeating pattern, preventing it from visiting all its possible states. As the map’s control parameter is increased, a stable fixed point can branch into a stable ‘2-cycle’ – two distinct states that the system visits alternately. Each of these states can then branch in turn, creating a stable ‘4-cycle’, and the doublings continue in a cascade of branching states as the system approaches chaos. This route to chaos is known as ‘period doubling’. A second, even more complex example of ergodicity breakdown is called ‘quasi-periodicity’, where the states of an evolving system never exactly repeat, but instead follow intricate, non-repeating patterns.
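The period-doubling cascade can be observed directly in the logistic map. The sketch below (function name, starting point, and tolerances are our own illustrative choices) discards a long transient and then counts how many distinct values the orbit keeps revisiting – the cycle length – at two control-parameter values on either side of a doubling:

```python
def attractor_cycle(r, x0=0.2, transient=2000, sample=64, tol=1e-6):
    """Iterate the logistic map x -> r*x*(1-x) past a long transient,
    then count the distinct values the orbit keeps revisiting
    (i.e. the length of the stable cycle)."""
    x = x0
    for _ in range(transient):
        x = r * x * (1.0 - x)
    points = []
    for _ in range(sample):
        x = r * x * (1.0 - x)
        points.append(x)
    distinct = []
    for p in points:
        if all(abs(p - d) > tol for d in distinct):
            distinct.append(p)
    return len(distinct)

print(attractor_cycle(3.2))  # 2: a stable 2-cycle
print(attractor_cycle(3.5))  # 4: a stable 4-cycle after one more doubling
```

Pushing r further produces 8-cycles, 16-cycles, and so on, with the doublings accumulating at the celebrated onset of chaos near r ≈ 3.5699.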


The second aspect of statistical mechanics to break down in nonlinear maps undergoing chaotic transitions is named ‘mixing’. In traditional statistical mechanics, it describes how nearby points in a system will spread out and become increasingly uncorrelated as time progresses. In nonlinear maps, a breakdown in this property can lead to a behaviour named ‘intermittency’. Here, systems alternate between chaotic bursts and periods of regular behaviour – during which nearby points can remain close together.
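A standard way to illustrate intermittency – a textbook toy model, not Robledo’s specific calculation – is the tangent-bifurcation normal form x_{n+1} = x_n + x_n² + ε. For small ε the orbit crawls through a narrow channel near x = 0, producing a long stretch of nearly regular (‘laminar’) behaviour before escaping in a burst; the smaller ε, the longer the laminar phase:

```python
def laminar_length(eps, x_start=-0.5, x_exit=0.5):
    """Iterate the near-tangency map x_{n+1} = x_n + x_n**2 + eps
    and count the steps spent crossing the narrow channel near x = 0.
    The orbit is strictly increasing (x**2 + eps > 0), so it always
    escapes; escape just takes longer as eps shrinks."""
    x, steps = x_start, 0
    while x < x_exit:
        x = x + x * x + eps
        steps += 1
    return steps

print(laminar_length(1e-2))  # a few dozen steps in the channel
print(laminar_length(1e-4))  # hundreds of steps: laminar phases lengthen
```

In a full intermittent map this slow channel passage alternates with chaotic bursts that reinject the orbit back near the channel, which is exactly the mixing breakdown described above.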


For Robledo and colleagues, these interesting behaviours make nonlinear maps an ideal platform for examining the properties of the transitions to chaos – and, more importantly, for assessing whether they can be accurately described using the Tsallis entropy formula. As they explored these systems in more detail, the researchers were surprised to find that the transitions themselves had received little detailed attention, as earlier research had shifted its focus to other types of nonlinear systems.


Through a series of recent studies, Robledo and colleagues aimed to unearth the novel behaviours of systems placed under these circumstances and discover the best conceivable way to describe their evolution in mathematical terms.


In his latest research, Robledo presents a key result of this work: transitions to chaos in nonlinear maps are best described by a modified form of a formula known as the ‘Landau equation’. Derived from the principles of statistical mechanics, this equation is often used by physicists to describe how phase transitions unfold in time – like the condensation of a gas into a liquid.


The researchers examined the ‘Lyapunov function’ of the Landau equation – a concept often used to study the stability of dynamic systems. By assigning a real value to each possible state of the system, the function measures its ‘distance’ from a point of equilibrium, at which the system is stable and no longer evolves over time – so the greater the distance, the further the system is from its stable state. Through these calculations, the researchers discovered that the Lyapunov function of the Landau equation is simply an expression of the Tsallis entropy formula. Crucially, the equation can be used to express period doubling, quasi-periodicity, and intermittency – all three known types of transitions into chaos in nonlinear maps.
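As a rough illustration of what a Lyapunov function does – using the generic, textbook Landau equation dφ/dt = −aφ − bφ³, not Robledo’s modified version or his Tsallis-entropy result – the sketch below integrates the equation with a simple Euler scheme and records the Landau potential V(φ) = aφ²/2 + bφ⁴/4 along the trajectory. V only ever decreases as the system relaxes towards equilibrium, which is precisely the defining property of a Lyapunov function:

```python
def landau_rhs(phi, a, b):
    """Right-hand side of the generic Landau equation
    d(phi)/dt = -a*phi - b*phi**3, a gradient flow of the
    potential V(phi) = a*phi**2/2 + b*phi**4/4."""
    return -a * phi - b * phi ** 3

def potential(phi, a, b):
    """Landau potential, which serves as the Lyapunov function."""
    return a * phi ** 2 / 2.0 + b * phi ** 4 / 4.0

# Euler-integrate and record V along the trajectory
a, b, dt = 1.0, 1.0, 0.01
phi = 1.5
vs = []
for _ in range(1000):
    vs.append(potential(phi, a, b))
    phi += dt * landau_rhs(phi, a, b)

# V decreases monotonically towards its value at equilibrium (phi = 0)
print(vs[0], vs[-1])
```

Robledo’s finding is that, at the transitions to chaos, the role played here by the Landau potential is taken over by an expression of the Tsallis entropy formula.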


Robledo’s discovery could have important implications for the ongoing debate over the validity of Tsallis entropy. By proving that the questioned entropy formula governs the transitions from regular to chaotic behaviours, his finding sets a direction for a firm understanding of this issue. In turn, Robledo presents a compelling case that Tsallis entropy is consistent with the fundamental principles of thermodynamics.


Robledo and colleagues’ research has already established links between the onset of chaos and important questions and examples in condensed-matter physics and in complex-systems phenomena. They ultimately hope that Tsallis’s theories may become more broadly accepted within the wider scientific community. In turn, these discoveries may pave the way for new breakthroughs in a wide range of fields within physics and beyond – including biology, ecology, computer science, and economics – where statistical mechanics is increasingly applied.


That’s all for this episode – thanks for listening, and stay subscribed to Research Pod for more of the latest science.


See you again soon.

