
Coupled entropy

From Wikipedia, the free encyclopedia

Coupled entropy is a generalized measure of uncertainty that extends the classical Boltzmann–Gibbs–Shannon (BGS) entropy to systems exhibiting non-exponential probability structures.[1][2] It introduces a single nonlinear statistical coupling parameter (κ) that deforms the logarithmic and exponential functions used to quantify uncertainty while preserving the classical form in the limit κ → 0.

Coupled entropy was developed to provide a generalization flexible enough to characterize complex systems with scale-dependent or shape-dependent behavior, while maintaining a clear mathematical and physical interpretation. The framework naturally models non-exponential distributions, including both heavy-tailed and compact-support families, through a unified scale–shape representation.

The measure can be interpreted as isolating the linear component of uncertainty within a nonlinear system, though the precise relationship between total and nonlinear uncertainty remains an active area of research. By basing the generalization on an explicit coupling parameter, the formulation provides a transparent link between entropy, distribution shape, and the underlying sources of nonlinearity in complex systems.

Background


Boltzmann–Gibbs–Shannon (BGS) entropy is the classical measure of uncertainty in probability theory, with Boltzmann introducing the logarithmic form for equiprobable states, Gibbs extending it to general probability distributions, and Shannon establishing its role in information theory. Because it is based on the standard logarithm, BGS entropy naturally describes systems whose probability distributions belong to the exponential family.

However, many complex systems deviate from exponential behavior due to nonlinear interactions, correlations across scales, or structural constraints such as compact support or heavy tails. Earlier generalized entropy frameworks,[3] including the Rényi, Havrda–Charvát, Sharma–Taneja–Mittal, Tsallis, and Hanel–Thurner entropies, introduced deformation parameters, but the physical interpretation of these parameters and their connection to identifiable sources of nonlinearity has remained unclear in many applications.[4]

Coupled entropy frames the generalization on the degree of nonlinear statistical coupling (κ), which quantifies both the source of nonlinearity and the tail shape of the distribution. The coupling defines the deformation of the logarithmic and exponential functions.[1] Two important parameters retained from the exponential family are the power of the variable (α), which determines the shape of the distribution near the location, and the dimension (d) of the random variable. Together with the coupling, these properties determine the entropic index, which can be understood as the number of independent random variables occupying the same state i. This separation makes the role of nonlinearity explicit and mathematically transparent. The location parameter (μ) is unchanged, though it may not be defined by the mean statistic, and the standard deviation is generalized to a scale parameter (σ).

Scale invariance demonstration: σ=5 remains constant while coupling κ increases fluctuations in multiplicative noise process.

This perspective is consistent with the broader interpretation of entropy as a type of statistical average, analogous to how the mean, median, and mode capture different aspects of central tendency. BGS entropy corresponds to evaluating the overall average uncertainty of a distribution. For the exponential distribution with scale σ, the BGS entropy is 1 + ln σ, and for its generalization the Pareto Type II distribution with coupling κ, it is 1 + κ + ln σ. The linear dependence on the nonlinear source of uncertainty, κ, dominates as the coupling grows, motivating the need for complementary measures.

Derived below, the coupled entropy for the Pareto Type II or coupled exponential distribution is 1 + ln_κ(σ); that is, the coupled entropy is 1 plus a generalized logarithm of the scale. This removes the linear dependence on the nonlinear coupling, thereby providing a modified measure of the uncertainty due to the scale. While not exact, an approximate relationship is that the BGS entropy is approximately κ plus the coupled entropy. So the coupling is approximately the nonlinear uncertainty, and the coupled entropy is approximately the linear uncertainty.

Special Cases


Coupled entropy reduces to several well-known entropy forms:

  • Shannon entropy for discrete probability distributions when κ → 0,
  • Boltzmann–Gibbs entropy for physical systems of independent, extensive components.

Mathematical formulation

The informational scale is invariant to distribution shape in coupled exponentials (right), unlike in q-exponentials (left), where the scale depends on the shape parameter.

This section summarizes the mathematical framework presented in Nelson (2025).[1]

Coupled logarithm and exponential


The coupled entropy framework is built on deformed exponential and logarithm functions parameterized by the coupling constant (κ). One realization consistent with the framework is, for x > 0,

ln_κ(x) ≡ (1/κ)(x^(κ/(1+κ)) − 1),

with inverse

exp_κ(x) ≡ (1 + κx)_+^((1+κ)/κ), where (y)_+ ≡ max(0, y).

These functions reduce to the standard exponential and natural logarithm as κ → 0, recovering the classical Boltzmann–Gibbs–Shannon entropy framework.
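As a numerical sanity check, the sketch below implements one deformed pair consistent with this construction, ln_κ(x) = (x^(κ/(1+κ)) − 1)/κ and its inverse exp_κ(x) = (1 + κx)_+^((1+κ)/κ). This convention is assumed for illustration; the source papers fix the exact forms. The code verifies the inverse relationship and the κ → 0 limit.

```python
import math

def coupled_log(x, kappa):
    # Deformed logarithm; reduces to ln(x) as kappa -> 0
    if kappa == 0.0:
        return math.log(x)
    return (x ** (kappa / (1.0 + kappa)) - 1.0) / kappa

def coupled_exp(x, kappa):
    # Deformed exponential, inverse of coupled_log; (.)_+ = max(0, .)
    if kappa == 0.0:
        return math.exp(x)
    base = max(0.0, 1.0 + kappa * x)
    return base ** ((1.0 + kappa) / kappa)

kappa = 0.4
print(coupled_exp(coupled_log(3.0, kappa), kappa))   # ≈ 3.0 (inverse pair)
print(coupled_log(3.0, 1e-8), math.log(3.0))         # both ≈ 1.0986 (classical limit)
```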

For stretched distributions, the argument of the deformed functions is raised to a power α, the stretching exponent, which influences the local distribution shape; the coupled logarithm and exponential generalize accordingly.

Discrete form


For a one-dimensional probability distribution p = (p_1, ..., p_N), the coupled entropy is defined as the coupled logarithm of the inverse of the weighted generalized mean of the probabilities,[2] where the weights are given by the independent-equals (or escort) distribution:

P_i = p_i^(1+κ) / Σ_j p_j^(1+κ).
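The escort construction can be illustrated directly. The sketch below assumes the escort exponent 1 + κ (the exact index in the source framework may also involve α and d); it shows that the re-weighting leaves the distribution unchanged at κ = 0 and sharpens it toward the most probable state for κ > 0.

```python
import numpy as np

def escort(p, kappa):
    # Independent-equals (escort) distribution: p_i^(1+kappa), renormalized.
    # The exponent 1 + kappa is an illustrative assumption for the 1-D case.
    w = np.asarray(p, dtype=float) ** (1.0 + kappa)
    return w / w.sum()

p = np.array([0.5, 0.3, 0.2])
print(escort(p, 0.0))   # unchanged: [0.5 0.3 0.2]
print(escort(p, 1.0))   # sharpened toward the most probable state
```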

Continuous form


For a continuous d-dimensional probability density function f(x), the coupled entropy is defined analogously through the Lebesgue–Stieltjes integral, with the escort density obtained by raising f to the entropic index and renormalizing.

Parameter roles


  • κ: nonlinear statistical coupling parameter quantifying both the source of nonlinearity and the tail shape
  • α: stretching exponent influencing the distribution shape near the location
  • d: system dimension
  • a non-trace power factor used to ensure composability and extensivity

Relationship to other entropy measures

Figure 3. Comparison of generalized entropies shows coupled entropy (green) balances between Tsallis (blue, too cold) and normalized Tsallis (red, too hot).

Coupled entropy can be compared with several established generalized entropy measures, particularly in terms of how each framework incorporates deformation parameters, nonlinearity, or deviations from classical Boltzmann–Gibbs statistics. It provides a unifying perspective for examining systems with heavy tails, nonlinear interactions, or scale-dependent effects.

Comparison with Tsallis entropy


Although the coupled and Tsallis entropies are related via the independent-equals probability, the Tsallis entropy has a different normalization, which limits its ability to serve as a measure of uncertainty. For the coupled exponential distribution, the Tsallis entropy converges to 1, that is, it is "too cold", as the coupling goes to infinity. The normalized Tsallis entropy was an attempt to improve the structure of the Tsallis entropy, but it is unstable and diverges as the coupling goes to infinity for the coupled exponential distribution. These properties are shown in Figure 3.

Shannon entropy


In the limit κ → 0, the coupled entropy converges to the Boltzmann–Gibbs–Shannon (BGS) entropy for any value of the stretching exponent α. That is, the coupling term, not the stretching exponent, determines departures from classical extensive statistics. Explicitly, in this limit the entropy reduces to

S = −Σ_i p_i ln p_i,

recovering conventional Shannon entropy.

Non-exponential distributions


Coupled entropy is naturally associated with non-exponential probability distributions, a class that includes both heavy-tailed and compact-support families. Examples include:

  • Student’s t distributions,
  • Pareto-type power laws,
  • stretched exponential distributions,
  • coupled-exponential distributions,
  • compact-support models used in machine learning for pruning or sparsity.

This viewpoint broadens the traditional distinction between “exponential” and “heavy-tailed” systems by recognizing that many models of interest in complex systems exhibit nonlinear scaling behavior, whether through long tails or finite support.

One-Dimensional Coupled Exponential Family


The one-dimensional coupled exponential family (CEF) generalizes the classical exponential family to account for heavy-tailed and compact-support distributions via a nonlinear coupling parameter (κ).[5][1]

For a random variable X, the one-dimensional coupled exponential PDF takes the exponential-family form with the exponential function replaced by its coupled generalization:

f(x; θ) = (1/Z_κ(θ)) h(x) exp_κ(θ T(x)),

where T(x) is the exponent-of-the-variable function, h(x) is the base measure, and Z_κ(θ) is the generalized partition function.

Coupled Stretched Exponential Distribution


For a general stretching exponent α in the one-dimensional coupled exponential family, the resulting distribution is called the coupled stretched exponential. Its one-dimensional PDF is a power of the coupled exponential of the stretched argument,

f(x) ∝ (1 + κ((x − μ)/σ)^α)_+^(−(1+κ)/(ακ)),

and the survival function is the corresponding upper-tail integral.

Parameters:

  • μ: location
  • σ: scale
  • κ: nonlinear coupling parameter
  • α: stretch exponent

Special Cases

  • α = 1: Coupled Exponential Distribution (generalized Pareto)
  • α = 2: Coupled Gaussian Distribution (Student’s t)
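The α = 2 case can be checked against the Student's t density. Assuming the coupled Gaussian density is proportional to (1 + κ(x/σ)²)^(−(1+κ)/(2κ)) (an illustrative parameterization with μ = 0), it coincides with a Student's t distribution with ν = 1/κ degrees of freedom and scale σ:

```python
import numpy as np
from scipy.integrate import quad
from scipy.stats import t

kappa, sigma = 0.25, 1.5  # coupled Gaussian parameters (mu = 0)

def unnormalized(x):
    # Coupled Gaussian kernel: (1 + kappa*(x/sigma)^2)^(-(1+kappa)/(2*kappa))
    return (1.0 + kappa * (x / sigma) ** 2) ** (-(1.0 + kappa) / (2.0 * kappa))

Z, _ = quad(unnormalized, -np.inf, np.inf)  # numerical normalization constant

xs = np.linspace(-8, 8, 401)
coupled_gaussian = unnormalized(xs) / Z
student_t = t.pdf(xs, df=1.0 / kappa, scale=sigma)  # nu = 1/kappa = 4
print(np.max(np.abs(coupled_gaussian - student_t)))  # close to 0: the densities agree
```

Matching exponents term by term, (x/σ)²·κ equals (x/s)²/ν with s = σ and ν = 1/κ, and −(1+κ)/(2κ) equals −(ν+1)/2, which is why the two densities coincide.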

Parameter Interpretation

  • κ controls tail heaviness (exponential, heavy-tailed, or compact-support).
  • α controls the stretch or curvature near the location.

Dynamical Origin


Coupled stretched exponentials arise as stationary solutions of stochastic differential equations involving both additive and multiplicative noise,[6] such as a Langevin equation with deterministic drift f(x), multiplicative noise entering through a function g(x), and independent additive noise. Under appropriate conditions on f and g, the stationary solution takes the coupled stretched exponential form,[5] with the coupling parameter κ determined by the strength of the multiplicative noise relative to the drift and additive noise, and the scale σ set by the noise amplitudes.

This provides a physical interpretation of κ as the relative strength of multiplicative noise, and of α through the growth of g(x).

Relation to Coupled Entropy


These distributions maximize the coupled entropy under independent-equals-moment constraints.[1]

Historical Context and Foundational Theory


The mathematical foundation for the coupled exponential family was formalized by Nelson (2015).[7]

In this work, the coupled-product was introduced as a nonlinear generalization of multiplication, enabling the construction of multivariate coupled-exponential functions. This operation provides the algebraic basis for defining probability distributions whose components interact through nonlinear statistical coupling.

The paper also showed how the coupled-product generates multivariate generalizations of the q-exponential and how parameters such as scale and coupling separate naturally within this framework. While the stretched-exponential case (α ≠ 1) was not derived in that paper, the coupled-product formalism laid the structural foundation for later generalizations, including the coupled stretched exponential.

Applications


Coupled stretched exponentials are used in statistical modeling of heavy-tailed processes, turbulence, and robust machine learning.

Properties

Figure 4. Coupled entropy uniquely aligns with scale parameter σ across different couplings, unlike other generalized entropies

Fundamental properties

  • Reduces to the Boltzmann–Gibbs–Shannon entropy when the coupling κ is zero.
  • Sensitive to nonlinear coupling between microstates.
  • Provides a generalized measure of uncertainty applicable to heavy-tailed systems.
  • The asymptotic scaling can be adjusted to fulfill requirements such as extensivity.
  • Admits generalized additivity laws via the coupled logarithm.

Scale alignment


As visualized, coupled entropy uniquely maintains alignment with the distribution's scale parameter σ across different coupling values, unlike other entropy measures which deviate from the characteristic scale.

Applications


Coupled entropy has been applied in:

  • Statistical mechanics of complex and non-extensive systems,
  • Information theory with nonlinear dependencies,
  • Thermodynamics of coupled or interacting subsystems,
  • Modeling heavy-tailed noise and uncertainty,
  • Signal processing and robust inference,
  • Complex networks and interacting particle systems.

Recent or advanced applications


Recent studies have applied the coupled entropy framework to advanced statistical and computational methods. For instance, Nelson, Oliveira, and Al‑Najafi (2025) utilized the curved geometry of coupled free energy in variational inference, demonstrating how coupled exponential distributions can enhance inference for heavy-tailed and complex systems.[8]


References

  1. ^ a b c d e Nelson, Kenric P. (2025). "On the uniqueness of the coupled entropy". arXiv:2511.17684 [cond-mat.stat-mech].
  2. ^ Nelson, Kenric P.; Umarov, Sabir R.; Kon, Mark A. (15 February 2017). "On the average uncertainty for systems with nonlinear coupling". Physica A: Statistical Mechanics and Its Applications. 468: 30–43. arXiv:1510.06951. Bibcode:2017PhyA..468...30N. doi:10.1016/j.physa.2016.09.046.
  3. ^ Ilić, V. M.; Korbel, J.; Gupta, S.; Scarfone, A. M. (1 March 2021). "An overview of generalized entropic forms (a)". Europhysics Letters. 133 (5): 50005. arXiv:2102.10071. Bibcode:2021EL....13350005I. doi:10.1209/0295-5075/133/50005.{{cite journal}}: CS1 maint: article number as page number (link)
  4. ^ Hanel, R.; Thurner, S. (2011). "When do generalized entropies apply?". EPL. 93: 20006. doi:10.1209/0295-5075/93/20006.{{cite journal}}: CS1 maint: article number as page number (link)
  5. ^ a b Nelson, K.P. (2025). Coupled Entropy: A Goldilocks Generalization for Complex Systems. arXiv:2506.17229.
  6. ^ Anteneodo, C.; Tsallis, C. (2003). "Multiplicative noise and nonlinear Fokker–Planck equations". Physica A: Statistical Mechanics and Its Applications.
  7. ^ Nelson, K. P. (2015). "A definition of the coupled-product for multivariate coupled-exponentials". Physica A: Statistical Mechanics and Its Applications. 422: 187–192.
  8. ^ Nelson, Kenric P.; Oliveira, Igor; Al‑Najafi, Amenah (2025). "Variational Inference Optimized Using the Curved Geometry of Coupled Free Energy". arXiv:2506.09091 [cs.LG].