Central lead-lead collisions at a center-of-mass energy of 2.76 TeV per nucleon at the Large Hadron Collider have been analysed systematically in the light of Tsallis non-extensive statistics. The study reveals the possible formation of deconfined partonic matter in such central nuclear collisions at ultrarelativistic energy, where the number of participant nucleons is comparatively large.

The role played by non-extensive thermodynamics in physical systems has been under intense debate for the last decades. With many applications in several areas, Tsallis statistics has been discussed in detail in many works and has triggered an interesting discussion on the deeper meaning of entropy and its role in complex systems. Some possible mechanisms that could give rise to non-extensive statistics have been formulated over the last several years; in particular, a fractal structure in thermodynamic functions was recently proposed as a possible origin for non-extensive statistics in physical systems. In the present work we investigate the properties of such a fractal thermodynamical system and propose a diagrammatic method for calculating relevant quantities related to it. It is shown that a system with the fractal structure described here presents temperature fluctuations following an Euler gamma function, in accordance with previous results.


In 1988, Constantino Tsallis proposed, in a paper entitled "Possible generalization of Boltzmann–Gibbs statistics" [1], a new concept of entropy, known today as "Tsallis entropy". This entropy was embedded in a generalization of classical statistics, formulated for a non-extensive thermodynamics. For systems with long-range interactions or long-time memory, Tsallis used an approach inspired by concepts from multifractals. Just as the scaling functions of universal multifractals depend on a multifractality index, the Tsallis entropy depends on a dimensionless parameter; when this parameter approaches the limit value of 1, the entropy recovers the expression of the Boltzmann–Gibbs entropy.
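The q → 1 limit described above is easy to verify numerically. The sketch below (our illustration, not code from the paper) implements the discrete Tsallis entropy S_q = (1 − Σᵢ pᵢ^q)/(q − 1) and checks that it approaches the Shannon (Boltzmann–Gibbs) entropy as q → 1:

```python
import math

def tsallis_entropy(p, q):
    """Tsallis entropy S_q = (1 - sum_i p_i^q) / (q - 1) of a discrete distribution p."""
    if abs(q - 1.0) < 1e-12:
        # q -> 1 limit: the Boltzmann-Gibbs / Shannon entropy
        return -sum(pi * math.log(pi) for pi in p if pi > 0)
    return (1.0 - sum(pi ** q for pi in p)) / (q - 1.0)

p = [0.5, 0.3, 0.2]
shannon = -sum(pi * math.log(pi) for pi in p)
# S_q for q close to 1 differs from the Shannon entropy only at O(q - 1)
close_to_shannon = abs(tsallis_entropy(p, 1.0001) - shannon) < 1e-3
```

For q ≠ 1 the entropy is non-additive for independent systems, which is the defining non-extensive feature.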


In concluding, we point out that the present work indicates that the non-extensivity viewpoint is applicable to natural hazard processes. In the frame of a non-extensive approach based on Tsallis entropy for the construction of the probability density function (PDF), together with a phenomenological exponential expression for the damage function, we analytically calculate the risk function of natural hazards (earthquakes, rockfalls, forest fires, landslides). For the lowest size (i.e. energy level) of the natural hazard, the PDF can be deduced on the basis of the maximum entropy principle using BG statistics. In the low-energy regime the correlations between the different parts of the elements involved in the evolution of natural hazards are short-ranged. As the size (i.e. energy) increases, long-range correlations become much more important, implying the necessity of using Tsallis entropy as an appropriate generalization of BG entropy. The power-law behaviour of the PDF is derived as a special case, leading to b-values that are functions of the non-extensivity parameter q. The analysis of the dependence of the risk function on the parameters of the hazard PDF and the damage function for various hazards indicates that earthquakes, rockfalls and forest fires exhibit similar behaviour, in which the total risk arises from the largest events, while for landslides, in a first linear approximation, the risk comes from the smaller events. The latter result is strongly governed by the selection of an appropriate damage model (i.e. the exponent β).
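The power-law PDF mentioned above arises from maximizing the Tsallis entropy, which yields a q-exponential distribution. As a hedged illustration (the function name and parameter values are ours, not the paper's), the standard q-exponential reduces to the ordinary exponential at q = 1 and decays as a power law for q > 1:

```python
import math

def q_exp(x, q):
    """q-exponential e_q(x) = [1 + (1 - q) * x]^(1 / (1 - q)); e_q(x) -> exp(x) as q -> 1."""
    if abs(q - 1.0) < 1e-12:
        return math.exp(x)
    base = 1.0 + (1.0 - q) * x
    return base ** (1.0 / (1.0 - q)) if base > 0 else 0.0

# For q = 1.5 the tail of e_q(-x) is the power law (1 + x/2)^(-2),
# far heavier than the exponential tail exp(-x).
tail_ratio = q_exp(-10.0, 1.5) / math.exp(-10.0)
```

It is this heavy power-law tail that accommodates the large, long-range-correlated events the abstract discusses.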

Entropy emerges as an important quantity in different areas that try to describe systems of increasing complexity. The formulation of new entropic forms that generalize the one proposed by Boltzmann constitutes an important research line. In particular, the non-additive entropy introduced by Tsallis [1] has found wide applicability in the last few years, see e.g. [2, 3]. However, the full understanding of the non-extensive statistics formulated by Tsallis has not been accomplished yet. Several connections between Boltzmann and Tsallis statistics have been proposed so far, see e.g. [4–6], but it seems that the physical meaning of the entropic index q is not understood in the general case. In the present work, we make a detailed analysis of the connection based on a system featuring fractal structure in its thermodynamic properties. Fractals are conceived as objects with an internal structure that can be considered as an ideal gas of a specific number of subsystems, which, in turn, are also fractals of the same kind. The self-similarity between fractals at different levels of the internal structure reveals the typical scale invariance. The results obtained in the present work offer a new perspective in the analysis of hadron structure.

In this work a possible answer to the question formulated above is proposed. It is shown that hadrons can present, in a very specific form, a fractal structure. Then it is shown that such fractals must necessarily be described through the Tsallis statistics [16], a generalization of the Boltzmann-Gibbs-Shannon statistics. After this, a generalization of the self-consistent theory developed by Hagedorn accommodating the generalized statistics is presented, which can be called Non-Extensive Self-Consistent Thermodynamics (NESCT). A comparison of the results of NESCT with experimental data, some of its consequences and applications are discussed.

In this paper, an automated segmentation approach based on the Tsallis entropy method is presented. In the existing method, the parameter q and the threshold value are not easy to converge upon. Here, automation of the parameter q is attempted in order to obtain the threshold value. Three different methods are proposed and tested; the last method yields better results than the other two.
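Since the abstract is compressed, a minimal sketch may help show what Tsallis-entropy thresholding does. The code below is our illustration (the paper's three q-automation methods are not reproduced): it picks the threshold t that maximizes the pseudo-additive criterion S_q(A) + S_q(B) + (1 − q)S_q(A)S_q(B) over a gray-level histogram, for a manually chosen q:

```python
def tsallis_threshold(hist, q):
    """Bi-level threshold maximizing S_q(A) + S_q(B) + (1 - q) * S_q(A) * S_q(B),
    where A and B are the histogram classes below and above candidate threshold t."""
    total = sum(hist)
    probs = [h / total for h in hist]

    def s_q(p):
        # Tsallis entropy of a class, renormalized to its own total probability
        norm = sum(p)
        if norm == 0:
            return 0.0
        return (1.0 - sum((pi / norm) ** q for pi in p if pi > 0)) / (q - 1.0)

    best_t, best_val = 0, float("-inf")
    for t in range(1, len(probs)):
        sa, sb = s_q(probs[:t]), s_q(probs[t:])
        val = sa + sb + (1.0 - q) * sa * sb
        if val > best_val:
            best_t, best_val = t, val
    return best_t
```

For a bimodal histogram the maximizer falls in the valley between the two modes; in the paper's setting q itself would be automated rather than fixed by hand.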

Statistical mechanical methods have been applied successfully to the study of neural network models of associative memory [2]. These models are biologically plausible and, in some cases, can be trained very quickly compared with popular neural networks such as the multi-layer perceptron, which have been shown to work satisfactorily. However, this model of associative memory still has drawbacks, as learning gets stuck at local minima. A variety of global optimization algorithms have been introduced over the years to overcome the problem of local minima. One of the most popular is simulated annealing [3]. It uses Boltzmann–Gibbs (BG) statistics at two different steps: the visitation step, which uses a Gaussian distribution, and the acceptance step, which uses the Boltzmann factor [4, 5].
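The two BG ingredients named above (Gaussian visitation, Boltzmann acceptance) can be sketched in a few lines. This is a generic illustration, not the authors' code; the cooling schedule, step width, and toy double-well energy are our assumptions:

```python
import math
import random

def simulated_annealing(energy, x0, steps=5000, t0=1.0, sigma=0.5, seed=0):
    """Classical (Boltzmann-Gibbs) simulated annealing on a 1-D energy landscape:
    Gaussian visitation step plus Boltzmann acceptance factor exp(-dE / T)."""
    rng = random.Random(seed)
    x, e = x0, energy(x0)
    for k in range(1, steps + 1):
        temp = t0 / math.log(k + 1)        # slow logarithmic cooling (assumed schedule)
        x_new = x + rng.gauss(0.0, sigma)  # visitation step: Gaussian proposal
        e_new = energy(x_new)
        de = e_new - e
        # acceptance step: always accept downhill, uphill with Boltzmann probability
        if de <= 0 or rng.random() < math.exp(-de / temp):
            x, e = x_new, e_new
    return x, e

# toy double-well with a deeper minimum near x = -1 and a shallower one near x = +1
x, e = simulated_annealing(lambda v: (v * v - 1.0) ** 2 + 0.3 * v, 1.0)
```

The thermal acceptance step is what lets the chain climb out of a local minimum, which is exactly the failure mode of plain gradient-style learning mentioned above.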


Several studies [22, 23] indicate that the issues connected with the assumptions of CAPM (viz. the efficient market hypothesis) can be addressed using statistical methods based on Tsallis entropy [24], which is a generalization of Shannon entropy to non-extensive systems. These methods were originally proposed to study classical and quantum chaos, physical systems far from equilibrium such as turbulent (non-linear) systems, and long-range interacting Hamiltonian systems. However, in the last several years, there has been considerable interest in applying these methods to analyze financial market dynamics as well. Such applications fall into the category of econophysics [25]. The rest of the paper is organized as follows. In Section 2, Tsallis relative entropy is discussed, with some necessary background on Tsallis entropy and the q-Gaussian distribution. A relationship between TRE and the parameters of a q-Gaussian distribution is derived. Section 3 deals with the data and methodology for constructing risk-optimal portfolios and their results. The conclusions are given in Section 4.
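For readers unfamiliar with the q-Gaussian mentioned above, a small sketch may help. The form below is the standard unnormalized q-Gaussian (our illustration; β and the q values are arbitrary), which recovers the ordinary Gaussian as q → 1 and develops the heavy power-law tails that motivate its use for fat-tailed return distributions:

```python
import math

def q_gaussian_unnorm(x, q, beta=1.0):
    """Unnormalized q-Gaussian [1 - (1 - q) * beta * x^2]^(1 / (1 - q));
    recovers the Gaussian kernel exp(-beta * x^2) in the limit q -> 1."""
    if abs(q - 1.0) < 1e-12:
        return math.exp(-beta * x * x)
    base = 1.0 - (1.0 - q) * beta * x * x
    return base ** (1.0 / (1.0 - q)) if base > 0 else 0.0

# q = 2 gives a Cauchy-like shape 1 / (1 + beta * x^2);
# for q > 1 the tails are far heavier than the Gaussian's.
```

This tail behaviour is what allows a q-Gaussian fit to capture the extreme returns that a Gaussian model underweights.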


In this paper, we have discussed the role of the Tsallis entropic index in determining bi-level and three-level thresholding. For some images, the threshold values can exhibit a jump as the entropic index spans the interval (0,1). This provokes an abrupt transition in the appearance of the corresponding output images. We can define this behavior as an "image transition". Of course, the investigation of image transitions can be further extended to general multi-level thresholding. The gray-level image transitions are analogous to order or texture transitions observed in physical systems.


The research design comprised a population-based retrospective study of completed suicides and hospital separations for non-fatal intentional self-harm which occurred in NSW from 2005 to 2013 (the most recent data available at the time). Data was aggregated across this period to ensure sufficient numbers of suicide deaths within regions were available to enable a robust analysis; consequently, temporal cluster analysis was not possible. For all data, a SA2 nine-digit code for place of residence was used for spatial attribution. There were 548 SA2s in NSW at the time of data acquisition. Scan statistics were used to identify which SA2s formed statistically significant suicide clusters. These clusters were presented within LGA boundaries to minimise the risk of identifying individuals in areas with low counts.


In the previous section, the Cell Survival Curve (CSC) was derived in equation (11) using Tsallis statistics. In the model, the only free parameters are (N′, q). The dependence of the CSC on different N′ and q values is shown in FIG 1 (Top) below. In general, cell survival decreases more for a lower value of N′ and a higher value of q. The biophysical interpretations will be discussed further in this section.

The paper is organized as follows. Section 2 reviews nonextensive entropies, with emphasis on the Tsallis case. Section 3 discusses Jensen differences and divergences. The concepts of q-differences and q-convexity are introduced in Section 4, where they are used to define and characterize some new divergence-type quantities. In Section 5, we define the Jensen-Tsallis q-difference and derive some of its properties; in that section, we also define k-th order Jensen-Tsallis q-differences for families of stochastic processes. The new family of entropic kernels is introduced and characterized in Section 6, which also introduces nonextensive kernels between stochastic processes. Experiments on text categorization are reported in Section 7. Section 8 concludes the paper and discusses future research.


Abstract. Existing Dutch guidelines for the design of the drinking water and hot water systems of non-residential buildings are based on outdated assumptions on peak water demand or on unfounded assumptions on hot water demand. They generally overestimate the peak demand values required for the design of an efficient and reliable water system. Recently, a procedure was developed based on the end-use model SIMDEUM to derive design-demand-equations for peak demand values of both cold and hot water during various time steps for several types and sizes of non-residential buildings, viz. offices, hotels and nursing homes. In this paper, the design-demand-equations are validated with measurements of cold and hot water patterns on a per-second basis and with surveys. The good correlation between the simulated water demand patterns and the measured patterns indicates that the basis of the design-demand-equations, the SIMDEUM-simulated standardised buildings, is solid. Surveys were held to investigate whether the construction of the standardised buildings based on the dominant variable corresponds with practice. The surveys show that it is difficult to find relationships with which to equip the standardised buildings with users and appliances. However, the validation proves that, with a proper estimation of the number of users and appliances in only the dominant functional room of the standardised buildings, SIMDEUM renders a realistic cold and hot water diurnal demand pattern. Furthermore, the new design-demand-equations based on these standardised buildings give a better prediction of the measured peak values for cold water flow than the existing guidelines. Moreover, the new design-demand-equations can predict hot water use well. In this paper it is illustrated that the new design-demand-equations lead to reliable and improved designs of building installations and water heater capacity, resulting in more hygienic and economical installations.


In image processing, Shannon entropy was the first to be used, but today it is the Tsallis formulation of entropy that seems to be preferred [2-4]. For the elaboration of images, the entropy uses their histograms. For instance, in the bi-level segmentation of a gray-level image, a threshold is determined which separates the gray tones into two systems A and B, maximizing the entropy. Considering A and B independent, the Tsallis entropy S(A+B) is the generalized sum S(A) + S(B) + (1−q)S(A)S(B), where S(A), S(B) are the corresponding entropies of the two systems. In this paper, we will discuss the use of Shannon and Tsallis entropies for image thresholding. Among the other formulations of entropy [5], here we propose thresholding using Kaniadakis entropy, which is a quite attractive entropy based on the relativistic formulation of statistical mechanics [6,7].
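The generalized sum quoted above is the pseudo-additivity property of Tsallis entropy, and it can be checked directly for two independent systems (a small numerical sketch; the distributions and the q value are arbitrary):

```python
def s_q(p, q):
    """Tsallis entropy of a discrete distribution p for entropic index q != 1."""
    return (1.0 - sum(pi ** q for pi in p)) / (q - 1.0)

q = 0.7
a = [0.6, 0.4]
b = [0.5, 0.3, 0.2]

# independent systems: the joint distribution is the product distribution
joint = [pa * pb for pa in a for pb in b]

lhs = s_q(joint, q)                                              # S_q(A + B)
rhs = s_q(a, q) + s_q(b, q) + (1.0 - q) * s_q(a, q) * s_q(b, q)  # generalized sum
```

The identity is exact for any q, since Σ(pₐp_b)^q factorizes into (Σpₐ^q)(Σp_b^q); the (1−q) cross term is precisely what makes the entropy non-additive.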

Employees have played an important role in the success of enterprises in general, and of non-state enterprises in particular, in the fourth industrial revolution. Hence, the quality of employees is essential, especially for those working in the new era of manufacturing. The purpose of this study is to analyze the effects of the fourth industrial revolution on employees in non-state enterprises. Secondary data for the period from 2013 to 2017 was collected from the Thai Nguyen statistics office. The data was analyzed with descriptive and inferential statistics. The results show that Industry 4.0 had both positive and negative effects on employees in non-state enterprises. The wave of artificial intelligence can lead to mass unemployment in Thai Nguyen province, as a large number of Vietnamese labourers will not be able to adapt to new technologies. In Industry 4.0, labourers in non-state enterprises have many opportunities to approach and participate in new technology, as the province has a youthful labour market with a "golden population". After the analysis, the researchers propose recommendations to enhance the qualities of employees and set out the requirements for employees in non-state enterprises in the fourth industrial revolution.


US National Center for Health Statistics: Utilization of short-stay hospitals: Summary of non-medical statistics, United States, 1970.

Abstract: The variance of the Shannon information related to a random variable X, called varentropy, is a measure that indicates how the information content of X is scattered around its entropy; it has various applications in information theory, computer science, and statistics. In this paper, we introduce a new generalized varentropy based on the Tsallis entropy and obtain some results and bounds for it. We compare the varentropy with the Tsallis varentropy. Moreover, we study the Tsallis varentropy of order statistics, analyse this concept for residual (past) lifetime distributions, and then introduce two new classes of distributions based on them.
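As a concrete illustration of the quantities involved (our sketch; the paper's precise definition of the Tsallis varentropy may differ from the q-logarithm convention used here), varentropy is the variance of the information content, and it vanishes for uniform distributions, where every outcome carries the same information:

```python
import math

def varentropy(p):
    """Shannon varentropy: variance of the information content -log p(X)."""
    h = -sum(pi * math.log(pi) for pi in p if pi > 0)
    return sum(pi * (-math.log(pi) - h) ** 2 for pi in p if pi > 0)

def ln_q(x, q):
    """q-logarithm (x^(1-q) - 1) / (1 - q), recovering ln(x) as q -> 1."""
    return math.log(x) if abs(q - 1.0) < 1e-12 else (x ** (1.0 - q) - 1.0) / (1.0 - q)

def tsallis_varentropy(p, q):
    """Variance of the q-logarithmic information content ln_q(1 / p(X)).
    (One common convention; the paper's exact definition may differ.)"""
    support = [pi for pi in p if pi > 0]
    info = [ln_q(1.0 / pi, q) for pi in support]
    mean = sum(pi * i for pi, i in zip(support, info))
    return sum(pi * (i - mean) ** 2 for pi, i in zip(support, info))
```

Both quantities are zero exactly when the information content is constant over the support, which is why uniform distributions are the natural baseline case.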


In this paper, multi-level thresholding for gray-scale image datasets using FDPSO and the Tsallis function is presented. The procedure finds the optimal threshold for a chosen image with a chosen T value. The proposed segmentation technique is compared with other heuristic algorithms, such as BA and FA. The performance of FDPSO is evaluated using quantitative and qualitative measures, such as f(T), CPU time, STD, PSNR, MSSIM, NAE, NCC, AD and SC. The robustness of the segmentation scheme is verified by considering PN- and GN-corrupted image datasets. The experimental results confirm that the FDPSO-assisted segmentation procedure offers better results in most cases when compared with BA and FA. Wilcoxon's rank test also confirms the superiority of FDPSO.


Transverse hadron spectra measured in high-energy collisions over the last three decades fit the Tsallis distribution (TS) (see Refs. [1]-[12] for proton-proton (pp), proton-antiproton (pp̄) and nucleus-nucleus (AA) collisions and Refs. [13, 14] for e+e− collisions). On the theoretical side, there are many proposals for the emergence of the TS distribution. In kinetic theory, the collision term of the Boltzmann equation [15]-[17] or the noise term of the Langevin equation [18] can be generalised in a way in which the TS distribution is the stationary solution. In equilibrium thermodynamics, the Maximum Entropy Principle (MEP) together with a generalisation of the Shannon entropy formula (the Tsallis entropy, see Ref. [19]) also leads to the TS distribution, as a generalisation of the canonical Boltzmann-Gibbs distribution (BG). The TS distribution can also be derived from the MEP by introducing special interactions, while leaving the original Shannon entropy unaltered [11, 20].
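For concreteness, one widely used Tsallis form for transverse spectra can be sketched as follows (an illustration with made-up parameter values, not a fit from the cited references):

```python
import math

def tsallis_spectrum(pt, q, temp, mass=0.14):
    """One common Tsallis form for transverse spectra:
    f(pT) = [1 + (q - 1) * mT / T]^(-1 / (q - 1)), with mT = sqrt(pT^2 + m^2).
    Reduces to the Boltzmann factor exp(-mT / T) as q -> 1.
    Units GeV; default mass is roughly the pion mass (assumed for illustration)."""
    mt = math.sqrt(pt * pt + mass * mass)
    if abs(q - 1.0) < 1e-12:
        return math.exp(-mt / temp)
    return (1.0 + (q - 1.0) * mt / temp) ** (-1.0 / (q - 1.0))
```

For q → 1 the canonical Boltzmann-Gibbs exponential is recovered, while for q > 1 the spectrum develops the power-law tail at high pT that the fits in Refs. [1]-[14] exploit.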
