[Todos] Fwd: Entropy MDPI, Special Issue "Information Theoretic Measures and Their Applications"
Asistentes de Secretaria de Fisica
secre2 en fisica.unlp.edu.ar
Mon Feb 25 20:09:55 -03 2019
-------- Original Message --------
SUBJECT:
Entropy MDPI, Special Issue "Information Theoretic Measures and Their
Applications"
DATE:
2019-02-25 14:01
FROM:
Fernando Fabian Montani <fmontani en gmail.com>
TO:
Physics Department Secretariat <secre2 en fisica.unlp.edu.ar>
Hello Alejandro and Cecilia,
Sorry for the trouble; could you share and circulate this information?
Please circulate this version and not the previous one, since there was a typo...
Thank you very much in advance,
Regards,
Fernando
SPECIAL ISSUE "INFORMATION THEORETIC MEASURES AND THEIR APPLICATIONS"
* Special Issue Editors [1]
* Special Issue Information [2]
* Keywords [3]
* Published Papers [4]
A special issue of _Entropy_ [5] (ISSN 1099-4300).
https://www.mdpi.com/journal/entropy/special_issues/Information_Theoretic_Measures
Deadline for manuscript submissions: 31 December 2019
SHARE THIS SPECIAL ISSUE
[6] [7]
SPECIAL ISSUE EDITORS
_Guest Editor _
Dr. Osvaldo Anibal Rosso
1. Departamento de Bioingeniería, Instituto Tecnológico de Buenos Aires
(ITBA), C1106ACD Av. Eduardo Madero 399, Ciudad Autónoma de Buenos
Aires, Argentina
2. Instituto de Física, Universidade Federal de Alagoas (UFAL), BR 104
Norte km 97, 57072-970 Maceió, Alagoas, Brazil
3. Facultad de Ingeniería y Ciencias Aplicadas, Universidad de Los
Andes, Santiago, Chile
E-Mail [8]
Interests: time-series analysis; information theory; time-frequency
transform; wavelet transform; entropy and complexity; non-linear
dynamics and chaos; complex networks; medical and biological
applications
_Guest Editor _
Dr. Fernando Fabian Montani
Instituto de Física de Líquidos y Sistemas Biológicos (IFLYSIB),
Universidad Nacional de La Plata & CONICET, La Plata, Argentina
E-Mail [8]
Interests: time-series analysis; information theory; brain and neuronal
dynamics; neural coding; entropy and complexity; nonlinear dynamics and
chaos; complex networks; medical and biological applications
SPECIAL ISSUE INFORMATION
Dear Colleagues,
The concept of entropy, a physical quantity that measures the degree of
decay of order in a physical system, was introduced by Rudolf Clausius
in 1865 through an elegant formulation of the second law of
thermodynamics. Seven years later, in 1872, Ludwig Boltzmann proved the
famous _H_-theorem, showing that the quantity _H_ always decreases in
time and that, for a perfect gas in equilibrium, _H_ is related to
Clausius' entropy _S_. Boltzmann's dynamical approach, together with
the elegant theory of statistical ensembles at equilibrium proposed by
Josiah Willard Gibbs, led to the Boltzmann-Gibbs theory of statistical
mechanics, which represents one of the most successful theoretical
frameworks of physics. Indeed, with the introduction of entropy,
thermodynamics became a model of theoretical science.
In 1948, Claude E. Shannon developed a "statistical theory of
communication", drawing on ideas from both logic and statistics that in
turn opened new paths for research. The powerful notion of _information
entropy_ played a major part in the development of new statistical
techniques and overhauled the Bayesian approach to probability and
statistics, providing powerful new methods and shedding new light on
several fields of science.
In the space of a few decades, chaos theory has jumped from the
scientific literature into the popular realm, being regarded as a new
way of looking at complex systems such as brains or ecosystems. It is
believed that the theory manages to capture the disorganized order that
pervades our world. Chaos theory is the facet of the complex-systems
paradigm concerned with deterministic randomness. In 1959, Kolmogorov
observed that Shannon's probabilistic theory of information could be
applied to symbolic encodings of the phase-space descriptions of
nonlinear physical dynamical systems, so that a process can be
characterized in terms of its Kolmogorov-Sinai entropy. Pesin's
theorem, proved in 1977, shows that for certain deterministic nonlinear
dynamical systems exhibiting chaotic behavior, the Kolmogorov-Sinai
entropy is given by the sum of the positive Lyapunov exponents of the
process. Thus, a nonlinear dynamical system may be viewed as an
information source, from which information-related quantifiers may help
to characterize and visualize relevant details of the chaotic process.
The information content of a system is typically evaluated via a
probability distribution function (PDF) _P_ describing the
apportionment of some measurable or observable quantity, generally a
time series _X_(t). Quantifying the information content of a given
observable is therefore largely tantamount to characterizing its
probability distribution. This is often done with a wide family of
measures called information-theory quantifiers (e.g., Shannon entropy,
mutual information, Fisher information, statistical complexity). Thus,
information-theory quantifiers are measures able to characterize the
relevant properties of the PDF associated with a time series and, in
this way, to judiciously extract information on the dynamical system
under study.
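As a minimal sketch of this idea, the snippet below estimates the Shannon entropy of a time series after building the PDF _P_ by simple amplitude histogramming (frequency counting). The function name, the bin count, and the use of base-2 logarithms (bits) are illustrative choices, not prescriptions from the call for papers.

```python
import numpy as np

def shannon_entropy(x, bins=16):
    """Shannon entropy (in bits) of a time series, with the PDF P
    estimated by amplitude histogramming (frequency counting)."""
    counts, _ = np.histogram(x, bins=bins)
    p = counts / counts.sum()          # empirical PDF over the bins
    p = p[p > 0]                       # drop empty bins: 0*log(0) := 0
    return -np.sum(p * np.log2(p))

# A uniform random series approaches the maximum log2(bins) = 4 bits,
# while a constant series carries zero entropy.
rng = np.random.default_rng(0)
print(shannon_entropy(rng.uniform(size=10000)))   # close to 4 bits
print(shannon_entropy(np.ones(1000)))             # zero
```

Any of the PDF constructions mentioned below (symbolic dynamics, wavelet coefficients, permutation patterns) could replace the histogram step while the entropy computation stays the same.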
The evaluation of information-theory quantifiers presupposes some prior
knowledge of the system; specifically, a probability distribution
associated with the time series under analysis must be provided
beforehand. The determination of the most adequate PDF is a fundamental
problem, because the PDF _P_ and the sample space Ω are inextricably
linked. Many methods have been proposed for a proper selection of the
probability space (Ω, _P_), among them frequency counting, procedures
based on amplitude statistics, binary symbolic dynamics, Fourier
analysis, the wavelet transform, and permutation patterns. The
suitability of each methodology depends on the peculiarities of the
data, such as stationarity, the length of the series, the variation of
the parameters, the level of noise contamination, etc. In all these
cases, global aspects of the dynamics can somehow be captured, but the
different approaches are not equivalent in their ability to discern all
relevant physical details.
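To illustrate one of the PDF constructions listed above, the sketch below builds the probability space from ordinal (permutation) patterns in the Bandt-Pompe style and returns the resulting normalized permutation entropy. The function name and default parameters are hypothetical; real analyses tune the embedding order and delay to the data.

```python
import numpy as np
from collections import Counter
from math import log2, factorial

def permutation_entropy(x, order=3, delay=1):
    """Normalized permutation entropy: each length-`order` window is
    mapped to its ordinal pattern (Bandt-Pompe symbolization), and
    Shannon entropy is taken over the pattern frequencies."""
    x = np.asarray(x)
    n = len(x) - (order - 1) * delay
    patterns = Counter(
        tuple(np.argsort(x[i:i + (order - 1) * delay + 1:delay]))
        for i in range(n)
    )
    probs = np.array(list(patterns.values())) / n
    h = -np.sum(probs * np.log2(probs))
    return h / log2(factorial(order))   # normalize to [0, 1]

# A monotonic ramp produces a single ordinal pattern (entropy 0), while
# white noise visits all order! patterns almost uniformly (entropy near 1).
print(permutation_entropy(np.arange(100)))
rng = np.random.default_rng(1)
print(permutation_entropy(rng.normal(size=5000)))
```

Because only the ordering of values within each window matters, this PDF choice is invariant under monotonic transformations of the data and is known to be robust to observational noise, which is one reason permutation patterns appear in the list above.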
_Mutual information_ rigorously quantifies, in units known as "bits",
how much _information_ the value of one variable reveals about the
value of another. It is a dimensionless quantity that can be thought of
as the reduction in uncertainty about one random variable given
knowledge of another. Fisher information, which predates the Shannon
entropy, and the more recent statistical complexities have also proved
to be useful and powerful tools in different scenarios, allowing, in
particular, the analysis of time series and data series independently
of their sources. The Fisher information measure can be variously
interpreted as a measure of the ability to estimate a parameter, as the
amount of information that can be extracted from a set of measurements,
and also as a measure of the state of disorder of a system or
phenomenon.
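A minimal plug-in estimate of the mutual information described above can be written via the identity I(X;Y) = H(X) + H(Y) - H(X,Y), with the joint PDF obtained from a 2-D histogram. The function name and bin count are illustrative assumptions; histogram estimators of this kind carry a small positive bias for finite samples.

```python
import numpy as np

def mutual_information(x, y, bins=8):
    """Plug-in mutual information I(X;Y) in bits, estimated from a 2-D
    joint histogram via I = H(X) + H(Y) - H(X, Y)."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy /= pxy.sum()                   # joint PDF
    px = pxy.sum(axis=1)               # marginal PDF of X
    py = pxy.sum(axis=0)               # marginal PDF of Y

    def H(p):
        p = p[p > 0]
        return -np.sum(p * np.log2(p))

    return H(px) + H(py) - H(pxy.ravel())

rng = np.random.default_rng(2)
x = rng.normal(size=20000)
print(mutual_information(x, x))                       # large: full dependence
print(mutual_information(x, rng.normal(size=20000)))  # near 0: independence
```

As the comments indicate, a variable shares maximal information with itself (here equal to its own binned entropy), while two independent series give an estimate near zero, up to finite-sample bias.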
Among the most recent entropy proposals, we can mention approximate
entropy, sample entropy, delayed permutation entropy, and permutation
min-entropy. That is, different methodologies have been used to
understand the mechanisms behind information processing. Among them are
also methods of frequency analysis such as the wavelet transform (WT),
which distinguishes itself from the others by its high efficiency in
feature extraction. Wavelet analysis is an appropriate mathematical
tool for analyzing signals in the time and frequency domains. All these
measures have important applications not only in physics but also in
quite distinct areas, such as biology, medicine, economics, the
cognitive sciences, the numerical and computational sciences, big-data
analysis, complex networks, and neuroscience.
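As a sketch of one of the entropy proposals just listed, the code below implements sample entropy SampEn(m, r): the negative logarithm of the conditional probability that sequences matching for m points (within tolerance r, Chebyshev distance, self-matches excluded) also match for m + 1 points. The defaults m = 2 and r = 0.2 times the standard deviation are common conventions, assumed here for illustration.

```python
import numpy as np

def sample_entropy(x, m=2, r=None):
    """Sample entropy SampEn(m, r): -log of the conditional probability
    that templates matching for m points (Chebyshev distance <= r,
    self-matches excluded) also match for m + 1 points."""
    x = np.asarray(x, dtype=float)
    if r is None:
        r = 0.2 * np.std(x)            # common tolerance convention

    def count_matches(length):
        n = len(x) - length + 1
        templates = np.array([x[i:i + length] for i in range(n)])
        # Pairwise Chebyshev distances between all templates
        d = np.max(np.abs(templates[:, None, :] - templates[None, :, :]),
                   axis=2)
        # Count matching pairs with i < j (excludes self-matches)
        return np.sum(d[np.triu_indices(n, k=1)] <= r)

    B, A = count_matches(m), count_matches(m + 1)
    return -np.log(A / B)

rng = np.random.default_rng(3)
print(sample_entropy(np.sin(np.linspace(0, 20 * np.pi, 1000))))  # regular: low
print(sample_entropy(rng.normal(size=1000)))                     # noise: high
```

A regular signal such as a sine wave yields a low value, while white noise yields a high one, which is what makes the measure useful as a regularity quantifier for short, noisy series.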
In summary, for the present Special Issue, manuscripts focusing on any
of the abovementioned information-theoretic measures (mutual
information, permutation entropy approaches, sample entropy, wavelet
entropy and their evaluations), as well as on their interdisciplinary
applications, are more than welcome.
Dr. Osvaldo Anibal Rosso
Dr. Fernando Fabian Montani
_Guest Editors_
Manuscript Submission Information
Manuscripts should be submitted online at www.mdpi.com [9] by
registering [10] and logging in to this website [11]. Once you are
registered, click here to go to the submission form [12]. Manuscripts
can be submitted until the deadline. All papers will be peer-reviewed.
Accepted papers will be published continuously in the journal (as soon
as accepted) and will be listed together on the special issue website.
Research articles, review articles, and short communications are
invited. For planned papers, a title and short abstract (about 100
words) can be sent to the Editorial Office for announcement on this
website.
Submitted manuscripts should not have been published previously, nor be
under consideration for publication elsewhere (except conference
proceedings papers). All manuscripts are thoroughly refereed through a
single-blind peer-review process. A guide for authors and other relevant
information for submission of manuscripts is available on the
Instructions for Authors [13] page. _Entropy_ [14] is an international
peer-reviewed open access monthly journal published by MDPI.
Please visit the Instructions for Authors [13] page before submitting a
manuscript. The Article Processing Charge (APC) [15] for publication in
this open access [16] journal is 1600 CHF (Swiss Francs). Submitted
papers should be well formatted and use good English. Authors may use
MDPI's English editing service [17] prior to publication or during
author revisions.
KEYWORDS
* Shannon entropy
* mutual information
* Fisher information
* statistical complexity
* information processing
* different PDF evaluations
* different dynamic states captured by information theoretical
approaches
PUBLISHED PAPERS
This special issue is now open for submission.
Submit to Special Issue [18] | Review for _Entropy_ [19] | Edit a
Special Issue [20]
_Entropy [5]_ EISSN 1099-4300 Published by MDPI AG, Basel, Switzerland
RSS [22] E-Mail Table of Contents Alert [23]
Links:
------
[1]
https://www.mdpi.com/journal/entropy/special_issues/Information_Theoretic_Measures#editors
[2]
https://www.mdpi.com/journal/entropy/special_issues/Information_Theoretic_Measures#info
[3]
https://www.mdpi.com/journal/entropy/special_issues/Information_Theoretic_Measures#keywords
[4]
https://www.mdpi.com/journal/entropy/special_issues/Information_Theoretic_Measures#published
[5] https://www.mdpi.com/journal/entropy
[6]
https://twitter.com/home?status=%23mdpientropy+Information+Theoretic+Measures+and+Their+Applications+http%3A%2F%2Fwww.mdpi.com%2Fsi%2F25407++%40Entropy_MDPI
[7]
http://www.linkedin.com/shareArticle?mini=true&url=http%3A%2F%2Fwww.mdpi.com%2Fsi%2F25407&title=Information%20Theoretic%20Measures%20and%20Their%20Applications%26source%3Dhttps%3A%2F%2Fwww.mdpi.com%26summary%3DDear%20Colleagues%2C%0D%0AThe%20concept%20of%20entropy%2C%20an%20ever-growing%20physical%20magnitude%20that%20measured%20the%20degree%20of%20decay%20of%20order%20in%20a%20physical%20system%2C%20was%20introduced%20by%20Rudolf%20Clausius%20in%201865%20through%20an%20elegant%20formulation%20of%20the%20second%20law%20of%20%5B...%5D
[8]
https://www.mdpi.com/journal/entropy/special_issues/Information_Theoretic_Measures
[9] https://www.mdpi.com/
[10] https://www.mdpi.com/user/register/
[11] https://www.mdpi.com/user/login/
[12] https://www.mdpi.com/user/manuscripts/upload/?journal=entropy
[13] https://www.mdpi.com/journal/entropy/instructions
[14] https://www.mdpi.com/journal/entropy/
[15] https://www.mdpi.com/about/apc/
[16] https://www.mdpi.com/about/openaccess/
[17] https://www.mdpi.com/authors/english
[18]
https://susy.mdpi.com/user/manuscripts/upload?form[journal_id]=5&form[special_issue_id]=25407
[19] https://susy.mdpi.com/volunteer/journals/review
[20]
https://www.mdpi.com/journalproposal/sendproposalspecialissue/entropy
[21]
https://serve.mdpi.com/www/my_files/cliiik.php?oaparams=0bannerid=5072zoneid=2cb=638a78f040oadest=https%3A%2F%2Fwww.mdpi.com%2Fjournal%2Fentropy%2Fspecial_issues%2Frelativistic_quantum_information
[22] https://www.mdpi.com/rss/journal/entropy
[23] https://www.mdpi.com/journal/entropy/toc-alert
------------ next part ------------
An attachment in HTML format was scrubbed...
URL: <http://mail.fisica.unlp.edu.ar/pipermail/todos/attachments/20190225/90a832b0/attachment-0001.html>
More information about the Todos mailing list