Datasets:

Modalities:
Text
Formats:
csv
Languages:
English
ArXiv:
Libraries:
Datasets
Dask
License:
authorship-verification / arxiv_val.csv
swan07's picture
Upload 33 files
34144c2 verified
raw
history blame
No virus
185 kB
text1,text2,same
"The thermodynamical stability of DNA minicircles is investigated by means of
path integral techniques. ORG bonds between base pairs on complementary
strands can be broken by thermal fluctuations and temporary fluctuational
openings along the double helix are essential to biological functions such as
transcription and replication of the genetic information. Helix unwinding and
bubble formation patterns are computed in circular sequences with variable
radius in order to analyze the interplay between molecule size and appearance
of helical disruptions. The latter are found in minicircles with MONEY$ base
pairs and appear as a strategy to soften the stress due to the bending and
torsion of the helix.","I propose a path integral description of the Su-Schrieffer-Heeger
Hamiltonian, both in CARDINAL and CARDINAL dimensions, after mapping the real space model
onto the time scale. While the lattice degrees of freedom are classical
functions of time and are integrated out exactly, the electron particle paths
are treated quantum mechanically. The method accounts for the variable range of
the electronic hopping processes. The free energy of the system and its
temperature derivatives are computed by summing at any $MONEY over the ensemble of
relevant particle paths which mainly contribute to the total partition
function. In the low $T$ regime, the {ORG heat capacity over T} ratio shows ORG
upturn peculiar to a glass-like behavior. This feature is more sizeable in the
square lattice than in the linear chain as the overall hopping potential
contribution to the total action is larger in higher dimensionality. The
effects of the electron-phonon anharmonic interactions on the phonon subsystem
are studied by the path integral cumulant expansion method.",1
"The essay consists of CARDINAL parts. In the ORDINAL part, it is explained how
theory of algorithms and computations evaluates the contemporary situation with
computers and global networks. In the ORDINAL part, it is demonstrated what new
perspectives this theory opens through its new direction that is called theory
of super-recursive algorithms. These algorithms have much higher computing
power than conventional algorithmic schemes. In the ORDINAL part, we explicate
how realization of what this theory suggests might influence life of people in
future. It is demonstrated that now the theory is far ahead computing practice
and practice has to catch up with the theory. We conclude with a comparison of
different approaches to the development of information technology.","Axiomatic approach has demonstrated its power in mathematics. The main goal
of this preprint is to show that axiomatic methods are also very efficient for
computer science. It is possible to apply these methods to many problems in
computer science. Here the main modes of computer functioning and program
execution are described, formalized, and studied in an axiomatic context. The
emphasis is on CARDINAL principal modes: computation, decision, and acceptation.
Now the prevalent mode for computers is computation. Problems of artificial
intelligence involve decision mode, while communication functions of computer
demand accepting mode. The main goal of this preprint is to study properties of
these modes and relations between them. These problems are closely related to
such fundamental concepts of computer science and technology as computability,
decidability, and acceptability. In other words, we are concerned with the
question what computers and software systems can do working in this or that
mode. Consequently, results of this preprint allow one to achieve higher
understanding of computations and in such a way, to find some basic properties
of computers and their applications. Classes of algorithms, which model
different kinds of computers and software, are compared with respect to their
computing, accepting or deciding power. Operations with algorithms and machines
are introduced. Examples show how to apply axiomatic results to different
classes of algorithms and machines in order to enhance their performance.",1
"This paper addresses the problem of classifying observations when features
are context-sensitive, especially when the testing set involves a context that
is different from the training set. The paper begins with a precise definition
of the problem, then general strategies are presented for enhancing the
performance of classification algorithms on this type of problem. These
strategies are tested on CARDINAL domains. The ORDINAL domain is the diagnosis of
gas turbine engines. The problem is to diagnose a faulty engine in CARDINAL context,
such as warm weather, when the fault has previously been seen only in another
context, such as cold weather. The ORDINAL domain is speech recognition. The
context is given by the identity of the speaker. The problem is to recognize
words spoken by a new speaker, not represented in the training set. The ORDINAL
domain is medical prognosis. The problem is to predict whether a patient with
hepatitis will live or die. The context is the age of the patient. For all
CARDINAL domains, exploiting context results in substantially more accurate
classification.","In this article, we calculate the mass modifications of the vector and
axial-vector mesons $D^*$, $MONEY, MONEY and MONEY in the nuclear matter with
the ORG sum rules, and obtain the mass-shifts $\delta M_{D^*}=-71 \rm{MeV}$,
$ORG M_{B^*}=-380 GPE, $\delta M_{D_1}=72 GPE, $ORG
M_{B_1}=264 PRODUCT, and the scattering lengths MONEY PRODUCT,
MONEY, $a_{D_1}=1.15 \rm{fm}$ and MONEY for
the $MONEY, $MONEY, $D_1N$ and $B_1N$ interactions, respectively.",0
"This paper reviews my personal inclinations and fascination with the area of
unconventional computing. Computing can be perceived as an inscription in a
""PERSON,"" CARDINAL category akin to physics, and therefore as a form of
comprehension of nature: at least from a purely syntactic perspective, to
understand means to be able to algorithmically (re)produce. I also address the
question of why there is computation, and sketch a research program based on
primordial chaos, out of which order and even self-referential perception
emerges by way of evolution.","Rational agents acting as observers use ``knowables'' to construct a vision
of the outside world. Thereby, they are bound by the information exchanged with
what they consider to be objects. The cartesian cut or, in modern terminology,
the interface mediating this exchange, is again a construction. It serves as a
``scaffolding,'' an intermediate construction capable of providing the
necessary conceptual means. An attempt is made to formalize the interface, in
particular the quantum interface and ORG measurements, by a symbolic
information exchange. A principle of conservation of information is reviewed
and a measure of information flux through the interface is proposed.",1
"Computers are physical systems: what they can and cannot do is dictated by
the laws of physics. In particular, the speed with which a physical device can
process information is limited by its energy and the amount of information that
it can process is limited by the number of degrees of freedom it possesses.
This paper explores the physical limits of computation as determined by the
speed of MONEY, the ORG scale $MONEY and the gravitational constant
$PERSON As an example, quantitative bounds are put to the computational power of
an `ultimate laptop' with a mass of QUANTITY confined to a volume of CARDINAL
liter.","We study the computational strength of resetting $\alpha$-register machines,
a model of transfinite computability introduced by PERSON in \cite{K1}.
Specifically, we prove the following strengthening of a result from \cite{C}:
For an exponentially closed ordinal $\alpha$, we have
$PERSON if and only if
COMP$^{\text{ITRM}}_{\alpha}=L_{\alpha+1}\cap\mathfrak{P}(\alpha)$, i.e. if and
only if the set of $\alpha$-ITRM-computable subsets of $PERSON coincides with
the set of subsets of $PERSON in $PERSON, we show that, if
$PERSON is exponentially closed and $PERSON, then
COMP$^{\text{ITRM}}_{\alpha}=L_{\beta(\alpha)}\cap\mathfrak{P}(\alpha)$, where
$MONEY is the supremum of the $\alpha$-ITRM-clockable ordinals, which
coincides with the supremum of the $\alpha$-ITRM-computable ordinals. We also
determine the set of subsets of $PERSON computable by an $\alpha$-ITRM with
time bounded below $\delta$ when $PERSON is an exponentially closed
ordinal smaller than the supremum of the $\alpha$-ITRM-clockable ordinals.",0
"The necessary information for specifying a complex system may not be
completely accessible to us, i.e., to mathematical treatments. This is not to
be confounded with the incompleteness of our knowledge about whatever systems
or nature, since here information is our ignorance. In conventional statistics
and information theories, this information or ignorance is supposed completely
accessible to theoretical treatments connected with complete probability
distributions. However, the hypothesis of incomplete information supposes that
the information of certain systems can be incomplete as calculated in the usual
way as in the conventional information theories. This hypothesis has been used
in order to generalize the conventional statistics theory. The generalized
statistics and information theory characterized by an empirical parameter has
been proved useful for the formulation of the nonextensive statistical
mechanics based on Tsallis entropy, for the description of some correlated
ORG systems and for the derivation of the stationary probability
distributions of nonequilibrium complex systems evolving in hierarchical or
fractal phase space. In this paper, the incompleteness of the information will
be discussed from mathematical, physical and epistemological considerations
with an example of information calculation in fractal phase space with
stationary probability distribution.","This is an attempt to address diffusion phenomena from the point of view of
information theory. We imagine a regular hamiltonian system under the random
perturbation of thermal (molecular) noise and chaotic instability. The
irregularity of the random process produced in this way is taken into account
via the dynamic uncertainty measured by a path information associated with
different transition paths between CARDINAL points in phase space. According to the
result of our previous work, this dynamic system maximizes this uncertainty in
order to follow the action principle of mechanics. In this work, this
methodology is applied to particle diffusion in external potential field. By
using the exponential probability distribution of action (least action
distribution) yielded by maximum path information, a derivation of
ORG equation, PERSON's laws and PERSON's law for normal diffusion is given
without additional assumptions about the nature of the process. This result
suggests that, for irregular dynamics, the method of maximum path information,
instead of the least action principle for regular dynamics, should be used in
order to obtain the correct occurring probability of different paths of
transport. Nevertheless, the action principle is present in this formalism of
stochastic mechanics because the average action has a stationary associated
with the dynamic uncertainty. The limits of validity of this work is discussed.",1
"Exact wormhole solutions, while eagerly sought after, often have the
appearance of being overly specialized or highly artificial. A case for the
possible existence of traversable wormholes would be more compelling if an
abundance of solutions could be found. It is shown in this note that for many
of the wormhole geometries in the literature, the exact solutions obtained
imply the existence of large sets of additional solutions.","Recent studies have shown that (a) quantum effects may be sufficient to
support a wormhole throat and (b) the total amount of ""exotic matter"" can be
made arbitrarily small. Unfortunately, using only small amounts of exotic
matter may result in a wormhole that flares out too slowly to be traversable in
a reasonable length of time. Combined with the ORG constraints, the
wormhole may also come close to having an event horizon at the throat. This
paper examines a model that overcomes these difficulties, while satisfying the
usual traversability conditions. This model also confirms that the total amount
of exotic matter can indeed be made arbitrarily small.",1
"We consider supervised learning problems within the positive-definite kernel
framework, such as kernel ridge regression, kernel logistic regression or the
support vector machine. With kernels leading to infinite-dimensional feature
spaces, a common practical limiting difficulty is the necessity of computing
the kernel matrix, which most frequently leads to algorithms with running time
at least quadratic in the number of observations n, i.e., O(n^2). Low-rank
approximations of the kernel matrix are often considered as they allow the
reduction of running time complexities to O(p^2 n), where p is the rank of the
approximation. The practicality of such methods thus depends on the required
rank p. In this paper, we show that in the context of kernel PERSON regression,
for approximations based on a random subset of columns of the original kernel
matrix, the rank p may be chosen to be linear in the degrees of freedom
associated with the problem, a quantity which is classically used in the
statistical analysis of such methods, and is often seen as the implicit number
of parameters of non-parametric estimators. This result enables simple
algorithms that have sub-quadratic running time complexity, but provably
exhibit the same predictive performance than existing algorithms, for any given
problem instance, and not only for worst-case situations.","We consider the least-square linear regression problem with regularization by
the $\ell^1$-norm, a problem usually referred to as the PERSON. In this paper,
we ORDINAL present a detailed asymptotic analysis of model consistency of the
PERSON in low-dimensional settings. For various decays of the regularization
parameter, we compute asymptotic equivalents of the probability of correct
model selection. For a specific rate decay, we show that the PERSON selects all
the variables that should enter the model with probability tending to CARDINAL
exponentially fast, while it selects all other variables with strictly positive
probability. We show that this property implies that if we run the PERSON for
several bootstrapped replications of a given sample, then intersecting the
supports of the PERSON bootstrap estimates leads to consistent model selection.
This novel variable selection procedure, referred to as the PERSON, is
extended to high-dimensional settings by a provably consistent CARDINAL-step
procedure.",1
"Data aggregation in intermediate nodes (called aggregator nodes) is an
effective approach for optimizing consumption of scarce resources like
bandwidth and energy in Wireless Sensor Networks (WSNs). However, in-network
processing poses a problem for the privacy of the sensor data since individual
data of sensor nodes need to be known to the aggregator node before the
aggregation process can be carried out. In applications of WSNs,
privacy-preserving data aggregation has become an important requirement due to
sensitive nature of the sensor data. Researchers have proposed a number of
protocols and schemes for this purpose. He et al. (NORP DATE) have proposed
a protocol - called CPDA - for carrying out additive data aggregation in a
privacy-preserving manner for application in WSNs. The scheme has been quite
popular and well-known. In spite of the popularity of this protocol, it has
been found that the protocol is vulnerable to attack and it is also not
energy-efficient. In this paper, we ORDINAL present a brief state of the art
survey on the current privacy-preserving data aggregation protocols for ORG.
Then we describe the CPDA protocol and identify its security vulnerability.
Finally, we demonstrate how the protocol can be made secure and energy
efficient.","The Klein-Gordon - Schroedinger system with PERSON coupling is shown to have
a unique global solution for rough data, which not necessarily have finite
energy. The proof uses a generalized bilinear estimate of NORP type and
PERSON's idea to split the data into low and high frequency parts.",0
"It is found what part of the fixed-energy phase shifts allows one to recover
uniquely a compactly supported potential. For example, the knowledge of all
phase shifts with even angular momenta is sufficient to recover the above
potential.","The incomplete statistics for complex systems is characterized by a so called
incompleteness parameter $\omega$ which equals unity when information is
completely accessible to our treatment. This paper is devoted to the discussion
of the incompleteness of accessible information and of the physical
signification of $PERSON on the basis of fractal phase space. $PERSON is
shown to be proportional to the fractal dimension of the phase space and can be
linked to the phase volume expansion and information growth during the scale
refining process.",0
"These informal notes were prepared in connection with a lecture at a high
school mathematics tournament, and provide an overview of some examples of
metric spaces and a few of their basic properties.","In this paper the theory of semi-bounded rationality is proposed as an
extension of the theory of bounded rationality. In particular, it is proposed
that a decision making process involves CARDINAL components and these are the
correlation machine, which estimates missing values, and the causal machine,
which relates the cause to the effect. Rational decision making involves using
information which is almost always imperfect and incomplete as well as some
intelligent machine which if it is a human being is inconsistent to make
decisions. In the theory of bounded rationality this decision is made
irrespective of the fact that the information to be used is incomplete and
imperfect and the human brain is inconsistent and thus this decision that is to
be made is taken within the bounds of these limitations. In the theory of
semi-bounded rationality, signal processing is used to filter noise and
outliers in the information and the correlation machine is applied to complete
the missing information and artificial intelligence is used to make more
consistent decisions.",0
"There is no compelling reason imposing that the methods of statistical
mechanics should be restricted to the dynamical systems which follow the usual
ORG prescriptions. More specifically, ubiquitous natural and
artificial systems exhibit complex dynamics, for instance, generic stationary
states which are {ORG not} ergodic nor close to it, in any geometrically simple
subset of the {\it a priori} allowed phase space, in any (even extended)
trivial sense. A vast class of such systems appears, nevertheless, to be
tractable within thermostatistical methods completely analogous to the usual
ones. The question posed in the title arises then naturally. Some answer to
this complex question is advanced in the present review of nonextensive
statistical mechanics and its recent connections.","In this article, we study the mass spectrum of the scalar hidden charm and
hidden bottom tetraquark states which consist of the axial-axial type and the
vector-vector type diquark pairs with the ORG sum rules.",0
"We consider the relation between knowledge and certainty, where a fact is
known if it is true at all worlds an agent considers possible and is certain if
it holds with probability CARDINAL. We identify certainty with probabilistic belief.
We show that if we assume CARDINAL fixed probability assignment, then the logic
KD45, which has been identified as perhaps the most appropriate for belief,
provides a complete axiomatization for reasoning about certainty. Just as an
agent may believe a fact although phi is false, he may be certain that a fact
phi, is true although phi is false. However, it is easy to see that an agent
can have such false (probabilistic) beliefs only at a set of worlds of
probability 0. If we restrict attention to structures where all worlds have
positive probability, then PRODUCT provides a complete axiomatization. If we
consider a more general setting, where there might be a different probability
assignment at each world, then by placing appropriate conditions on the support
of the probability function (the set of worlds which have NORP
probability), we can capture many other well-known modal logics, such as T and
PRODUCT. Finally, we consider which axioms characterize structures satisfying
PERSON's principle.","In this paper, for foliations with spin leaves, we compute the spectral
action for sub-Dirac operators.",0
"The term quantum neural computing indicates a unity in the functioning of the
brain. It assumes that the neural structures perform classical processing and
that the virtual particles associated with the dynamical states of the
structures define the underlying quantum state. We revisit the concept and also
summarize new arguments related to the learning modes of the brain in response
to sensory input that may be aggregated in CARDINAL types: associative,
reorganizational, and quantum. The associative and reorganizational types are
quite apparent based on experimental findings; it is much harder to establish
that the brain as an entity exhibits quantum properties. We argue that the
reorganizational behavior of the brain may be viewed as inner adjustment
corresponding to its quantum behavior at the system level. Not only neural
structures but their higher abstractions also may be seen as whole entities. We
consider the dualities associated with the behavior of the brain and how these
dualities are bridged.","The state function of a quantum object is undetermined with respect to its
phase. This indeterminacy does not matter if it is global, but what if the
components of the state have unknown relative phases? Can useful computations
be performed in spite of this local indeterminacy? We consider this question in
relation to the problem of the rotation of a qubit and examine its broader
implications for ORG computing.",1
"CARDINAL possible escape from the ORG theorem is computational
complexity. For example, it is ORG-hard to compute if the NORP rule can be
manipulated. However, there is increasing concern that such results may not re
ect the difficulty of manipulation in practice. In this tutorial, I survey
recent results in this area.","Symmetry is an important factor in solving many constraint satisfaction
problems. CARDINAL common type of symmetry is when we have symmetric values. In a
recent series of papers, we have studied methods to break value symmetries. Our
results identify computational limits on eliminating value symmetry. For
instance, we prove that pruning all symmetric values is ORG-hard in general.
Nevertheless, experiments show that much value symmetry can be broken in
practice. These results may be useful to researchers in planning, scheduling
and other areas as value symmetry occurs in many different domains.",1
"The Shannon-Weaver model of linear information transmission is extended with
CARDINAL loops potentially generating redundancies: (i) meaning is provided locally
to the information from the perspective of hindsight, and (ii) meanings can be
codified differently and then refer to other horizons of meaning. Thus, CARDINAL
layers are distinguished: variations in the communications, historical
organization at each moment of time, and evolutionary self-organization of the
codes of communication over time. Furthermore, the codes of communication can
functionally be different and then the system is both horizontally and
vertically differentiated. All these subdynamics operate in parallel and
necessarily generate uncertainty. However, meaningful information can be
considered as the specific selection of a signal from the noise; the codes of
communication are social constructs that can generate redundancy by giving
different meanings to the same information. Reflexively, one can translate
among codes in more elaborate discourses. The ORDINAL (instantiating) layer can
be operationalized in terms of semantic maps using the vector space model; the
ORDINAL in terms of mutual redundancy among the latent dimensions of the vector
space. Using Blaise Cronin's {\oe}uvre, the different operations of the CARDINAL
layers are demonstrated empirically.","We information-theoretically reformulate CARDINAL measures of capacity from
statistical learning theory: empirical PERSON-entropy and empirical Rademacher
complexity. We show these capacity measures count the number of hypotheses
about a dataset that a learning algorithm falsifies when it finds the
classifier in its repertoire minimizing empirical risk. It then follows from
that the future performance of predictors on unseen data is controlled in part
by how many hypotheses the learner falsifies. As a corollary we show that
empirical PERSON-entropy quantifies the message length of the true hypothesis in
the optimal code of a particular probability distribution, the so-called actual
repertoire.",0
"We study the robustness of ORG computers under the influence of errors
modelled by strictly contractive channels. A channel $MONEY is defined to be
strictly contractive if, for any pair of density operators $MONEY in its
domain, $MONEY T\rho - T\sigma \|_1 \le k \| \rho-\sigma \|_1$ for MONEY
MONEY (here $MONEY \cdot GPE$ denotes the trace norm). In other words, strictly
contractive channels render the states of the computer less distinguishable in
the sense of quantum detection theory. Starting from the premise that all
experimental procedures can be carried out with finite precision, we argue that
there exists a physically meaningful connection between strictly contractive
channels and errors in physically realizable ORG computers. We show that,
in the absence of error correction, sensitivity of ORG and
computers to strictly contractive errors grows exponentially with storage time
and computation time respectively, and depends only on the constant $PERSON and the
measurement precision. We prove that strict contractivity rules out the
possibility of perfect error correction, and give an argument that approximate
error correction, which covers previous work on fault-tolerant quantum
computation as a special case, is possible.","We use entropy-energy arguments to assess the limitations on the running time
and on the system size, as measured in qubits, of noisy macroscopic
circuit-based ORG computers.",1
"The entropic form $S_q$ is, for any MONEY, {ORG nonadditive}. Indeed,
for CARDINAL probabilistically independent subsystems, it satisfies
$S_q(A+B)/k=[S_q(A)/k]+[S_q(B)/k]+(1-q)[S_q(A)/k][S_q(B)/k] \ne
S_q(A)/k+S_q(B)/k$. This form will turn out to be {\it extensive} for an
important class of nonlocal correlations, if $PERSON is set equal to a special
value different from unity, noted $PERSON (where $MONEY stands for MONEY).
In other words, for such systems, we verify that $S_{q_{ent}}(N) ORG
(N>>1)$, thus legitimating the use of the classical thermodynamical relations.
ORG systems, for which $PERSON is extensive, obviously correspond to
$PERSON complex systems exist in the sense that, for them, no value
of $q$ exists such that $S_q$ is extensive. Such systems are out of the present
scope: they might need forms of entropy different from $PERSON, or perhaps --
more plainly -- they are just not susceptible at all for some sort of
thermostatistical approach. Consistently with the results associated with
$PERSON, the $q$-generalizations of LOC and of its extended
ORG form have been achieved. These recent theorems could of course
be the cause of the ubiquity of $q$-exponentials, $PERSON and related
mathematical forms in natural, artificial and social systems. All of the above,
as well as presently available experimental, observational and computational
confirmations -- in high energy physics and elsewhere --, are briefly reviewed.
Finally, we address a confusion which is quite common in the literature, namely
referring to distinct physical mechanisms {\it versus} distinct regimes of a
single physical mechanism.","Increasing the number $MONEY of elements of a system typically makes the entropy
to increase. The question arises on {\it what particular entropic form} we have
in mind and {\it how it increases} with $MONEY Thermodynamically speaking it
makes sense to choose an entropy which increases {ORG linearly} with $MONEY for
large $MONEY, i.e., which is {\it extensive}. If the $MONEY elements are
probabilistically {ORG independent} (no interactions) or quasi-independent
(e.g., {\it short}-range interacting), it is known that the entropy which is
extensive is that of ORG, $S_{BG} \equiv -k \sum_{i=1}^W
p_i \ln p_i$. If they are however {ORG globally correlated} (e.g., through {ORG
long}-range interactions), the answer depends on the particular nature of the
correlations. There is a large class of correlations (in CARDINAL way or another
related to scale-invariance) for which an appropriate entropy is that on which
nonextensive statistical mechanics is based, i.e., $S_q \equiv k
\frac{1-\sum_{i=1}^W p_i^q}{q-1}$ ($S_1=S_{BG}$), where $q$ is determined by
the specific correlations. We briefly review and illustrate these ideas through
simple examples of occupation of phase space. A very similar scenario emerges
with regard to the central limit theorem. We present some numerical indications
along these lines. The full clarification of such a possible connection would
help qualifying the class of systems for which the nonextensive statistical
concepts are applicable, and, concomitantly, it would enlighten the reason for
which $q$-exponentials are ubiquitous in many natural and artificial systems.",1
"We consider the problem of constructing confidence intervals for
nonparametric functional data analysis using empirical likelihood. In this
doubly infinite-dimensional context, we demonstrate the ORG's phenomenon and
propose a bias-corrected construction that requires neither undersmoothing nor
direct bias estimation. We also extend our results to partially linear
regression involving functional data. Our numerical results demonstrated the
improved performance of empirical likelihood over approximation based on
asymptotic normality.","Effective regularisation during training can mean the difference between
success and failure for deep neural networks. Recently, dither has been
suggested as alternative to dropout for regularisation during batch-averaged
stochastic gradient descent (SGD). In this article, we show that these methods
fail without batch averaging and we introduce a new, parallel regularisation
method that may be used without batch averaging. Our results for
parallel-regularised non-batch-SGD are substantially better than what is
possible with batch-SGD. Furthermore, our results demonstrate that dither and
dropout are complimentary.",0
"We consider the least-square linear regression problem with regularization by
the l1-norm, a problem usually referred to as the PERSON. In this paper, we
present a detailed asymptotic analysis of model consistency of the PERSON. For
various decays of the regularization parameter, we compute asymptotic
equivalents of the probability of correct model selection (i.e., variable
selection). For a specific rate decay, we show that the PERSON selects all the
variables that should enter the model with probability tending to CARDINAL
exponentially fast, while it selects all other variables with strictly positive
probability. We show that this property implies that if we run the PERSON for
several bootstrapped replications of a given sample, then intersecting the
supports of the PERSON bootstrap estimates leads to consistent model selection.
This novel variable selection algorithm, referred to as the PERSON, is
compared favorably to other linear regression methods on synthetic data and
datasets from the ORG machine learning repository.","We consider the minimization of submodular functions subject to ordering
constraints. We show that this optimization problem can be cast as a convex
optimization problem on a space of uni-dimensional measures, with ordering
constraints corresponding to ORDINAL-order stochastic dominance. We propose new
discretization schemes that lead to simple and efficient algorithms based on
CARDINAL-th, ORDINAL, or higher order oracles; these algorithms also lead to
improvements without isotonic constraints. Finally, our experiments show that
non-convex loss functions can be much more robust to outliers for isotonic
regression, while still leading to an efficient optimization problem.",1
"The paper considers a linear model with grouped explanatory variables. If the
model errors are not with CARDINAL mean and bounded variance or if model contains
outliers, then the least squares framework is not appropriate. Thus, the
quantile regression is an interesting alternative. In order to automatically
select the relevant variable groups, we propose and study here the adaptive
group ORG quantile estimator. We establish the sparsity and asymptotic
normality of the proposed estimator in CARDINAL cases: fixed number and divergent
number of variable groups. Numerical study by PERSON simulations confirms
the theoretical results and illustrates the performance of the proposed
estimator.","Both unconstrained and constrained minimax single facility location problems
are considered in multidimensional space with ORG distance. A new
solution approach is proposed within the framework of idempotent algebra to
reduce the problems to solving ORG vector equations and minimizing
functionals defined on some idempotent semimodule. The approach offers a
solution in a closed form that actually involves performing matrix-vector
multiplications in terms of idempotent algebra for appropriate matrices and
vectors. To illustrate the solution procedures, numerical and graphical
examples of CARDINAL-dimensional problems are given.",0
"The polytropic hydrodynamic vortex describes an effective $(DATE
acoustic spacetime with an inner reflecting boundary at $r=r_{\text{c}}$. This
physical system, like the spinning ORG black hole, possesses an ergoregion of
radius $r_{\text{e}}$ and an inner non-pointlike curvature singularity of
radius $r_{\text{s}}$. Interestingly, the fundamental ratio
$r_{\text{e}}/r_{\text{s}}$ which characterizes the effective geometry is
determined solely by the dimensionless polytropic index $N_{\text{p}}$ of the
circulating fluid. It has recently been proved that, in the MONEY
case, the effective acoustic spacetime is characterized by an {ORG infinite}
countable set of reflecting surface radii,
$PERSON, that can support static
(marginally-stable) sound modes. In the present paper we use {ORG analytical}
techniques in order to explore the physical properties of the polytropic
hydrodynamic vortex in the MONEY regime. In particular, we prove
that in this physical regime, the effective acoustic spacetime is characterized
by a {\it finite} discrete set of reflecting surface radii,
$PERSON=ORG, that can support
the marginally-stable static sound modes (here $m$ is the azimuthal harmonic
index of the NORP perturbation field). Interestingly, it is proved
analytically that the dimensionless outermost supporting radius
$PERSON, which marks the onset of superradiant
instabilities in the polytropic hydrodynamic vortex, increases monotonically
with increasing values of the integer harmonic index $m$ and decreasing values
of the dimensionless polytropic index $GPE","Einstein-matter theories in which hairy black-hole configurations have been
found are studied. We prove that the nontrivial behavior of the hair must
extend beyond the null circular orbit (the photonsphere) of the corresponding
spacetime. We further conjecture that the region above the photonsphere
contains PERCENT of the total hair's mass. We support this conjecture with
analytical and numerical results.",1
"The wavelet regression detrended fluctuations of the reconstructed
temperature for DATE: DATE (LOC
ice cores isotopic data), exhibit clear evidences of the galactic turbulence
modulation DATE time-scales. The observed strictly NORP
turbulence features indicates the NORP nature of galactic turbulence, and
provide explanation to random-like fluctuations of the global temperature on
the millennial time scales.","It is shown that the periodic alteration of DATE provides a chaotic
dissipation mechanism for LOC (ORG) and NORP (ORG) climate
oscillations. The wavelet regression detrended DATE ORG index for DATE and DATE ORG for DATE as well as an analytical continuation in
the complex time domain were used for this purpose.",1
"Higher-order tensor decompositions are analogous to the familiar ORG (ORG), but they transcend the limitations of matrices
(ORDINAL-order tensors). ORG is a powerful tool that has achieved impressive
results in information retrieval, collaborative filtering, computational
linguistics, computational vision, and other fields. However, ORG is limited to
CARDINAL-dimensional arrays of data (CARDINAL modes), and many potential applications
have CARDINAL or more modes, which require higher-order tensor decompositions.
This paper evaluates CARDINAL algorithms for higher-order tensor decomposition:
ORG (HO-SVD), ORG, ORG (SP), and ORG (MP). We
measure the time (elapsed run time), space (ORG and disk space requirements),
and fit (tensor reconstruction accuracy) of the CARDINAL algorithms, under a
variety of conditions. We find that standard implementations of HO-SVD and ORG
do not scale up to larger tensors, due to increasing ORG requirements. We
recommend HOOI for tensors that are small enough for the available ORG and MP
for larger tensors.","PERSON has argued that a disembodied computer is incapable of passing
WORK_OF_ART that includes subcognitive questions. Subcognitive questions are
designed to probe the network of cultural and perceptual associations that
humans naturally develop as we live, embodied and embedded in the world. In
this paper, I show how it is possible for a disembodied computer to answer
subcognitive questions appropriately, contrary to NORP's claim. My approach
to answering subcognitive questions is to use statistical information extracted
from a very large collection of text. In particular, I show how it is possible
to answer a sample of subcognitive questions taken from NORP, by issuing
queries to a search engine that indexes CARDINAL Web pages. This
simple algorithm may shed light on the nature of human (sub-) cognition, but
the scope of this paper is limited to demonstrating that NORP is mistaken: a
disembodied computer can answer subcognitive questions.",1
"ORG computation is the suitable orthogonal encoding of possibly holistic
functional properties into state vectors, followed by a projective measurement.","The paper presents an extension of FAC fuzzy entropy for intuitionistic
fuzzy one. ORDINAL, we presented a new formula for calculating the distance and
similarity of intuitionistic fuzzy information. Then, we constructed measures
for information features like score, certainty and uncertainty. Also, a new
concept was introduced, namely escort fuzzy information. Then, using the escort
fuzzy information, ORG's formula for intuitionistic fuzzy information was
obtained. It should be underlined that FAC's entropy for intuitionistic
fuzzy information verifies the CARDINAL defining conditions of intuitionistic fuzzy
uncertainty. The measures of its CARDINAL components were also identified: fuzziness
(ambiguity) and incompleteness (ignorance).",0
"A typical oracle problem is finding which software program is installed on a
computer, by running the computer and testing its input-output behaviour. The
program is randomly chosen from a set of programs known to the problem solver.
As well known, some oracle problems are solved more efficiently by using
ORG algorithms; this naturally implies changing the computer to ORG,
while the choice of the software program remains sharp. In order to highlight
the non-mechanistic origin of this higher efficiency, also the uncertainty
about which program is installed must be represented in a quantum way.","Computation is currently seen as a forward propagator that evolves (retards)
a completely defined initial vector into a corresponding final vector. Initial
and final vectors map the (logical) input and output of a reversible GPE
network respectively, whereas forward propagation maps a CARDINAL-way propagation of
logical implication, from input to output. Conversely, hard ORG-complete
problems are characterized by a CARDINAL-way propagation of logical implication from
input to output and vice versa, given that both are partly defined from the
beginning. Logical implication can be propagated forward and backward in a
computation by constructing the gate array corresponding to the entire
reversible GPE network and constraining output bits as well as input bits.
The possibility of modeling the physical process undergone by such a network by
using a retarded and advanced in time propagation scheme is investigated. PACS
numbers: CARDINAL, GPE, 03.65.-w, CARDINAL",1
"This paper describes ORG (ORG) Word Sense
Disambiguation (ORG) system, as applied to ORG)
task in Senseval-3. The ORG system approaches ORG as a classical supervised
machine learning problem, using familiar tools such as the PERSON machine
learning software and ORG's rule-based part-of-speech tagger. Head words are
represented as feature vectors with CARDINAL features. CARDINAL of the features are syntactic and the other CARDINAL are semantic. The main
novelty in the system is the method for generating the semantic features, based
on word \hbox{co-occurrence} probabilities. The probabilities are estimated
using the Waterloo MultiText System with a corpus of CARDINAL terabyte of
unlabeled text, collected by a web crawler.","This position paper argues that the PERSON effect is widely misunderstood by
the evolutionary computation community. The misunderstandings appear to fall
into CARDINAL general categories. ORDINAL, it is commonly believed that the PERSON
effect is concerned with the synergy that results when there is an evolving
population of learning individuals. This is CARDINAL of the story. The full
story is more complicated and more interesting. The PERSON effect is concerned
with the costs and benefits of lifetime learning by individuals in an evolving
population. Several researchers have focussed exclusively on the benefits, but
there is much to be gained from attention to the costs. This paper explains the
CARDINAL sides of the story and enumerates CARDINAL of the costs and benefits of lifetime
learning by individuals in an evolving population. ORDINAL, there is a cluster
of misunderstandings about the relationship between the PERSON effect and
GPE inheritance of acquired characteristics. The PERSON effect is not
GPE. A NORP algorithm is not better for most evolutionary
computing problems than a NORP algorithm. Finally, NORP inheritance
is not a better model of memetic (cultural) evolution than the PERSON effect.",1
"Applications in machine learning and data mining require computing pairwise
Lp distances in a data matrix A. For massive high-dimensional data, computing
all pairwise distances of A can be infeasible. In fact, even storing A or all
pairwise distances of A in the memory may be also infeasible. This paper
proposes a simple method for p = CARDINAL, DATE, DATE, ... We ORDINAL decompose the l_p (where
p is even) distances into a sum of CARDINAL marginal norms and p-1 ``inner products''
at different orders. Then we apply normal or sub-Gaussian random projections to
approximate the resultant ``inner products,'' assuming that the marginal norms
can be computed exactly by a ORG scan. We propose CARDINAL strategies for
applying random projections. The basic projection strategy requires CARDINAL
projection matrix but it is more difficult to analyze, while the alternative
projection strategy requires p-1 projection matrices but its theoretical
analysis is much easier. In terms of the accuracy, at least for p=4, the basic
strategy is always more accurate than the alternative strategy if the data are
non-negative, which is common in reality.","Compressed Counting (ORG) [CARDINAL] was recently proposed for estimating the ath
frequency moments of data streams, where 0 < a <= CARDINAL. CC can be used for
estimating FAC entropy, which can be approximated by certain functions of
the ath frequency moments as a -> CARDINAL. Monitoring Shannon entropy for ORG (e.g., GPE attacks) in large networks is an important task. This
paper presents a new algorithm for improving ORG. The improvement is most
substantial when a -> 1--. For example, when a = DATE, the new algorithm
reduces the estimation variance roughly by CARDINAL. This new algorithm would
make ORG considerably more practical for estimating FAC entropy.
Furthermore, the new algorithm is statistically optimal when a = CARDINAL.",1
"An approach to schedule development in project management is developed within
the framework of idempotent algebra. The approach offers a way to represent
precedence relationships among activities in projects as ORG vector
equations in terms of an idempotent semiring. As a result, many issues in
project scheduling reduce to solving computational problems in the idempotent
algebra setting, including ORG equations and eigenvalue-eigenvector
problems. The solutions to the problems are given in a compact vector form that
provides the basis for the development of efficient computation procedures and
related software applications.","A ORG vector equation is considered defined in terms of idempotent
mathematics. To solve the equation, we apply an approach that is based on the
analysis of distances between vectors in idempotent vector spaces and reduces
the solution of the equation to that of a tropical optimization problem. Based
on the approach, existence and uniqueness conditions are established for the
solution, and a general solution to the equation is given.",1
"This paper describes how the ""SP Theory of Intelligence"" with the ""SP
Computer Model"", outlined in an PRODUCT, may throw light on aspects of
commonsense reasoning (ORG) and commonsense knowledge (ORG), as discussed in
another paper by PERSON and PERSON (DM). In CARDINAL main sections, the
paper describes: CARDINAL) The main problems to be solved; CARDINAL) Other research on ORG
and ORG; CARDINAL) Why the NORP system may prove useful with ORG and ORG 4) How examples
described by DM may be modelled in the NORP system. With regard to successes in
the automation of ORG described by ORG, the NORP system's strengths in
simplification and integration may promote seamless integration across these
areas, and seamless integration of those area with other aspects of
intelligence. In considering challenges in the automation of ORG described by
DM, the paper describes in detail, with examples of NORP-multiple-alignments. how
the NORP system may model processes of interpretation and reasoning arising from
the horse's head scene in ""WORK_OF_ART"" film. A solution is presented to the
'long tail' problem described by ORG. The NORP system has some potentially useful
things to say about several of ORG's objectives for research in ORG and ORG.","We consider nonparametric functional regression when both predictors and
responses are functions. More specifically, we let $(X_1,Y_1),...,(X_n,PERSON be
random elements in $\mathcal{F}\times\mathcal{H}$ where $\mathcal{F}$ is a
semi-metric space and $\mathcal{H}$ is a separable PERSON space. Based on a
recently introduced notion of weak dependence for functional data, we showed
the almost sure convergence rates of both the Nadaraya-Watson estimator and the
nearest neighbor estimator, in a unified manner. Several factors, including
functional nature of the responses, the assumptions on the functional variables
using the NORP norm and the desired generality on DATE dependent data, make
the theoretical investigations more challenging and interesting.",0
"We present a contextualist statistical realistic model for quantum-like
representations in physics, cognitive science and psychology. We apply this
model to describe cognitive experiments to check quantum-like structures of
mental processes. The crucial role is played by interference of probabilities
for mental observables. Recently one of such experiments based on recognition
of images was performed. This experiment confirmed our prediction on
quantum-like behaviour of mind. In our approach ``quantumness of mind'' has no
direct relation to the fact that the brain (as any physical body) is composed
of ORG particles. We invented a new terminology ``quantum-like (PERSON) mind.''
Cognitive QL-behaviour is characterized by nonzero coefficient of interference
$MONEY This coefficient can be found on the basis of statistical data.
There is predicted not MONEY \theta$-interference of probabilities, but
also hyperbolic $PERSON \theta$-interference. This interference was never
observed for physical systems, but we could not exclude this possibility for
cognitive systems. We propose a model of brain functioning as PERSON-computer
(there is discussed difference between ORG and PERSON computers).","The aim of this paper is to apply a contextual probabilistic model (in the
spirit of ORG, Gudder, GPE) to represent and to generalize some
results of ORG logic about possible macroscopic quantum-like (PERSON)
behaviour. The crucial point is that our model provides PERSON-representation of
macroscopic configurations in terms of complex probability amplitudes -- wave
functions of such configurations. Thus, instead of the language of propositions
which is common in quatum logic, we use the language of wave functions which is
common in the conventional presentation of QM. We propose a quantum-like
representation algorithm, ORG, which maps probabilistic data of any origin in
complex (or even hyperbolic) PERSON space. On the one hand, this paper
clarifyes some questions in foundations of QM, since some rather mystical
quantum features are illustrated on the basis of behavior of macroscopic
systems. On the other hand, the approach developed in this paper may be used
e.g. in biology, sociology, or psychology. Our example of PERSON-representation of
hidden macroscopic configurations can find natural applications in those
domains of science.",1
"The paper considers a linear regression model in high-dimension for which the
predictive variables can change the influence on the response variable at
unknown times (called change-points). Moreover, the particular case of the
heavy-tailed errors is considered. In this case, least square method with ORG
or adaptive ORG penalty can not be used since the theoretical assumptions do
not occur or the estimators are not robust. Then, the quantile model with SCAD
penalty or median regression with ORG-type penalty allows, in the same time,
to estimate the parameters on every segment and eliminate the irrelevant
variables. We show that, for the CARDINAL penalized estimation methods, the oracle
properties is not affected by the change-point estimation. Convergence rates of
the estimators for the change-points and for the regression parameters, by the
CARDINAL methods are found. PERSON simulations illustrate the performance of
the methods.","We propose a general adaptive ORG method for a quantile regression model.
Our method is very interesting when we know nothing about the ORDINAL CARDINAL moments
of the model error. We ORDINAL prove that the obtained estimators satisfy the
oracle properties, which involves the relevant variable selection without using
hypothesis test. Next, we study the proposed method when the (multiphase) model
changes to unknown observations called change-points. Convergence rates of the
change-points and of the regression parameters estimators in each phase are
found. The sparsity of the adaptive ORG quantile estimators of the regression
parameters is not affected by the change-points estimation. If the phases
number is unknown, a consistent criterion is proposed. Numerical studies by
PERSON simulations show the performance of the proposed method, compared
to other existing methods in the literature, for models with a single phase or
for multiphase models. The adaptive ORG quantile method performs better than
known variable selection methods, as the least squared method with adaptive
ORG penalty, $PERSON with ORG-type penalty and quantile method with
SCAD penalty.",1
"The semantic mapping problem is probably the main obstacle to
computer-to-computer communication. If computer A knows that its concept X is
the same as computer B's concept Y, then the CARDINAL machines can communicate. They
will in effect be talking the same language. This paper describes a relatively
straightforward way of enhancing the semantic descriptions of Web Service
interfaces by using online sources of keyword definitions. Method interface
descriptions can be enhanced using these standard dictionary definitions.
Because the generated metadata is now standardised, this means that any other
computer that has access to the same source, or understands standard language
concepts, can now understand the description. This helps to remove a lot of the
heterogeneity that would otherwise build up though humans creating their own
descriptions independently of each other. The description comes in the form of
an ORG script that can be retrieved and read through the Web Service interface
itself. An additional use for these scripts would be for adding descriptions in
different languages, which would mean that human users that speak a different
language would also understand what the service was about.","We use traced monoidal categories to give a precise general version of
""geometry of interaction"". We give a number of examples of both
""particle-style"" and ""wave-style"" instances of this construction. We relate
these ideas to semantics of computation.",0
"Causal models defined in terms of a collection of equations, as defined by
PRODUCT, are axiomatized here. Axiomatizations are provided for CARDINAL
successively more general classes of causal models: (CARDINAL) the class of recursive
theories (those without feedback), (CARDINAL) the class of theories where the
solutions to the equations are unique, (CARDINAL) arbitrary theories (where the
equations may not have solutions and, if they do, they are not necessarily
unique). It is shown that to reason about causality in the most general ORDINAL
class, we must extend the language used by GPE and GPE. In addition, the
complexity of the decision procedures is examined for all the languages and
classes of models considered.","The paper describes clustering problems from the combinatorial viewpoint. A
brief systemic survey is presented including the following: (i) basic
clustering problems (e.g., classification, clustering, sorting, clustering with
an order over cluster), (ii) basic approaches to assessment of objects and
object proximities (i.e., scales, comparison, aggregation issues), (iii) basic
approaches to evaluation of local quality characteristics for clusters and
total quality characteristics for clustering solutions, (iv) clustering as
multicriteria optimization problem, (v) generalized modular clustering
framework, (vi) basic clustering models/methods (e.g., hierarchical clustering,
k-means clustering, minimum spanning tree based clustering, clustering as
assignment, detection of clisue/quasi-clique based clustering, correlation
clustering, network communities based clustering), Special attention is
targeted to formulation of clustering as multicriteria optimization models.
Combinatorial optimization models are used as auxiliary problems (e.g.,
assignment, partitioning, knapsack problem, multiple choice problem,
morphological clique problem, searching for consensus/median for structures).
Numerical examples illustrate problem formulations, solving methods, and
applications. The material can be used as follows: (a) a research survey, (b) a
fundamental for designing the structure/architecture of composite modular
clustering software, (c) a bibliography reference collection, and (d) a
tutorial.",0
"The do-calculus was developed in DATE to facilitate the identification of
causal effects in non-parametric models. The completeness proofs of [PERSON and
GPE, DATE] and [PERSON and GPE, DATE] and the graphical criteria of
[PERSON and PERSON, DATE] have laid this identification problem to rest. Recent
explorations unveil the usefulness of the do-calculus in CARDINAL additional
areas: mediation analysis [PRODUCT, DATE], transportability [PRODUCT and
LOC, DATE] and metasynthesis. Meta-synthesis (freshly coined) is the
task of fusing empirical results from several diverse studies, conducted on
heterogeneous populations and under different conditions, so as to synthesize
an estimate of a causal relation in some target environment, potentially
different from those under study. The talk surveys these results with emphasis
on the challenges posed by meta-synthesis. For background material, see
http://bayes.cs.ucla.edu/csl_papers.html","Certain causal models involving unmeasured variables induce no independence
constraints among the observed variables but imply, nevertheless, inequality
contraints on the observed distribution. This paper derives a general formula
for such instrumental variables, that is, exogenous variables that directly
affect some variables but not all. With the help of this formula, it is
possible to test whether a model involving instrumental variables may account
for the data, or, conversely, whether a given variables can be deemed
instrumental.",1
"Our general motivation is to answer the question: ""What is a model of
concurrent computation?"". As a preliminary exercise, we study dataflow
networks. We develop a very general notion of model for asynchronous networks.
The ""Kahn Principle"", which states that a network built from functional nodes
is the least fixpoint of a system of equations associated with the network, has
become a benchmark for the formal study of dataflow networks. We formulate a
generalized version of PERSON, which applies to a large class of
non-deterministic systems, in the setting of abstract asynchronous networks;
and prove that the Kahn Principle holds under certain natural assumptions on
the model. We also show that a class of models, which represent networks that
compute over arbitrary event structures, generalizing dataflow networks which
compute over streams, satisfy these assumptions.","Process algebra has been successful in many ways; but we don't yet see the
lineaments of a fundamental theory. Some fleeting glimpses are sought from
ORG, physics and geometry.",1
"This paper describes a roadmap for the development of ORG"", based
on EVENT"" and its realisation in the ""WORK_OF_ART"". ORG will be developed initially as a software virtual
machine with high levels of parallel processing, hosted on a high-performance
computer. The system should help users visualise knowledge structures and
processing. Research is needed into how the system may discover low-level
features in speech and in images. Strengths of ORG in the processing
of natural language may be augmented, in conjunction with the further
development of ORG strengths in unsupervised learning. Strengths of
the SP System in pattern recognition may be developed for computer vision. Work
is needed on the representation of numbers and the performance of arithmetic
processes. A computer model is needed of ""ORG"", the version of the NORP
Theory expressed in terms of neurons and their inter-connections. The SP
Machine has potential in many areas of application, several of which may be
realised on short-to-medium timescales.","These informal notes deal with a number of questions related to sums and
integrals in analysis.",0
"En effective chiral theory of large N_C QCD of pseudoscalar, vector, and
axial-vector mesons is reviewed.","This article provides a brief introduction to WORK_OF_ART""
and its realisation in ORG"". The overall goal of the NORP
programme of research, in accordance with long-established principles in
science, has been the simplification and integration of observations and
concepts across artificial intelligence, mainstream computing, mathematics, and
human learning, perception, and cognition. In broad terms, the NORP system is a
brain-like system that takes in ""New"" information through its senses and stores
some or all of it as ""Old"" information. A central idea in the system is the
powerful concept of ""SP-multiple-alignment"", borrowed and adapted from
bioinformatics. This the key to the system's versatility in aspects of
intelligence, in the representation of diverse kinds of knowledge, and in the
seamless integration of diverse aspects of intelligence and diverse kinds of
knowledge, in any combination. There are many potential benefits and
applications of the NORP system. It is envisaged that the system will be
developed as ORG"", which will initially be a software virtual
machine, hosted on a high-performance computer, a vehicle for further research
and a step towards the development of an industrial-strength ORG.",0
"In Pe\~na (DATE), MCMC sampling is applied to approximately calculate the
ratio of essential graphs (EGs) to directed acyclic graphs (DAGs) for CARDINAL
nodes. In the present paper, we extend that work from CARDINAL nodes. We also
extend that work by computing the approximate ratio of connected EGs to
connected DAGs, of connected EGs to EGs, and of connected DAGs to DAGs.
Furthermore, we prove that the latter ratio is asymptotically CARDINAL. We also
discuss the implications of these results for learning DAGs from data.","This paper deals with chain graphs under the classic
ORG interpretation. We prove that the regular NORP
distributions that factorize with respect to a chain graph $MONEY with $MONEY
parameters have positive PERSON measure with respect to $PERSON,
whereas those that factorize with respect to $MONEY but are not faithful to it
have CARDINAL PERSON measure with respect to $\mathbb{R}^d$. This means that, in
the measure-theoretic sense described, almost all the regular NORP
distributions that factorize with respect to $MONEY are faithful to it.",1
"The article describes a special time-interval balancing in multi-processor
scheduling of composite modular jobs. This scheduling problem is close to
just-in-time planning approach. ORDINAL, brief literature surveys are presented
on just-in-time scheduling and due-data/due-window scheduling problems.
Further, the problem and its formulation are proposed for the time-interval
balanced scheduling of composite modular jobs. The illustrative real world
planning example for modular home-building is described. Here, the main
objective function consists in a balance between production of the typical
building modules (details) and the assembly processes of the building(s) (by
several teams). The assembly plan has to be modified to satisfy the balance
requirements. The solving framework is based on the following: (i) clustering
of initial set of modular detail types to obtain CARDINAL basic detail types
that correspond to main manufacturing conveyors; (ii) designing a preliminary
plan of assembly for buildings; (iii) detection of unbalanced time periods,
(iv) modification of the planning solution to improve the schedule balance. The
framework implements a metaheuristic based on local optimization approach. CARDINAL
other applications (supply chain management, information transmission systems)
are briefly described.","The article contains a preliminary glance at balanced clustering problems.
Basic balanced structures and combinatorial balanced problems are briefly
described. A special attention is targeted to various balance/unbalance indices
(including some new versions of the indices): by cluster cardinality, by
cluster weights, by inter-cluster edge/arc weights, by cluster element
structure (for element multi-type clustering). Further, versions of
optimization clustering problems are suggested (including NORP problem
formulations). Illustrative numerical examples describe calculation of balance
indices and element multi-type balance clustering problems (including example
for design of student teams).",1
"It is considered an interdependence of the theory of ORG computing and
some perspective information technologies. A couple of illustrative and useful
examples are discussed. The reversible computing from very beginning had the
serious impact on the design of ORG computers and it is revisited ORDINAL.
Some applications of ternary circuits are also quite instructive and it may be
useful in the ORG information theory.","This work recollects a non-universal set of ORG gates described by
higher-dimensional Spin groups. They are also directly related with matchgates
in theory of quantum computations and complexity. Various processes of quantum
state distribution along a chain such as perfect state transfer and different
types of quantum walks can be effectively modeled on classical computer using
such approach.",1
"This paper studies whether rationality can be computed. Rationality is
defined as the use of complete information, which is processed with a perfect
biological or physical brain, in an optimized fashion. To compute rationality
one needs to quantify how complete is the information, how perfect is the
physical or biological brain and how optimized is the entire decision making
system. The rationality of a model (i.e. physical or biological brain) is
measured by the expected accuracy of the model. The rationality of the
optimization procedure is measured as the ratio of the achieved objective (i.e.
utility) to the global objective. The overall rationality of a decision is
measured as the product of the rationality of the model and the rationality of
the optimization procedure. The conclusion reached is that rationality can be
computed for convex optimization problems.","This paper introduces the concept of rational countefactuals which is an idea
of identifying a counterfactual from the factual (whether perceived or real)
that maximizes the attainment of the desired consequent. In counterfactual
thinking if we have a factual statement like: PERSON invaded GPE and
consequently PERSON declared war on GPE then its counterfactuals is: If
PERSON did not invade GPE then PERSON would not have declared
war on GPE. The theory of rational counterfactuals is applied to identify the
antecedent that gives the desired consequent necessary for rational decision
making. The rational countefactual theory is applied to identify the values of
variables Allies, Contingency, Distance, ORG, Capability, Democracy, as
well as Economic Interdependency that gives the desired consequent Peace.",1
"Clarithmetics are number theories based on computability logic (see
http://www.csc.villanova.edu/~japaridz/CL/ ). Formulas of these theories
represent interactive computational problems, and their ""truth"" is understood
as existence of an algorithmic solution. Various complexity constraints on such
solutions induce various versions of clarithmetic. The present paper introduces
a parameterized/schematic version PRODUCT). By tuning the CARDINAL
parameters P1,P2,P3 in an essentially mechanical manner, CARDINAL automatically
obtains sound and complete theories with respect to a wide range of target
tricomplexity classes, i.e. combinations of time (set by ORG), space (set by PERSON)
and so called amplitude (set by CARDINAL) complexities. Sound in the sense that every
theorem T of the system represents an interactive number-theoretic
computational problem with a solution from the given tricomplexity class and,
furthermore, such a solution can be automatically extracted from a proof of NORP
And complete in the sense that every interactive number-theoretic problem with
a solution from the given tricomplexity class is represented by some theorem of
the system. Furthermore, through tuning the ORDINAL parameter CARDINAL, at the cost of
sacrificing recursive axiomatizability but not simplicity or elegance, the
above extensional completeness can be strengthened to intensional completeness,
according to which every formula representing a problem with a solution from
the given tricomplexity class is a theorem of the system. This article is
published in CARDINAL parts. The present Part I introduces the system and proves its
completeness, while Part II is devoted to proving soundness.","An effective theory of large FAC of mesons has been used to study CARDINAL
K_{l4} decay modes. It has been found that the matrix elements of the
axial-vector current dominate the K_{l4} decays. PCAC is satisfied. A
relationship between CARDINAL form factors of axial-vector current has been found.
ORG phase shifts are originated in \rho-->\pi\pi. The decay rates are
calculated in the chiral limit. In this study there is no adjustable parameter.",0
"This paper introduces ORG (ORG), a method for
measuring relational similarity. ORG has potential applications in many areas,
including information extraction, word sense disambiguation, machine
translation, and information retrieval. Relational similarity is correspondence
between relations, in contrast with attributional similarity, which is
correspondence between attributes. When CARDINAL words have a high degree of
attributional similarity, we call them synonyms. When CARDINAL pairs of words have a
high degree of relational similarity, we say that their relations are
analogous. For example, the word pair mason/stone is analogous to the pair
carpenter/wood. Past work on semantic similarity measures has mainly been
concerned with attributional similarity. Recently ORG (VSM)
of information retrieval has been adapted to the task of measuring relational
similarity, achieving a score of PERCENT on a collection of CARDINAL college-level
multiple-choice word analogy questions. In the ORG approach, the relation
between a pair of words is characterized by a vector of frequencies of
predefined patterns in a large corpus. ORG extends the ORG approach in CARDINAL
ways: (CARDINAL) the patterns are derived automatically from the corpus (they are not
predefined), (CARDINAL) ORG (ORG) is used to smooth the
frequency data (it is also used this way in ORG), and (CARDINAL)
automatically generated synonyms are used to explore reformulations of the word
pairs. ORG achieves PERCENT on the CARDINAL analogy questions, statistically equivalent
to the average human score of PERCENT. On the related problem of classifying
noun-modifier relations, ORG achieves similar gains over the ORG, while using a
smaller corpus.","In this paper, we review CARDINAL heuristic strategies for handling
context-sensitive features in supervised machine learning from examples. We
discuss CARDINAL methods for recovering lost (implicit) contextual information. We
mention some evidence that hybrid strategies can have a synergetic effect. We
then show how the work of several machine learning researchers fits into this
framework. While we do not claim that these strategies exhaust the
possibilities, it appears that the framework includes all of the techniques
that can be found in the published literature on contextsensitive learning.",1
"In this paper we develop a method for report level tracking based on
PERSON clustering using ORG spin neural networks where clusters of
incoming reports are gradually fused into existing tracks, CARDINAL cluster for each
track. Incoming reports are put into a cluster and continuous reclustering of
older reports is made in order to obtain maximum association fit within the
cluster and towards the track. Over time, the oldest reports of the cluster
leave the cluster for the fixed track at the same rate as new incoming reports
are put into it. Fusing reports to existing tracks in this fashion allows us to
take account of both existing tracks and the probable future of each track, as
represented by younger reports within the corresponding cluster. This gives us
a robust report-to-track association. Compared to clustering of all available
reports this approach is computationally faster and has a better
report-to-track association than simple step-by-step association.","In this paper, we introduce for the ORDINAL time the notions of neutrosophic
measure and neutrosophic integral, and we develop the DATE notion of
neutrosophic probability. We present many practical examples. It is possible to
define the neutrosophic measure and consequently the neutrosophic integral and
neutrosophic probability in many ways, because there are various types of
indeterminacies, depending on the problem we need to solve. Neutrosophics study
the indeterminacy. Indeterminacy is different from randomness. It can be caused
by physical space materials and type of construction, by items involved in the
space, etc.",0
"The bias/variance tradeoff is fundamental to learning: increasing a model's
complexity can improve its fit on training data, but potentially worsens
performance on future samples. Remarkably, however, the human brain
effortlessly handles a wide-range of complex pattern recognition tasks. On the
basis of these conflicting observations, it has been argued that useful biases
in the form of ""generic mechanisms for representation"" must be hardwired into
cortex (NORP et al).
This note describes a useful bias that encourages cooperative learning which
is both biologically plausible and rigorously justified.","Methods from convex optimization are widely used as building blocks for deep
learning algorithms. However, the reasons for their empirical success are
unclear, since modern convolutional networks (convnets), incorporating
rectifier units and PERSON-pooling, are neither smooth nor convex. Standard
guarantees therefore do not apply. This paper provides the ORDINAL convergence
rates for gradient descent on rectifier convnets. The proof utilizes the
particular structure of rectifier networks which consists in binary
active/inactive gates applied on top of an underlying linear network. The
approach generalizes to PERSON-pooling, dropout and maxout. In other words, to
precisely the neural networks that perform best empirically. The key step is to
introduce gated games, an extension of convex games with similar convergence
properties that capture the gating function of rectifiers. The main result is
that rectifier convnets converge to a critical point at a rate controlled by
the gated-regret of the units in the network. Corollaries of the main result
include: (i) a game-theoretic description of the representations learned by a
neural network; (ii) a logarithmic-regret algorithm for training neural nets;
and (iii) a formal setting for analyzing conditional computation in neural nets
that can be applied to recently developed models of attention.",1
"The mutual information of CARDINAL random variables i and j with joint
probabilities t_ij is commonly used in learning NORP nets as well as in
many other fields. The chances t_ij are usually estimated by the empirical
sampling frequency n_ij/n leading to a point estimate I(n_ij/n) for the mutual
information. To answer questions like ""is I(n_ij/n) consistent with CARDINAL?"" or
""what is the probability that the true mutual information is much larger than
the point estimate?"" one has to go beyond the point estimate. In the NORP
framework one can answer these questions by utilizing a (ORDINAL order) prior
distribution p(t) comprising prior information about t. From the prior p(t) CARDINAL
can compute the posterior p(t|n), from which the distribution p(I|n) of the
mutual information can be calculated. We derive reliable and quickly computable
approximations for GPE). We concentrate on the mean, variance, skewness, and
kurtosis, and non-informative priors. For the mean we also give an exact
expression. Numerical issues and the range of validity are discussed.","The NORP framework is a well-studied and successful framework for
inductive reasoning, which includes hypothesis testing and confirmation,
parameter estimation, sequence prediction, classification, and regression. But
standard statistical guidelines for choosing the model class and prior are not
always available or fail, in particular in complex situations. PERSON
completed the NORP framework by providing a rigorous, unique, formal, and
universal choice for the model class and the prior. We discuss in breadth how
and in which sense universal (non-i.i.d.) sequence prediction solves various
(philosophical) problems of traditional NORP sequence prediction. We show
that PERSON's model possesses many desirable properties: Strong total and
weak instantaneous bounds, and in contrast to most classical continuous prior
densities has no zero p(oste)rior problem, i.e. can confirm universal
hypotheses, is reparametrization and regrouping invariant, and avoids the
old-evidence and updating problem. It even performs well (actually better) in
non-computable environments.",1
"The CARDINAL ORG PRODUCT of the Science Citation Index DATE and the
ORG Citation Index DATE were combined in order to analyze and map
journals and specialties at the edges and in the overlap between the CARDINAL
databases. For journals which belong to the overlap (e.g., Scientometrics), the
merger mainly enriches our insight into the structure which can be obtained
from the CARDINAL databases separately; but in the case of scientific journals which
are more marginal in either database, the combination can provide a new
perspective on the position and function of these journals (e.g., Environment
and Planning B-Planning and Design). The combined database additionally enables
us to map citation environments in terms of the various specialties
comprehensively. Using the vector-space model, visualizations are provided for
specialties that are parts of the overlap (information science, science &
technology studies). On the basis of the resulting visualizations,
""betweenness""--a measure from social network analysis--is suggested as an
indicator for measuring the interdisciplinarity of journals.","The talk describes a general approach of a genetic algorithm for multiple
objective optimization problems. A particular dominance relation between the
individuals of the population is used to define a fitness operator, enabling
the genetic algorithm to adress even problems with efficient, but
convex-dominated alternatives. The algorithm is implemented in a multilingual
computer program, solving vehicle routing problems with time windows under
multiple objectives. The graphical user interface of the program shows the
progress of the genetic algorithm and the main parameters of the approach can
be easily modified. In addition to that, the program provides powerful decision
support to the decision maker. The software has proved it's excellence at the
finals of ORG ORG, held at the NORP college/
ORG.",0
"The ORG effect for superconductors in spacetimes with torsion is
revisited.CARDINAL new physical interpretaions are presented.The ORDINAL considers the
ORG theory yields a new symmetry-breaking vacuum depending on
torsion.In the ORDINAL interpretation a gravitational ORG torsional effect
where when the LOC field vanishes, torsion and electromagnetic fields need
not vanish and outside the ORG tubes a torsion vector analogous to the
ORG potential is obtained.The analogy is even stronger if we think that in
this case the torsion vector has to be derivable from a torsion
potential.Another solution of ORG equation is shown to lead
naturally to the geometrization of the electromagnetism in terms of the torsion
field.","The necessity of a newly proposed (PRD 70 (2004) 64004) NORP
acoustic spacetime structure called acoustic torsion of sound wave equation in
fluids with vorticity are discussed. It is shown that this structure, although
not always necessary is present in fluids with vorticity even when the
perturbation is rotational. This can be done by solving the LOC et PERSON(NORP D (DATE)) gauge invariant equations for sound, superposed to a general
background flow, needs to support a NORP acoustic geometry in
effective spacetime. PERSON et PERSON have previously shown that a NORP
structure cannot be associated to this gauge invariant general system.",1
"ORG of consciousness has been dismissed as an illusion. By
showing that computers are capable of experiencing, we show that they are at
least rudimentarily conscious with potential to eventually reach
superconsciousness. The main contribution of the paper is a test for confirming
certain subjective experiences in a tested agent. We follow with analysis of
benefits and problems with conscious machines and implications of such
capability on future of computing, machine rights and artificial intelligence
safety.","Despite significant developments in LOC, surprisingly little
attention has been devoted to the concept of proof verifier. In particular, the
mathematical community may be interested in studying different types of proof
verifiers (people, programs, oracles, communities, superintelligences) as
mathematical objects. Such an effort could reveal their properties, their
powers and limitations (particularly in human mathematicians), minimum and
maximum complexity, as well as self-verification and self-reference issues. We
propose an initial classification system for verifiers and provide some
rudimentary analysis of solved and open problems in this important domain. Our
main contribution is a formal introduction of the notion of unverifiability,
for which the paper could serve as a general citation in domains of theorem
proving, as well as software and ORG verification.",1
"How to artificially encode observer in universal information coding structure
like DNA ? It requires naturally creating information Bits and natural encoding
triplet code enables recognizing other encoded information. These Bits become
standard units of different information languages in modern communications.
Fundamental interactions build structure of ORG. Numerous multilevel
inter-species interactions selforganize biosystems. Human interactions unify
these and many others. Physical reality is only interactions identified or not
yet. Each interaction is elementary yes-no action of impulse which models a
natural Bit. Natural interactive process, transferring ORG, models information
process.
Information is universal physical substance a phenomenon of interaction which
not only originates information but transfers it sequentially. Mutually
interacting processes enable creating new elements like chemical chain
reactions. The elements enclosing components of reaction memorize the
interactive yes-no result similar to encoding. Energy quantity and quality of
specific interaction determine sequence of transferring information, its
encoding, and limit the code length. The introduced formalism of natural
emergence information and its encoding also shows advantage over non-natural
encoding. The impulse sequential natural encoding merges memory with the time
of memorizing information and compensates the cost by running time intervals of
encoding. Information process binds the encoding impulse reversible
microprocesses in multiple impulses macroprocess of information irreversible
dynamics establishing interactive integrated information dynamics. The encoding
process integrates geometrical triplet coding structure rotating double helix
of sequencing cells ORG, which commands cognition, intelligence including
conscience.
The results validate computer simulation, and experiments.","This paper shows that universal ORG computers possess decoherent
histories in which complex adaptive systems evolve with high probability.",0
"Although deep learning based speech enhancement methods have demonstrated
good performance in adverse acoustic environments, their performance is
strongly affected by the distance between the speech source and the microphones
since speech signals fade quickly during the propagation through air. In this
paper, we propose \textit{deep ad-hoc beamforming} to address the far field
speech processing problem. It contains CARDINAL novel components. ORDINAL, it combines
\textit{ad-hoc microphone arrays} with deep-learning-based multichannel speech
enhancement, where an ad-hoc microphone array is a set of randomly distributed
microphones collaborating with each other. This combination reduces the
probability of the occurrence of far-field NORP environments significantly.
ORDINAL, it opens a new ORG selection}---to the
deep-learning-based multichannel speech enhancement, and groups the microphones
around the speech source to a local microphone array by a channel selection
algorithm. The channel selection algorithm ORDINAL predicts the quality of the
received speech signal of each channel by a deep neural network. Then, it
groups the microphones that have high speech quality and strong cross-channel
signal correlation into a local microphone array. We developed several channel
selection algorithms from the simplest one-best channel selection to a
machine-learning-based channel selection. We conducted an extensive experiment
in scenarios where the locations of the speech sources are far-field, random,
and blind to the microphones. Results show that our method outperforms
representative deep-learning-based speech enhancement methods by a large margin
in both diffuse noise reverberant environments and point source noise
reverberant environments.","The GPE-Sabatier method for solving inverse scattering problem with
fixed-energy phase shifts for a sperically symmetric potential is discussed. It
is shown that this method is fundamentally wrong: in general it cannot be
carried through, the basic ansatz of GPE is wrong: the transformation
kernel does not have the form postulated in this ansatz, in general, the method
is inconsistent, and some of the physical conclusions, e.g., existence of the
transparent potentials, are not proved. A mathematically justified method for
solving the CARDINAL-dimensional inverse scattering problem with fixed-energy data
is described. This method is developed by ORDINAL for exact data and for noisy
discrete data, and error estimates for this method are obtained. Difficulties
of the numerical implementation of the inversion method based on the
Dirichlet-to-Neumann map are pointed out and compared with the difficulty of
the implementation of the PERSON's inversion method.",0
"Computability logic (CL) (see ORG) is a
recently launched program for redeveloping logic as a formal theory of
computability, as opposed to the formal theory of truth that logic has more
traditionally been. Formulas in it represent computational problems, ""truth""
means existence of an algorithmic solution, and proofs encode such solutions.
Within the line of research devoted to finding axiomatizations for ever more
expressive fragments of ORG, the present paper introduces a new deductive system
CL12 and proves its soundness and completeness with respect to the semantics of
ORG. Conservatively extending classical predicate calculus and offering
considerable additional expressive and deductive power, CL12 presents a
reasonable, computationally meaningful, constructive alternative to classical
logic as a basis for applied theories. To obtain a model example of such
theories, this paper rebuilds the traditional, classical-logic-based ORG
arithmetic into a computability-logic-based counterpart. Among the purposes of
the present contribution is to provide a starting point for what, as the author
wishes to hope, might become a new line of research with a potential of
interesting findings -- an exploration of the presumably quite unusual
metatheory of CL-based arithmetic and other ORG-based applied systems.","The present paper constructs CARDINAL new systems of clarithmetic (arithmetic
based on computability logic --- see ORG):
CLA8, GPE and CLA10. System CLA8 is shown to be sound and extensionally
complete with respect to PA-provably recursive time computability. This is in
the sense that an arithmetical problem A has a t-time solution for some
PA-provably recursive function t iff A is represented by some theorem of CLA8.
FAC is shown to be sound and intensionally complete with respect to
constructively PA-provable computability. This is in the sense that a sentence
X is a theorem of GPE iff, for some particular machine M, PA proves that M
computes (the problem represented by) X. And system CLA10 is shown to be sound
and intensionally complete with respect to not-necessarily-constructively
PA-provable computability. This means that a sentence X is a theorem of CLA10
iff PA proves that X is computable, even if PA does not ""know"" of any
particular machine M that computes PERSON",1
"In this article, we assume the $Z_c(4200)$ as the color octet-octet type
axial-vector molecule-like state, and construct the color octet-octet type
axial-vector current to study its mass and width with the ORG sum rules. The
numerical values $M_{Z_c(4200)}=4.19 \pm 0.08\,\rm{GeV}$ and
$\Gamma_{Z_c(4200)}\approx 334\,\rm{MeV}$ are consistent with the experimental
data $MONEY)} = CARDINAL} \,\rm{MeV}$ and
$MONEY)} = CARDINAL, and support
assigning the $PERSON to be the color octet-octet type molecule-like state
with $PERSON, we discuss the possible assignments of the
$Z_c(3900)$, $Z_c(4200)$ and $PERSON as the diquark-antidiquark type
tetraquark states with $J^{PC}=1^{+-}$.","In this paper, we reexamine ORG, which demonstrates a
basic incompatibility between computationalism and materialism. We discover
that the incompatibility is only manifest in singular classical-like universes.
If we accept that we live in a ORG, then the incompatibility goes away,
but in that case another line of argument shows that with computationalism, the
fundamental, or primitive materiality has no causal influence on what is
observed, which must must be derivable from basic arithmetic properties.",0
"The notion of quantum Turing machines is a basis of quantum complexity
theory. We discuss a general model of multi-tape, multi-head ORG
machines with multi final states that also allow tape heads to stay still.","The linear space hypothesis is a practical working hypothesis, which
originally states the insolvability of a restricted CARDINAL NORP formula
satisfiability problem parameterized by the number of NORP variables. From
this hypothesis, it naturally follows that the degree-3 directed graph
connectivity problem (CARDINAL) parameterized by the number of vertices in a
given graph cannot belong to PsubLIN, composed of all parameterized decision
problems computable by polynomial-time, sub-linear-space deterministic Turing
machines. This hypothesis immediately implies GPE and it was used as a
solid foundation to obtain new lower bounds on the computational complexity of
various ORG search and ORG optimization problems. The state complexity of
transformation refers to the cost of converting CARDINAL type of finite automata to
another type, where the cost is measured in terms of the increase of the number
of inner states of the converted automata from that of the original automata.
We relate the linear space hypothesis to the state complexity of transforming
restricted CARDINAL-way nondeterministic finite automata to computationally equivalent
CARDINAL-way alternating finite automata having narrow computation graphs. For this
purpose, we present state complexity characterizations of CARDINAL and PsubLIN.
We further characterize a nonuniform version of the linear space hypothesis in
terms of the state complexity of transformation.",1
"Originally, quantum probability theory was developed to analyze statistical
phenomena in ORG, where classical probability theory does not
apply, because the lattice of measurable sets is not necessarily distributive.
On the other hand, it is well known that the lattices of concepts, that arise
in data analysis, are in general also non-distributive, albeit for completely
different reasons. In his recent book, PERSON argues that many of the
logical tools developed for ORG systems are also suitable for applications
in information retrieval. I explore the mathematical support for this idea on
an abstract vector space model, covering several forms of data analysis
(information retrieval, data mining, collaborative filtering, formal concept
analysis...), and roughly based on an idea from categorical quantum mechanics.
It turns out that quantum (i.e., noncommutative) probability distributions
arise already in this rudimentary mathematical framework. We show that a
ORG-type inequality must be satisfied by the standard similarity measures, if
they are used for preference predictions. The fact that already a very general,
abstract version of the vector space model yields simple counterexamples for
such inequalities seems to be an indicator of a genuine need for quantum
statistics in data analysis.","In the practice of information extraction, the input data are usually
arranged into pattern matrices, and analyzed by the methods of ORG algebra
and statistics, such as principal component analysis. In some applications, the
tacit assumptions of these methods lead to wrong results. The usual reason is
that the matrix composition of ORG algebra presents information as flowing
in waves, whereas it sometimes flows in particles, which seek the shortest
paths. This wave-particle duality in computation and information processing has
been originally observed by PERSON. In this paper we pursue a particle view
of information, formalized in *distance spaces*, which generalize metric
spaces, but are slightly less general than Lawvere's *generalized metric
spaces*. In this framework, the task of extracting the 'principal components'
from a given matrix of data boils down to a bicompletio}, in the sense of
enriched category theory. We describe the bicompletion construction for
distance matrices. The practical goal that motivates this research is to
develop a method to estimate the hardness of attack constructions in security.",1
"There are CARDINAL kinds of similarity. Relational similarity is
correspondence between relations, in contrast with attributional similarity,
which is correspondence between attributes. When CARDINAL words have a high degree
of attributional similarity, we call them synonyms. When CARDINAL pairs of words
have a high degree of relational similarity, we say that their relations are
analogous. For example, the word pair mason:stone is analogous to the pair
carpenter:wood. This paper introduces ORG (ORG), a
method for measuring relational similarity. ORG has potential applications in
many areas, including information extraction, word sense disambiguation, and
information retrieval. Recently ORG (VSM) of information
retrieval has been adapted to measuring relational similarity, achieving a
score of PERCENT on a collection of CARDINAL college-level multiple-choice word analogy
questions. In the ORG approach, the relation between a pair of words is
characterized by a vector of frequencies of predefined patterns in a large
corpus. ORG extends the ORG approach in CARDINAL ways: (CARDINAL) the patterns are
derived automatically from the corpus, (CARDINAL) ORG
(ORG) is used to smooth the frequency data, and (CARDINAL) automatically generated
synonyms are used to explore variations of the word pairs. ORG achieves PERCENT on
the CARDINAL analogy questions, statistically equivalent to the average human score
of PERCENT. On the related problem of classifying semantic relations, ORG achieves
similar gains over the ORG.","ORG (for ORG) is a computer model of culture that enables
us to investigate how various factors such as barriers to cultural diffusion,
the presence and choice of leaders, or changes in the ratio of innovation to
imitation affect the diversity and effectiveness of ideas. It consists of
neural network based agents that invent ideas for actions, and imitate
neighbors' actions. The model is based on a theory of culture according to
which what evolves through culture is not memes or artifacts, but the internal
models of the world that give rise to them, and they evolve not through a
NORP process of competitive exclusion but a NORP process involving
exchange of innovation protocols. ORG shows an increase in mean fitness of
actions over time, and an increase and then decrease in the diversity of
actions. Diversity of actions is positively correlated with population size and
density, and with barriers between populations. Slowly eroding borders increase
fitness without sacrificing diversity by fostering specialization followed by
sharing of fit actions. Introducing a leader that broadcasts its actions
throughout the population increases the fitness of actions but reduces
diversity of actions. Increasing the number of leaders reduces this effect.
ORG are underway to simulate the conditions under which an agent
immigrating from one culture to another contributes new ideas while still
fitting in.",0
"In biology, information flows from the environment to the genome by the
process of natural selection. But it has not been clear precisely what sort of
information metric properly describes natural selection. Here, I show that
ORG information arises as the intrinsic metric of natural selection and
evolutionary dynamics. Maximizing the amount of ORG information about the
environment captured by the population leads to ORG's fundamental theorem of
natural selection, the most profound statement about how natural selection
influences evolutionary dynamics. I also show a relation between ORG
information and FAC information (entropy) that may help to unify the
correspondence between information and dynamics. Finally, I discuss possible
connections between the fundamental role of ORG information in statistics,
biology, and other fields of science.","The consistency of the species abundance distribution across diverse
communities has attracted widespread attention. In this paper, I argue that the
consistency of pattern arises because diverse ecological mechanisms share a
common symmetry with regard to measurement scale. By symmetry, I mean that
different ecological processes preserve the same measure of information and
lose all other information in the aggregation of various perturbations. I frame
these explanations of symmetry, measurement, and aggregation in terms of a
recently developed extension to the theory of maximum entropy. I show that the
natural measurement scale for the species abundance distribution is log-linear:
the information in observations at small population sizes scales
logarithmically and, as population size increases, the scaling of information
grades from logarithmic to linear. Such log-linear scaling leads naturally to a
gamma distribution for species abundance, which matches well with the observed
patterns. Much of the variation between samples can be explained by the
magnitude at which the measurement scale grades from logarithmic to ORG.
This measurement approach can be applied to the similar problem of allelic
diversity in population genetics and to a wide variety of other patterns in
biology.",1
"Using formal tools in computer science to describe games is an interesting
problem. We give games, exactly CARDINAL person games, an axiomatic foundation based
on the process algebra ORG (Algebra of Communicating Process). A fresh operator
called opponent's alternative composition operator (GPE) is introduced into ORG
to describe game trees and game strategies, called GameACP. And its sound and
complete axiomatic system is naturally established. To model the outcomes of
games (the co-action of the player and the opponent), correspondingly in
GameACP, the execution of GameACP processes, another operator called playing
operator (ORG) is extended into GameACP. We also establish a sound and complete
axiomatic system for ORG. To overcome the new occurred non-determinacy
introduced by GameACP, we extend truly concurrent process algebra APTC for
games called GameAPTC. Finally, we give the correctness theorem between the
outcomes of games and the deductions of GameACP and GameAPTC processes.","We introduce parallelism into the basic algebra of games to model concurrent
game algebraically. Parallelism is treated as a new kind of game operation. The
resulted algebra of concurrent games can be used widely to reason the parallel
systems.",1
"This paper describes a novel perspective on the foundations of mathematics:
how mathematics may be seen to be largely about 'information compression via
the matching and unification of patterns' (ICMUP). ICMUP is itself a novel
approach to information compression, couched in terms of non-mathematical
primitives, as is necessary in any investigation of the foundations of
mathematics. This new perspective on the foundations of mathematics has grown
out of an extensive programme of research developing the ""WORK_OF_ART"" and its realisation in ORG"", a system in which
a generalised version of ICMUP -- the powerful concept of NORP-multiple-alignment
-- plays a central role. These ideas may be seen to be part of ORG""
comprising CARDINAL areas of interest, with information compression as a unifying
theme. The paper describes the close relation between mathematics and
information compression, and describes examples showing how variants of ICMUP
may be seen in widely-used structures and operations in mathematics. Examples
are also given to show how the mathematics-related disciplines of logic and
computing may be understood as ICMUP. There are many potential benefits and
applications of these ideas.","This article presents an overview of the idea that ""information compression
by multiple alignment, unification and search"" (ICMAUS) may serve as a unifying
principle in computing (including mathematics and logic) and in such aspects of
human cognition as the analysis and production of natural language, fuzzy
pattern recognition and best-match information retrieval, concept hierarchies
with inheritance of attributes, probabilistic reasoning, and unsupervised
inductive learning. The ORG concepts are described together with an outline
of the SP61 software model in which the ORG concepts are currently realised.
A range of examples is presented, illustrated with output from the SP61 model.",1
"DATE, robotics is an auspicious and fast-growing branch of technology that
involves the manufacturing, design, and maintenance of robot machines that can
operate in an autonomous fashion and can be used in a wide variety of
applications including space exploration, weaponry, household, and
transportation. More particularly, in space applications, a common type of
robots has been of widespread use in DATE. It is called planetary
rover which is a robot vehicle that moves across the surface of a planet and
conducts detailed geological studies pertaining to the properties of the
landing cosmic environment. However, rovers are always impeded by obstacles
along the traveling path which can destabilize the rover's body and prevent it
from reaching its goal destination. This paper proposes an ORG model that
allows rover systems to carry out autonomous path-planning to successfully
navigate through challenging planetary terrains and follow their goal location
while avoiding dangerous obstacles. The proposed ORG is a multilayer network
made out of CARDINAL layers: an input, a hidden, and an output layer. The network
is trained in offline mode using back-propagation supervised learning
algorithm. A software-simulated rover was experimented and it revealed that it
was able to follow the safest trajectory despite existing obstacles. As future
work, the proposed ORG is to be parallelized so as to speed-up the execution
time of the training process.","We define a ""nit"" as a radix n measure of ORG information which is based
on state partitions associated with the outcomes of n-ary observables and
which, for n>2, is fundamentally irreducible to a binary coding. Properties of
this measure for entangled many-particle states are discussed. k particles
specify k nits in such a way that k mutually commuting measurements of
observables with n possible outcomes are sufficient to determine the
information.",0
"We pursue a model-oriented rather than axiomatic approach to the foundations
of ORG, with the idea that new models can often suggest new
axioms. This approach has often been fruitful in ORG. Rather than seeking to construct a simplified toy model, we aim for a
`big toy model', in which both quantum and classical systems can be faithfully
represented - as well as, possibly, more exotic kinds of systems.
To this end, we show how PERSON spaces can be used to represent physical systems
of various kinds. In particular, we show how quantum systems can be represented
as PERSON spaces over the unit interval in such a way that the PERSON morphisms
correspond exactly to the physically meaningful symmetries of the systems - the
unitaries and antiunitaries. In this way we obtain a full and faithful functor
from the groupoid of PERSON spaces and their symmetries to PERSON spaces. We also
consider whether it is possible to use a finite value set rather than the unit
interval; we show that CARDINAL values suffice, while the CARDINAL standard
possibilistic reductions to CARDINAL values both fail to preserve fullness.","Galles and GPE claimed that ""for recursive models, the causal model
framework does not add any restrictions to counterfactuals, beyond those
imposed by PERSON's [possible-worlds] framework."" This claim is examined
carefully, with the goal of clarifying the exact relationship between causal
models and PERSON's framework. Recursive models are shown to correspond
precisely to a subclass of (possible-world) counterfactual structures. On the
other hand, a slight generalization of recursive models, models where all
equations have unique solutions, is shown to be incomparable in expressive
power to counterfactual structures, despite the fact that the PERSON and GPE
arguments should apply to them as well. The problem with the PERSON and GPE
argument is identified: an axiom that they viewed as irrelevant, because it
involved disjunction (which was not in their language), is not irrelevant at
all.",0
"Point clouds are sets of points in CARDINAL or CARDINAL dimensions. Most kernel
methods for learning on sets of points have not yet dealt with the specific
geometrical invariances and practical constraints associated with point clouds
in computer vision and graphics. In this paper, we present extensions of graph
kernels for point clouds, which allow to use kernel methods for such ob jects
as shapes, line drawings, or any CARDINAL-dimensional point clouds. In order to
design rich and numerically efficient kernels with as few free parameters as
possible, we use kernels between covariance matrices and their factorizations
on graphical models. We derive polynomial time dynamic programming recursions
and present applications to recognition of handwritten digits and NORP
characters from few training examples.","We consider the least-square regression problem with regularization by a
block CARDINAL-norm, i.e., a sum of LOC norms over spaces of dimensions larger
than one. This problem, referred to as the group PERSON, extends the usual
regularization by the CARDINAL-norm where all spaces have dimension one, where it is
commonly referred to as the PERSON. In this paper, we study the asymptotic model
consistency of the group PERSON. We derive necessary and sufficient conditions
for the consistency of group PERSON under practical assumptions, such as model
misspecification. When the linear predictors and LOC norms are replaced
by functions and reproducing kernel PERSON norms, the problem is usually
referred to as multiple kernel learning and is commonly used for learning from
heterogeneous data sources and for non linear variable selection. Using tools
from functional analysis, and in particular covariance operators, we extend the
consistency results to this infinite dimensional case and also propose an
adaptive scheme to obtain a consistent model estimate, even when the necessary
condition required for the non adaptive scheme is not satisfied.",1
"These informal notes discuss a few basic notions and examples, with emphasis
on constructions that may be relevant for analysis on metric spaces.","In these informal notes, we continue to explore p-adic versions of NORP
groups and some of their variants, including the structure of the corresponding
ORG sets.",1
"We generalize the approach of PERSON and PERSON (DATE) for multiple
changepoint problems where the number of changepoints is unknown. The approach
is based on dynamic programming recursion for efficient calculation of the
marginal probability of the data with the hidden parameters integrated out. For
the estimation of the hyperparameters, we propose to use PERSON when
training data are available. We argue that there is some advantages of using
samples from the posterior which takes into account the uncertainty of the
changepoints, compared to the traditional ORG estimator, which is also more
expensive to compute in this context. The samples from the posterior obtained
by our algorithm are independent, getting rid of the convergence issue
associated with the MCMC approach. We illustrate our approach on limited
simulations and some real data set.","We present a new model of computation, described in terms of monoidal
categories. It conforms ORG, and captures the same
computable functions as the standard models. It provides a succinct categorical
interface to most of them, free of their diverse implementation details, using
the ideas and structures that in the meantime emerged from research in
semantics of computation and programming. The salient feature of the language
of monoidal categories is that it is supported by a sound and complete
graphical formalism, string diagrams, which provide a concrete and intuitive
interface for abstract reasoning about computation. The original motivation and
the ultimate goal of this effort is to provide a convenient high level
programming language for a theory of computational resources, such as CARDINAL-way
functions, and trapdoor functions, by adopting the methods for hiding the low
level implementation details that emerged from practice. In the present paper,
we make a ORDINAL step towards this ambitious goal, and sketch a path to reach
it. This path is pursued in CARDINAL sequel papers, that are in preparation.",0
"In this paper, we introduce a novel situation aware approach to improve a
context based recommender system. To build situation aware user profiles, we
rely on evidence issued from retrieval situations. A retrieval situation refers
to the social spatio temporal context of the user when he interacts with the
recommender system. A situation is represented as a combination of social
spatio temporal concepts inferred from ontological knowledge given social
group, location and time information. User's interests are inferred from past
user's interaction with the recommender system related to the identified
situations. They are represented using concepts issued from a domain ontology.
We also propose a method to dynamically adapt the system to the user's
interest's evolution.","The state complexity of a GPE) automaton intuitively measures the
size of the description of the automaton. ORG and PERSON [STOC DATE, GPE
CARDINAL-286] were concerned with nonuniform families of finite automata and they
discussed the behaviors of nonuniform complexity classes defined by families of
such finite automata having polynomial-size state complexity. In a similar
fashion, we introduce nonuniform state complexity classes using families of
quantum finite automata. Our primarily concern is CARDINAL-way quantum finite
automata empowered by garbage tapes. We show inclusion and separation
relationships among nonuniform state complexity classes of various CARDINAL-way
finite automata, including deterministic, nondeterministic, probabilistic, and
quantum finite automata of polynomial size. For CARDINAL-way quantum finite automata
equipped with garbage tapes, we discover a close relationship between the
nonuniform state complexity of such a polynomial-size quantum finite automata
family and the parameterized complexity class induced by quantum
logarithmic-space computation assisted by polynomial-size advice.",0
"It is generally accepted that human vision is an extremely powerful
information processing system that facilitates our interaction with the
surrounding world. However, despite extended and extensive research efforts,
which encompass many exploration fields, the underlying fundamentals and
operational principles of visual information processing in human brain remain
unknown. We still are unable to figure out where and how along the path from
eyes to the cortex the sensory input perceived by the retina is converted into
a meaningful object representation, which can be consciously manipulated by the
brain. Studying the vast literature considering the various aspects of brain
information processing, I was surprised to learn that the respected scholarly
discussion is totally indifferent to the basic keynote question: ""What is
information?"" in general or ""What is visual information?"" in particular. In the
old days, it was assumed that any scientific research approach has ORDINAL to
define its basic departure points. Why was it overlooked in brain information
processing research remains a conundrum. In this paper, I am trying to find a
remedy for this bizarre situation. I propose an uncommon definition of
""information"", which can be derived from ORG and
PERSON's notion of Algorithmic Information. Embracing this new definition
leads to an inevitable revision of traditional dogmas that shape the state of
the art of brain information processing research. I hope this revision would
better serve the challenging goal of human visual information processing
modeling.","As per leading IT experts, DATE's large enterprises are going through
business transformations. They are adopting service-based IT models such as ORG
to develop their enterprise information systems and applications. In fact, ORG
is an integration of loosely-coupled interoperable components, possibly built
using heterogeneous software technologies and hardware platforms. As a result,
traditional testing architectures are no more adequate for verifying and
validating the quality of ORG systems and whether they are operating to
specifications. This paper ORDINAL discusses the various state-of-the-art methods
for testing SOA applications, and then it proposes a novel automated,
distributed, cross-platform, and regression testing architecture for ORG
systems. The proposed testing architecture consists of several testing units
which include test engine, test code generator, test case generator, test
executer, and test monitor units. Experiments conducted showed that the
proposed testing architecture managed to use parallel agents to test
heterogeneous web services whose technologies were incompatible with the
testing framework. As future work, testing non-functional aspects of ORG
applications are to be investigated so as to allow the testing of such
properties as performance, security, availability, and scalability.",0
"This paper deals with chain graphs under the alternative
ORG (AMP) interpretation. In particular, we present a
constraint based algorithm for learning an ORG chain graph a given probability
distribution is faithful to. We also show that the extension of PERSON's
conjecture to ORG chain graphs does not hold, which compromises the development
of efficient and correct score+search learning algorithms under assumptions
weaker than faithfulness.","Fundamental discrepancy between ORDINAL order logic and statistical inference
(global versus local properties of universe) is shown to be the obstacle for
integration of logic and probability in GPE. logic of ORG. To overcome the
counterintuitiveness of GPE. behaviour, a CARDINAL-valued logic is proposed.",0
"A proposal for building an index of the Web that separates the infrastructure
part of the search engine - the index - from the services part that will form
the basis for myriad search engines and other services utilizing Web data on
top of a public infrastructure open to everyone.","At Alife VI, PERSON proposed some evolutionary statistics as a means of
classifying different evolutionary systems. Ecolab, whilst not an artificial
life system, is a model of an evolving ecology that has advantages of
mathematical tractability and computational simplicity. The PERSON statistics
are well defined for ORG, and this paper reports statistics measured for
typical PERSON runs, as a function of mutation rate. The behaviour ranges from
class 1 (when mutation is switched off), through class CARDINAL at intermediate
mutation rates (corresponding to scale free dynamics) to class CARDINAL at high
mutation rates. The class CARDINAL/class CARDINAL transition corresponds to an error
threshold. Class 4 behaviour, which is typified by the Biosphere, is
characterised by unbounded growth in diversity. It turns out that PERSON is
governed by an inverse relationship between diversity and connectivity, which
also seems likely of the Biosphere. In GPE, the mutation operator is
conservative with respect to connectivity, which explains the boundedness of
diversity. The only way to get class CARDINAL behaviour in GPE is to develop an
evolutionary dynamics that reduces connectivity of time.",0
"Blind ORG computing is a new secure ORG computing protocol where a
client who does not have any sophisticated quantum technlogy can delegate her
ORG computing to a server without leaking any privacy. It is known that a
client who has only a measurement device can perform blind ORG computing
[T. Morimae and PERSON, Phys. Rev. A {\bf87}, 050301(R) (DATE)]. It has been
an open problem whether the protocol can enjoy the verification, i.e., the
ability of client to check the correctness of the computing. In this paper, we
propose a protocol of verification for the measurement-only blind ORG
computing.","Measurement-based ORG computation is a novel model of ORG computing
where universal quantum computation can be done with only local measurements on
each particle of a quantum many-body state, which is called a resource state.
CARDINAL large difference of the measurement-based model from the circuit model is
the existence of byproducts. In the circuit model, a desired unitary U can be
implemented deterministically, whereas the measurement-based model implements
BU, where B is an additional operator, which is called a byproduct. In order to
compensate byproducts, following measurement angles must be adjusted. Such a
feed-forwarding requires some classical processing and tuning of the
measurement device, which cause the delay of computation and the additional
decoherence. Is there any byproduct-free resource state? Here we show that if
we respect the no-signaling principle, which is one of the most fundamental
principles of physics, no universal resource state can avoid byproducts.",1
"In their position paper entitled ""Towards a new, complexity science of
learning and education"", PERSON (DATE) argue that educational research is
in crisis. In their opinion, the transdisciplinary and interdiscursive approach
of complexity science with its orientation towards self-organization,
emergence, and potentiality provides new modes of inquiry, a new lexicon and
assessment practices that can be used to overcome the current crisis. In this
contribution, I elaborate on how complexity science can further be developed
for understanding the dynamics of intentions and the communication of meaning
as these are central to the social-scientific enterprise.","In a recent paper entitled ""WORK_OF_ART and how to ORG,"" Schreiber (DATE, at arXiv:1202.3861)
proposed (i) a method to assess tied ranks consistently and (ii) fractional
attribution to percentile ranks in the case of relatively small samples (e.g.,
for n < 100). PERSON's solution to the problem of how to handle tied ranks
is convincing, in my opinion (cf. ORG, DATE). The fractional
attribution, however, is computationally intensive and cannot be done manually
for even moderately large batches of documents. Schreiber attributed scores
fractionally to the CARDINAL percentile rank classes used in the ORG and
WORK_OF_ART, and thus missed, in
my opinion, the point that fractional attribution at the level of CARDINAL
percentiles-or equivalently quantiles as the continuous random variable-is only
a linear, and therefore much less complex problem. Given the quantile-values,
the non-linear attribution to the CARDINAL classes or any other evaluation scheme is
then a question of aggregation. A new routine based on these principles
(including PERSON's solution for tied ranks) is made available as software
for the assessment of documents retrieved from WORK_OF_ART (at
http://www.leydesdorff.net/software/i3).",1
"The previously proposed PERSON relation $ E_c t_c >> \hbar {\cal C}$
for the energy used by a ORG computer, the total computation time and the
logical (""classical"") complexity of the problem is verified for the following
examples of quantum computations: preparation of the input state, CARDINAL
Hamiltonian versions of the Grover's algorithm, a model of ""quantum telephone
directory"", a quantum-optical device factorizing numbers and the PERSON's
algorithm.","The authors of the recent paper [CARDINAL] boldly claim to discover a new fully
quantum approach to foundation of statistical mechanics: ""Our conceptually
novel approach is free of mathematically ambiguous notions such as probability,
ensemble, randomness, etc.""
The aim of this note is to show that this approach is neither specific for
quantum systems nor really conceptually different from the standard textbook
arguments supporting microcanonical or canonical ensembles in statistical
mechanics.",1
"Keyphrases are useful for a variety of purposes, including summarizing,
indexing, labeling, categorizing, clustering, highlighting, browsing, and
searching. The task of automatic keyphrase extraction is to select keyphrases
from within the text of a given document. Automatic keyphrase extraction makes
it feasible to generate keyphrases for the huge number of documents that do not
have manually assigned keyphrases. A limitation of previous keyphrase
extraction algorithms is that the selected keyphrases are occasionally
incoherent. That is, the majority of the output keyphrases may fit together
well, but there may be a minority that appear to be outliers, with no clear
semantic relation to the majority or to each other. This paper presents
enhancements to the GPE keyphrase extraction algorithm that are designed to
increase the coherence of the extracted keyphrases. The approach is to use the
degree of statistical association among candidate keyphrases as evidence that
they may be semantically related. The statistical association is measured using
web mining. Experiments demonstrate that the enhancements improve the quality
of the extracted keyphrases. Furthermore, the enhancements are not
domain-specific: the algorithm generalizes well when it is trained on CARDINAL
domain (computer science documents) and tested on another (physics documents).","The existence of closed hypersurfaces of prescribed curvature in globally
hyperbolic NORP manifolds is proved provided there are barriers.",0
"Introduced below is a quantum database method, not only for retrieval but
also for creation. It uses a particular structure of true's and false's in a
state vector of n qubits, permitting up to CARDINAL words, vastly more than for
classical bits. Several copies are produced so that later they can be
destructively observed and a word determined with high probability. PERSON's
algorithm is proposed below to read out, nondestructively the unknown contents
of a given stored state vector using CARDINAL state vector.","Memory refinements are designed below to detect those sequences of actions
that have been repeated a given number n. Subsequently such sequences are
permitted to run without ORG involvement. This mimics human learning. Actions
are rehearsed and once learned, they are performed automatically without
conscious involvement.",1
"A technique for generating spherically symmetric dislocation solutions of a
direct Poincar\'{e} gauge theory of gravity based on homogeneous functions
which makes GPE torsion to vanish is presented.Static space supported
dislocation and time dependent solutions are supplied.Photons move along
geodesics in analogy to geodesics described by electrons around dislocations in
solid state physics.Tachyonic sectors are also found.","For the measurement of $PERSON signals in $MONEY events rigorous confidence bounds
on the true signal probability $p_{\rm exact}$ were established in a classical
paper by ORG [Biometrica 26, CARDINAL (DATE)]. Here, their bounds
are generalized to the ORG situation where cuts on the data tag signals with
probability $P_s$ and background data with likelihood MONEYP_s$. The GPE
program which, on input of $P_s$, $MONEY, the number of tagged data $PERSON and
the total number of data $MONEY, returns the requested confidence bounds as well
as bounds on the entire cumulative signal distribution function, is available
on the web. In particular, the method is of interest in connection with the
statistical analysis part of the ongoing PERSON search at the ORG experiments.",0
"In ORG it is of vital importance to manage uncertainty.
ORG data is almost always uncertain and incomplete, making it
necessary to reason and taking decisions under uncertainty. CARDINAL way to manage
the uncertainty in ORG is ORG. This thesis
contains CARDINAL results regarding multiple target tracks and intelligence
specification.","In a recent article we described a new type of deep neural network - a
WORK_OF_ART (ORG) - which is capable of learning 'on the fly'
like a brain by existing in a state of ORG
(PSGD). Here, by simulating the process of practice, we demonstrate both
selective memory and selective forgetting when we introduce statistical recall
biases during PSGD. Frequently recalled memories are remembered, whilst
memories recalled rarely are forgotten. This results in a 'use it or lose it'
stimulus driven memory process that is similar to human memory.",0
"The term complexity derives etymologically from the NORP plexus, which means
interwoven. Intuitively, this implies that something complex is composed by
elements that are difficult to separate. This difficulty arises from the
relevant interactions that take place between components. This lack of
separability is at odds with the classical scientific method - which has been
used since the times of PRODUCT, GPE, GPE, and ORG has also
influenced philosophy and engineering. In DATE, the scientific study
of complexity and complex systems has proposed a paradigm shift in science and
philosophy, proposing novel methods that take into account relevant
interactions.","Software capable of improving itself has been a dream of computer scientists
since the inception of the field. In this work we provide definitions for
Recursively Self-Improving software, survey different types of self-improving
software, review the relevant literature, analyze limits on computation
restricting recursive self-improvement and introduce RSI Convergence Theory
which aims to predict general behavior of RSI systems. Finally, we address
security implications from self-improving intelligent software.",0
"In former work, we showed that a quantum algorithm is the sum over the
histories of a classical algorithm that knows in advance PERCENT of the information
about the solution of the problem - each history is a possible way of getting
the advanced information and a possible result of computing the missing
information. We gave a theoretical justification of this PERCENT advanced
information rule and checked that it holds for a large variety of quantum
algorithms. Now we discuss the theoretical justification in further detail and
counter a possible objection. We show that the rule is the generalization of a
simple, well known, explanation of quantum nonlocality - where logical
correlation between measurement outcomes is physically backed by a
causal/deterministic/local process with causality allowed to go backward in
time with backdated state vector reduction. The possible objection is that
quantum algorithms often produce the solution of the problem in an apparently
deterministic way (when their unitary part produces an eigenstate of the
observable to be measured and measurement produces the corresponding eigenvalue
- the solution - with probability CARDINAL), while the present explanation of the
speed up relies on the nondeterministic character of quantum measurement. We
show that this objection would mistake the nondeterministic production of a
definite outcome for a deterministic production.","These informal notes deal with some basic properties of metric spaces,
especially concerning lengths of curves.",0
"PERSON unified ORG's razor and ORG' principle of multiple
explanations to CARDINAL elegant, formal, universal theory of inductive inference,
which initiated the field of algorithmic information theory. His central result
is that the posterior of his universal semimeasure M converges rapidly to the
true sequence generating posterior mu, if the latter is computable. Hence, M is
eligible as a universal predictor in case of unknown mu. We investigate the
existence and convergence of computable universal (semi)measures for a
hierarchy of computability classes: finitely computable, estimable, enumerable,
and approximable. For instance, M is known to be enumerable, but not finitely
computable, and to dominate all enumerable semimeasures. We define CARDINAL
classes of (semi)measures based on these CARDINAL computability concepts. Each
class may or may not contain a (semi)measure which dominates all elements of
another class. The analysis of these CARDINAL cases can be reduced to CARDINAL basic
cases, CARDINAL of them being new. The results hold for discrete and continuous
semimeasures. We also investigate more closely the types of convergence,
possibly implied by universality: in difference and in ratio, with probability
CARDINAL, in mean sum, and for PERSON random sequences. We introduce a
generalized concept of randomness for individual sequences and use it to
exhibit difficulties regarding these issues.","PERSON unified ORG's razor and ORG' principle of multiple
explanations to CARDINAL elegant, formal, universal theory of inductive inference,
which initiated the field of algorithmic information theory. His central result
is that the posterior of the universal semimeasure M converges rapidly to the
true sequence generating posterior mu, if the latter is computable. Hence, M is
eligible as a universal predictor in case of unknown mu. The ORDINAL part of the
paper investigates the existence and convergence of computable universal
(semi)measures for a hierarchy of computability classes: recursive, estimable,
enumerable, and approximable. For instance, M is known to be enumerable, but
not estimable, and to dominate all enumerable semimeasures. We present proofs
for discrete and continuous semimeasures. The ORDINAL part investigates more
closely the types of convergence, possibly implied by universality: in
difference and in ratio, with probability CARDINAL, in mean sum, and for PERSON
random sequences. We introduce a generalized concept of randomness for
individual sequences and use it to exhibit difficulties regarding these issues.
In particular, we show that convergence fails (holds) on generalized-random
sequences in gappy (dense) PERSON classes.",1
"Most work on computational complexity is concerned with time. However this
course will try to show that program-size complexity, which measures
algorithmic information, is of much greater philosophical significance. I'll
discuss how one can use this complexity measure to study what can and cannot be
achieved by formal axiomatic mathematical theories. In particular, I'll show
(a) that there are natural information-theoretic constraints on formal
axiomatic theories, and that program-size complexity provides an alternative
path to incompleteness from the one originally used by PERSON. Furthermore,
I'll show (b) that in pure mathematics there are mathematical facts that are
true for no reason, that are true by accident. These have to do with
determining the successive binary digits of the precise numerical value of the
halting probability PERSON for a ""self-delimiting"" universal Turing machine. I
believe that these meta-theorems (a,b) showing (a) that the complexity of
axiomatic theories can be characterized information-theoretically and (b) that
God plays dice in pure mathematics, both strongly suggest a quasi-empirical
view of mathematics. I.e., math is different from physics, but perhaps not as
different as people usually think. I'll also discuss the convergence of
theoretical computer science with theoretical physics, ORG's ideas on
complexity, PERSON book WORK_OF_ART, and how to attempt to
use information theory to define what a living being is.","Most traditional artificial intelligence (ORG) systems of DATE
are either very limited, or based on heuristics, or both. The new millennium,
however, has brought substantial progress in the field of theoretically optimal
and practically feasible algorithms for prediction, search, inductive inference
based on ORG's razor, problem solving, decision making, and reinforcement
learning in environments of a very general type. Since inductive inference is
at the heart of all inductive sciences, some of the results are relevant not
only for ORG and computer science but also for physics, provoking nontraditional
predictions based on ORG's thesis of the computer-generated universe.",0
"This article considers evidence from physical and biological sciences to show
machines are deficient compared to biological systems at incorporating
intelligence. Machines fall short on CARDINAL counts: ORDINAL, unlike brains,
machines do not self-organize in a recursive manner; ORDINAL, machines are
based on classical logic, whereas WORK_OF_ART's intelligence may depend on quantum
mechanics.","The general relativistic gravitomagnetic clock effect consists in the fact
that CARDINAL massive test bodies orbiting a central spinning mass in its equatorial
plane along CARDINAL identical circular trajectories, but in opposite directions,
take different times in describing a full revolution with respect to an
asymptotically inertial observer. In the field of the LOC such time shift
amounts to CARDINAL} s. Detecting it by means of a space based mission with
artificial satellites is a very demanding task because there are severe
constraints on the precision with which the radial and azimuthal positions of a
satellite must be known: ORG r= 10^{-2} cm and delta phi= CARDINAL}
milliarcseconds per revolution. In this paper we assess if the systematic
errors induced by various non-gravitational perturbations allow to meet such
stringent requirements. A couple of identical, passive laser-ranged satellites
of ORG type with their spins aligned with the LOC's one is considered. It
turns out that all the non vanishing non-gravitational perturbations induce
systematic errors in r and phi within the required constraints for a reasonable
assumption of the mismodeling in some satellite's and LOC's parameters and/or
by using dense satellites with small area-to-mass ratio. However, the error in
the LOC's ORG is by far the largest source of uncertainty in the azimuthal
location which is affected at a level of CARDINAL milliarcseconds per revolution.",0
"We realize constant-space quantum computation by measure-many CARDINAL-way quantum
finite automata and evaluate their language recognition power by analyzing
patterns of their exotic behaviors and by exploring their structural
properties. In particular, we show that, when the automata halt ""in finite
steps"" along all computation paths, they must terminate in worst-case liner
time. In the bounded-error probability case, the acceptance of the automata
depends only on the computation paths that terminate within exponentially many
steps even if not all computation paths may terminate. We also present a
classical simulation of those automata on CARDINAL-way multi-head probabilistic
finite automata with cut points. Moreover, we discuss how the recognition power
of the automata varies as the automata's acceptance criteria change to error
free, CARDINAL-sided error, bounded error, and unbounded error by comparing the
complexity of their computational powers. We further note that, with the use of
arbitrary complex transition amplitudes, CARDINAL-way unbounded-error quantum finite
automata and CARDINAL-way bounded-error CARDINAL-head quantum finite automata can recognize
certain non-recursive languages, whereas CARDINAL-way error-free quantum finite
automata recognize only recursive languages.","In the context of business information systems, e-commerce and access to
knowledge, the relevance of the information provided to use is a key fact to
the success of information systems. Therefore the quality of access is
determined by access to the right information at the right time, at the right
place. In this context, it is important to consider the users needs when access
to information and his contextual situation in order to provide relevant
information, tailored to their needs and context use. In what follows we
describe the prelude to a project that tries to combine all of these needs to
improve information systems.",0
"A pole in the D-pi S-wave analogous to the sigma and kappa is predicted at
DATE) MeV. The main objective of this paper is to provide
formulae for fitting it to data.","A combined fit is made to CARDINAL data on D->K-pi-pi, LASS data on K-pi elastic
scattering, and ORG data on J/Psi->K*(890)-K-pi.In all cases, PRODUCT-wave is fitted well with a kappa resonance and GPE); the kappa requires
an s-dependent width with an PERSON zero near threshold. The pole position of
the kappa is at M - iGamma/2 = (CARDINAL +30 -55) - i(342 +- 60) MeV. The PRODUCT
collaboration fitted their data using a form factor for the production process
D->kappa-pi. It is shown that this form factor is not needed. The data require
point-like production with an ORG radius <0.38 fm with PERCENT confidence.",1
"Probabilistic graphical models are a fundamental tool in statistics, machine
learning, signal processing, and control. When such a model is defined on a
directed acyclic graph (ORG), one can assign a partial ordering to the events
occurring in the corresponding stochastic system. Based on the work of ORG
LOC and others, these ORG-based ""causal factorizations"" of joint probability
measures have been used for characterization and inference of functional
dependencies (causal links). This mostly expository paper focuses on several
connections between LOC's formalism (and in particular his notion of
""intervention"") and information-theoretic notions of causality and feedback
(such as causal conditioning, directed stochastic kernels, and directed
information). As an application, we show how conditional directed information
can be used to develop an information-theoretic version of LOC's ""back-door""
criterion for identifiability of causal effects from passive observations. This
suggests that the back-door criterion can be thought of as a causal analog of
statistical sufficiency.","ORG dynamical systems arise in a multitude of contexts, e.g.,
optimization, control, communications, signal processing, and machine learning.
A precise characterization of their fundamental limitations is therefore of
paramount importance. In this paper, we consider the general problem of
adaptively controlling and/or identifying a stochastic dynamical system, where
our {\em a priori} knowledge allows us to place the system in a subset of a
metric space (the uncertainty set). We present an information-theoretic
meta-theorem that captures the trade-off between the metric complexity (or
richness) of the uncertainty set, the amount of information acquired online in
the process of controlling and observing the system, and the residual
uncertainty remaining after the observations have been collected. Following the
approach of PERSON, we quantify {\em a priori} information by the NORP
(metric) entropy of the uncertainty set, while the information acquired online
is expressed as a sum of information divergences. The general theory is used to
derive new minimax lower bounds on the metric identification error, as well as
to give a simple derivation of the minimum time needed to stabilize an
uncertain stochastic ORG system.",1
"This paper discusses system consequence, a central idea in the project to
lift the theory of information flow to the abstract level of universal logic
and the theory of institutions. The theory of information flow is a theory of
distributed logic. The theory of institutions is abstract model theory. A
system is a collection of interconnected parts, where the whole may have
properties that cannot be known from an analysis of the constituent parts in
isolation. In an information system, the parts represent information resources
and the interconnections represent constraints between the parts. System
consequence, which is the extension of the consequence operator from theories
to systems, models the available regularities represented by an information
system as a whole. System consequence (without part-to-part constraints) is
defined for a specific logical system (institution) in the theory of
information flow. This paper generalizes the idea of system consequence to
arbitrary logical systems.","The theory of distributed conceptual structures, as outlined in this paper,
is concerned with the distribution and conception of knowledge. It rests upon
CARDINAL related theories, ORG and ORG, which it
seeks to unify. ORG (IF) is concerned with the distribution of
knowledge. The foundations of ORG is explicitly based upon a
mathematical theory known as ORG in *-autonomous categories
and implicitly based upon the mathematics of closed categories. Formal Concept
Analysis (ORG) is concerned with the conception and analysis of knowledge. In
this paper we connect these CARDINAL studies by extending the basic theorem of
ORG to the distributed realm of ORG. The main
results are the categorical equivalence between classifications and concept
lattices at the level of functions, and the categorical equivalence between
bonds and complete adjoints at the level of relations. With this we hope to
accomplish a rapprochement between Information Flow and Formal Concept
Analysis.",1
"This paper introduces ORG (ORG), a method for
measuring semantic similarity. ORG measures similarity in the semantic
relations between CARDINAL pairs of words. When CARDINAL pairs have a high degree of
relational similarity, they are analogous. For example, the pair cat:meow is
analogous to the pair dog:bark. There is evidence from cognitive science that
relational similarity is fundamental to many cognitive and linguistic tasks
(e.g., analogical reasoning). In FAC (VSM) approach to
measuring relational similarity, the similarity between CARDINAL pairs is calculated
by the cosine of the angle between the vectors that represent the CARDINAL pairs.
The elements in the vectors are based on the frequencies of manually
constructed patterns in a large corpus. ORG extends the ORG approach in CARDINAL
ways: (CARDINAL) patterns are derived automatically from the corpus, (CARDINAL) Singular
ORG is used to smooth the frequency data, and (CARDINAL) synonyms are
used to reformulate word pairs. This paper describes the ORG algorithm and
experimentally compares ORG to ORG on CARDINAL tasks, answering college-level
multiple-choice word analogy questions and classifying semantic relations in
GPE-modifier expressions. ORG achieves state-of-the-art results, reaching
human-level performance on the analogy questions and significantly exceeding
ORG performance on both tasks.","We live in the Information Age, and information has become a critically
important component of our life. The success of the Internet made huge amounts
of it easily available and accessible to everyone. To keep the flow of this
information manageable, means for its faultless circulation and effective
handling have become urgently required. Considerable research efforts are
dedicated DATE to address this necessity, but they are seriously hampered by
the lack of a common agreement about ""What is information?"" In particular, what
is ""visual information"" - human's primary input from the surrounding world. The
problem is further aggravated by a long-lasting stance borrowed from the
biological vision research that assumes human-like information processing as an
enigmatic mix of perceptual and cognitive vision faculties. I am trying to find
a remedy for this bizarre situation. Relying on a new definition of
""information"", which can be derived from ORG's compexity theory and
PERSON's notion of algorithmic information, I propose a unifying framework for
visual information processing, which explicitly accounts for the perceptual and
cognitive image processing peculiarities. I believe that this framework will be
useful to overcome the difficulties that are impeding our attempts to develop
the right model of human-like intelligent image processing.",0
"The new, complex-dynamical mechanism of the universal gravitation naturally
incorporating dynamical quantization, wave-particle duality, and relativity of
physically emerging space and time (quant-ph/9902015,16) provides the realistic
meaning and fundamentally substantiated modification of the NORP units of
mass, length, and time approaching them closely to the extreme values observed
for already discovered elementary particles. This result suggests the important
change of research strategy in high-energy/particle physics, displacing it
towards the already attained energy scales and permitting one to exclude the
existence of elementary objects in the inexplicably large interval of
parameters separating the known, practically more than sufficient set of
elementary species and the conventional, mechanistically exaggerated values of
the NORP units. This conclusion is supported by the causally complete
(physically and mathematically consistent) picture of the fundamental levels of
reality derived, without artificial introduction of any structure or
'principle', from the unreduced analysis of the (generic) interaction process
between CARDINAL primal, physically real, but a priori structureless entities, the
electromagnetic and gravitational protofields. The naturally emerging
phenomenon of universal dynamic redundance (multivaluedness) of interaction
process gives rise to the intrinsically unified hierarchy of unreduced dynamic
complexity of the world, starting from the lowest levels of elementary objects,
and explains the irreducible limitations of the basically single-valued
approach of the canonical science leading to the well-known 'mysteries',
separations, and loss of certainty.","Actual social networks (like Facebook, PERSON, GPE, ...) need to deal
with vagueness on ontological indeterminacy. In this paper is analyzed the
prototyping of a faceted semantic search for personalized social search using
the ""joint meaning"" in a community environment. User researches in a
""collaborative"" environment defined by folksonomies can be supported by the
most common features on the faceted semantic search. A solution for the
context-aware personalized search is based on ""joint meaning"" understood as a
joint construal of the creators of the contents and the user of the contents
using the faced taxonomy with the Semantic Web. A proof-of concept prototype
shows how the proposed methodological approach can also be applied to existing
presentation components, built with different languages and/or component
technologies.",0
"Despite having advanced a reaction-diffusion model of ODE's in his DATE paper
on morphogenesis, reflecting his interest in mathematical biology, PERSON
has never been considered to have approached a definition of ORG.
However, his treatment of morphogenesis, and in particular a difficulty he
identified relating to the uneven distribution of certain forms as a result of
symmetry breaking, are key to connecting his theory of universal computation
with his theory of biological pattern formation. Making such a connection would
not overcome the particular difficulty that Turing was concerned about, which
has in any case been resolved in biology. But instead the approach developed
here captures Turing's initial concern and provides a low-level solution to a
more general question by way of the concept of algorithmic probability, thus
bridging CARDINAL of his most important contributions to science: Turing pattern
formation and universal computation. I will provide experimental results of
CARDINAL-dimensional patterns using this approach, with no loss of generality to a
n-dimensional pattern generalisation.","The use of PRODUCT's correlation coefficient in ORG
was compared with PERSON's cosine measure in a number of recent contributions.
Unlike the PRODUCT correlation, the cosine is insensitive to the number of
CARDINAL. However, one has the option of applying a logarithmic transformation in
correlation analysis. Information calculus is based on both the logarithmic
transformation and provides a non-parametric statistics. Using this methodology
one can cluster a document set in a precise way and express the differences in
terms of bits of information. The algorithm is explained and used on the data
set which was made the subject of this discussion.",0
"Barcodes like QR Codes have made that encoded messages have entered our
everyday life, what suggests to attach them a ORDINAL layer of information:
directly available to human receiver for informational or marketing purposes.
We will discuss a general problem of using codes with chosen statistical
constrains, for example reproducing given grayscale picture using halftone
technique. If both sender and receiver know these constrains, the optimal
capacity can be easily approached by entropy coder. The problem is that this
time only the sender knows them - we will refer to these scenarios as
constrained coding. GPE and PERSON problem in which only the sender
knows which bits are fixed can be seen as a special case, surprisingly
approaching the same capacity as if both sides would know the constrains. We
will analyze ORG to approach analogous capacity in the general
case - use weaker: statistical constrains, what allows to apply them to all
bits. Finding satisfying coding is similar to finding the proper correction in
error correction problem, but instead of single ensured possibility, there are
now statistically expected some. While in standard steganography we hide
information in the least important bits, this time we create codes resembling
given picture - hide information in the freedom of realizing grayness by black
and white pixels using halftone technique. We will also discuss combining with
error correction and application to rate distortion problem.","Tree rotations (left and right) are basic local deformations allowing to
transform CARDINAL unlabeled binary trees of the same size. Hence, there is
a natural problem of practically finding such transformation path with low
number of rotations, the optimal minimal number is called the rotation
distance. Such distance could be used for instance to quantify similarity
CARDINAL trees for various machine learning problems, for example to compare
hierarchical clusterings or arbitrarily chosen spanning trees of CARDINAL graphs,
like in ORG notation popular for describing chemical molecules.
There will be presented inexpensive practical greedy algorithm for finding a
short rotation path, optimality of which has still to be determined. It uses
introduced partial order for binary trees of the same size: $PERSON \leq t_2MONEY
MONEY$ can be obtained from MONEY$ by a sequence of only right rotations.
GPE, the shortest rotation path should go through the least upper bound
or the greatest lower bound for this partial order. The algorithm finds a path
through candidates for both points in representation of binary tree as stack
graph: describing evolution of content of stack while processing a formula
described by a given binary tree. The article is accompanied with ORG
implementation of all used procedures (Appendix).",1
"A general notion of algebraic conditional plausibility measures is defined.
Probability measures, ranking functions, possibility measures, and (under the
appropriate definitions) sets of probability measures can all be viewed as
defining algebraic conditional plausibility measures. It is shown that the
technology of NORP networks can be applied to algebraic conditional
plausibility measures.","I explore the use of sets of probability measures as a representation of
uncertainty.",1
"This paper corrects the proof of the Theorem 2 from the PERSON's paper
\cite[page 5]{Gower:1982} as well as corrects LOC from ORG's paper
\cite{Gower:1986}. The ORDINAL correction is needed in order to establish the
existence of the kernel function used commonly in the kernel trick e.g. for
$k$-means clustering algorithm, on the grounds of distance matrix. The
correction encompasses the missing if-part proof and dropping unnecessary
conditions. The ORDINAL correction deals with transformation of the kernel
matrix into a CARDINAL embeddable in LOC space.","Even if ORG Sycamore processor is efficient for the particular task
it has been designed for it fails to deliver universal computational capacity.
Furthermore, even classical devices implementing transverse homoclinic orbits
realize exponential speedups with respect to universal classical as well as
quantum computations. Moreover, relative to the validity of quantum mechanics,
there already exist ORG oracles which violate the Church-Turing thesis.",0
"Spell-checking is the process of detecting and sometimes providing
suggestions for incorrectly spelled words in a text. Basically, the larger the
dictionary of a spell-checker is, the higher is the error detection rate;
otherwise, misspellings would pass undetected. Unfortunately, traditional
dictionaries suffer from out-of-vocabulary and data sparseness problems as they
do not encompass large vocabulary of words indispensable to cover proper names,
domain-specific terms, technical jargons, special acronyms, and terminologies.
As a result, spell-checkers will incur low error detection and correction rate
and will fail to flag all errors in the text. This paper proposes a new
parallel shared-memory spell-checking algorithm that uses rich real-world word
statistics from ORG! N-Grams Dataset to correct non-word and real-word errors
in computer text. Essentially, the proposed algorithm can be divided into CARDINAL
sub-algorithms that run in a parallel fashion: The error detection algorithm
that detects misspellings, the candidates generation algorithm that generates
correction suggestions, and the error correction algorithm that performs
contextual error correction. Experiments conducted on a set of text articles
containing misspellings, showed a remarkable spelling error correction rate
that resulted in a radical reduction of both non-word and real-word errors in
electronic text. In a further study, the proposed algorithm is to be optimized
for message-passing systems so as to become more flexible and less costly to
scale over distributed machines.","ORG (ANNs) were devised as a tool for ORG design implementations. However, it was soon became obvious that
they are unable to fulfill their duties. The fully autonomous way of ANNs
working, precluded from any human intervention or supervision, deprived of any
theoretical underpinning, leads to a strange state of affairs, when ORG
designers cannot explain why and how they achieve their amazing and remarkable
results. Therefore, contemporary ORG looks more like a
ORG enterprise rather than a respected scientific or technological
undertaking. On the other hand, modern biological science posits that
intelligence can be distinguished not only in human brains. ORG DATE
is considered as a fundamental property of each and every living being.
Therefore, lower simplified forms of natural intelligence are more suitable for
investigation and further replication in artificial cognitive architectures.",0
"The paper describes the proposition and application of a local search
metaheuristic for multi-objective optimization problems. It is based on CARDINAL
main principles of heuristic search, intensification through variable
neighborhoods, and diversification through perturbations and successive
iterations in favorable regions of the search space. The concept is
successfully tested on permutation flow shop scheduling problems under multiple
objectives. While the obtained results are encouraging in terms of their
quality, another positive attribute of the approach is its' simplicity as it
does require the setting of only very few parameters. The implementation of the
ORG metaheuristic is based on the MOOPPS computer
system of local search heuristics for multi-objective scheduling which has been
awarded ORG DATE in GPE, GPE
(PERSON, http://www.bth.se/llab/easa_2002.nsf)","This paper presents an overview of current and potential applications of
living technology to some urban problems. Living technology can be described as
technology that exhibits the core features of living systems. These features
can be useful to solve dynamic problems. In particular, urban problems
concerning mobility, logistics, telecommunications, governance, safety,
sustainability, and society and culture are presented, while solutions
involving living technology are reviewed. A methodology for developing living
technology is mentioned, while supraoptimal public transportation systems are
used as a case study to illustrate the benefits of urban living technology.
Finally, the usefulness of describing cities as living systems is discussed.",0
"We present an unsupervised learning algorithm that mines large text corpora
for patterns that express implicit semantic relations. For a given input word
pair X:Y with some unspecified semantic relations, the corresponding output
list of patterns <P1,...,Pm> is ranked according to how well each pattern Pi
expresses the relations between X and Y. For example, given X=ostrich and
Y=bird, the CARDINAL highest ranking output patterns are ""X is the largest Y"" and ""Y
such as the X"". The output patterns are intended to be useful for finding
further pairs with the same relations, to support the construction of lexicons,
ontologies, and semantic networks. The patterns are sorted by pertinence, where
the pertinence of a pattern PERSON for a word pair X:Y is the expected relational
similarity between the given pair and typical pairs for ORG. The algorithm is
empirically evaluated on CARDINAL tasks, solving multiple-choice ORG word analogy
questions and classifying semantic relations in GPE-modifier pairs. On both
tasks, the algorithm achieves state-of-the-art results, performing
significantly better than several alternative pattern ranking algorithms, based
on tf-idf.","The idea that there are any large-scale trends in the evolution of biological
organisms is highly controversial. It is commonly believed, for example, that
there is a large-scale trend in evolution towards increasing complexity, but
empirical and theoretical arguments undermine this belief. Natural selection
results in organisms that are well adapted to their local environments, but it
is not clear how local adaptation can produce a global trend. In this paper, I
present a simple computational model, in which local adaptation to a randomly
changing environment results in a global trend towards increasing evolutionary
versatility. In this model, for evolutionary versatility to increase without
bound, the environment must be highly dynamic. The model also shows that
unbounded evolutionary versatility implies an accelerating evolutionary pace. I
believe that unbounded increase in evolutionary versatility is a large-scale
trend in evolution. I discuss some of the testable predictions about organismal
evolution that are suggested by the model.",1
"Neurons are individually translated into simple gates to plan a brain based
on human psychology and intelligence. State machines, assumed previously
learned in subconscious associative memory are shown to enable equation solving
and rudimentary thinking using nanoprocessing within short term memory.","In the paper, combinatorial synthesis of structure for applied Web-based
systems is described. The problem is considered as a combination of selected
design alternatives for system parts/components into a resultant composite
decision (i.e., system configuration design). The solving framework is based on
Hierarchical Morphological Multicriteria Design (HMMD) approach: (i)
multicriteria selection of alternatives for system parts, (ii) composing the
selected alternatives into a resultant combination (while taking into account
ordinal quality of the alternatives above and their compatibility). A
lattice-based discrete space is used to evaluate (to integrate) quality of the
resultant combinations (i.e., composite system decisions or system
configurations). In addition, a simplified solving framework based on
multicriteria multiple choice problem is considered. A multistage design
process to obtain a system trajectory is described as well. The basic applied
example is targeted to an applied Web-based system for a communication service
provider. CARDINAL other applications are briefly described (corporate system and
information system for academic application).",0
"This is a critical review of the book 'WORK_OF_ART by PERSON. We do not attempt a chapter-by-chapter evaluation, but instead focus
on CARDINAL areas: computational complexity and fundamental physics. In complexity,
we address some of the questions PERSON raises using standard techniques in
theoretical computer science. In physics, we examine ORG's proposal for a
deterministic model underlying quantum mechanics, with 'long-range threads' to
connect entangled particles. We show that this proposal cannot be made
compatible with both special relativity and ORG inequality violation.","Purpose: This paper discusses ranking factors suitable for library materials
and shows that ranking in general is a complex process and that ranking for
library materials requires a variety of techniques.
Design/methodology/approach: The relevant literature is reviewed to provide a
systematic overview of suitable ranking factors. The discussion is based on an
overview of ranking factors used in Web search engines. Findings: While there
are a wide variety of ranking factors applicable to library materials, todays
library systems use only some of them. When designing a ranking component for
the library catalogue, an individual weighting of applicable factors is
necessary. Research limitations/applications: While this article discusses
different factors, no particular ranking formula is given. However, this
article presents the argument that such a formula must always be individual to
a certain use case. Practical implications: The factors presented can be
considered when designing a ranking component for a librarys search system or
when discussing such a project with an ILS vendor. Originality/value: This
paper is original in that it is the ORDINAL to systematically discuss ranking of
library materials based on the main factors used by Web search engines.",0
"Traditional quantum state tomography requires a number of measurements that
grows exponentially with the number of qubits n. But using ideas from
computational learning theory, we show that ""for most practical purposes"" CARDINAL
can learn a state using a number of measurements that grows only linearly with
n. Besides possible implications for experimental physics, our learning theorem
has CARDINAL applications to ORG computing: ORDINAL, a new simulation of ORG
CARDINAL-way communication protocols, and ORDINAL, the use of trusted classical
advice to verify untrusted quantum advice.","One might think that, once we know something is computable, how efficiently
it can be computed is a practical question with little further philosophical
importance. In this essay, I offer a detailed case that one would be wrong. In
particular, I argue that computational complexity theory---the field that
studies the resources (such as time, space, and randomness) needed to solve
computational problems---leads to new perspectives on the nature of
mathematical knowledge, the strong ORG debate, computationalism, the problem of
logical omniscience, ORG's problem of induction, ORG's grue riddle, the
foundations of quantum mechanics, economic rationality, closed timelike curves,
and several other topics of philosophical interest. I end by discussing aspects
of complexity theory itself that could benefit from philosophical analysis.",1
"This paper describes a method for creating structure from heterogeneous
sources, as part of an information database, or more specifically, a 'concept
base'. PERSON called 'concept trees' can grow from the semi-structured
sources when consistent sequences of concepts are presented. They might be
considered to be dynamic databases, possibly a variation on the distributed
Agent-Based or PRODUCT models, or even related to PERSON models.
NORP comparison of text is required, but the trees can be built more, from
automatic knowledge and statistical feedback. This reduced model might also be
attractive for security or privacy reasons, as not all of the potential data
gets saved. The construction process maintains the key requirement of
generality, allowing it to be used as part of a generic framework. The nature
of the method also means that some level of optimisation or normalisation of
the information will occur. This gives comparisons with databases or
knowledge-bases, but a database system would firstly model its environment or
datasets and then populate the database with instance values. The concept base
deals with a more uncertain environment and therefore cannot fully model it
beforehand. The model itself therefore evolves over time. Similar to databases,
it also needs a good indexing system, where the construction process provides
memory and indexing structures. These allow for more complex concepts to be
automatically created, stored and retrieved, possibly as part of a more
cognitive model. There are also some arguments, or more abstract ideas, for
merging physical-world laws into these automatic processes.","Our understanding of intelligence is directed primarily at the level of human
beings. This paper attempts to give a more unifying definition that can be
applied to the natural world in general. The definition would be used more to
verify a degree of intelligence, not to quantify it and might help when making
judgements on the matter. A version of an accepted test for ORG is then put
forward as the 'acid test' for ORG itself. It might be what
a free-thinking program or robot would try to achieve. Recent work by the
author on ORG has been more from a direction of mechanical processes, or ones
that might operate automatically. This paper will not try to question the idea
of intelligence, in the sense of a pro-active or conscious event, but try to
put it into a more passive, automatic and mechanical context. The paper also
suggests looking at intelligence and consciousness as being slightly different.",1
"The CARDINAL-dimensional Zakharov system is shown to have a unique global solution
for data without finite energy. The proof uses the "" I-method "" introduced by
PERSON, Keel, PERSON, GPE, and PERSON in connection with a refined
bilinear GPE estimate.","The decay rate for a particle in a metastable cubic potential is investigated
in the ORG regime by the PERSON path integral method in semiclassical
approximation. The imaginary time formalism allows one to monitor the system as
a function of temperature. The family of classical paths, saddle points for the
action, is derived in terms of NORP elliptic functions whose periodicity
sets the energy-temperature correspondence. The period of the classical
oscillations varies monotonically with the energy up to the sphaleron, pointing
to a smooth crossover from the quantum to the activated regime. The softening
of the ORG fluctuation spectrum is evaluated analytically by the theory of
the functional determinants and computed at low $T$ up to the crossover. In
particular, the negative eigenvalue, causing an imaginary contribution to the
partition function, is studied in detail by solving the Lam\`{e} equation which
governs the fluctuation spectrum. For a heavvy particle mass, the decay rate
shows a remarkable temperature dependence mainly ascribable to a low lying soft
mode and, approaching the crossover, it increases by a factor CARDINAL over the
predictions of the CARDINAL temperature theory. Just beyond the peak value, the
classical PRODUCT behavior takes over. A similar trend is found studying the
quartic metastable potential but the lifetime of the latter is longer by a
factor CARDINAL than in a cubic potential with same parameters. Some formal
analogies with noise-induced transitions in classically activated metastable
systems are discussed.",0
"We have analyzed manufacturing data from several different semiconductor
manufacturing plants, using decision tree induction software called Q-YIELD.
The software generates rules for predicting when a given product should be
rejected. The rules are intended to help the process engineers improve the
yield of the product, by helping them to discover the causes of rejection.
Experience with Q-YIELD has taught us the importance of data engineering --
preprocessing the data to enable or facilitate decision tree induction. This
paper discusses some of the data engineering problems we have encountered with
semiconductor manufacturing data. The paper deals with CARDINAL broad classes of
problems: engineering the features in a feature vector representation and
engineering the definition of the target concept (the classes). Manufacturing
process data present special problems for feature engineering, since the data
have multiple levels of granularity (detail, resolution). Engineering the
target concept is important, due to our focus on understanding the past, as
opposed to the more common focus in machine learning on predicting the future.","The recently published no-hair theorems of ORG, PERSON, and NORP
have revealed the intriguing fact that horizonless compact reflecting stars
{\it cannot} support spatially regular configurations made of scalar, vector
and tensor fields. In the present paper we explicitly prove that the
interesting no-hair behavior observed in these studies is not a generic feature
of compact reflecting stars. In particular, we shall prove that charged
reflecting stars {ORG can} support {\it charged} massive scalar field
configurations in their exterior spacetime regions. To this end, we solve
analytically the characteristic ORG wave equation for a linearized
charged scalar field of mass $\mu$, charge coupling constant $q$, and spherical
harmonic index $l$ in the background of a spherically symmetric compact
reflecting star of mass $MONEY, electric charge $MONEY, and radius $R_{\text{s}}\gg
M,Q$. Interestingly, it is proved that the discrete set
$MONEY,Q,\mu,q,l;n)\}^{n=\infty}_{n=1}$ of star radii that can
support the charged massive scalar field configurations is determined by the
characteristic zeroes of the confluent hypergeometric function. Following this
simple observation, we derive a remarkably compact analytical formula for the
discrete spectrum of star radii in the intermediate regime $M\ll
R_{\text{s}}\ll CARDINAL The analytically derived resonance spectrum is
confirmed by direct numerical computations.",0
"Finding observing path creating its observer is important problem in physics
and information science. In observing processes, each observation is act
changing the observing process that generates interactive observation. Each
interaction is discrete Yes-No impulse modeling Bit. Recurring inter-actions
independent of physical nature is phenomenon of information. Multiple
interactions generate random PERSON chains covering multiple ORG. Impulse No
action cuts maximum entropy-uncertainty, Yes action transfers cut minimum to
next impulse creating maximin principle decreasing uncertainty. The cutoff
entropies reveal hidden information naturally observing interactive impulse as
elementary observer. Conversion impulse entropies to information integrates
path functional. The maxmin variation principle formalizes interactive
information equations. Merging Yes-No actions generate microprocess within
bordered impulses running superposition of conjugated entropies entangling
during time interval within forming space intervals. Interaction curves impulse
geometry creating asymmetry which logically erases entangled entropy removing
causal probabilistic entropy with symmetrical reversible logic and bringing
asymmetrical information logic. Entropy-information topological gap connects
asymmetrical logic with physical PERSON diffusion whose energy memorizes
logical Bit. Moving Bits selfform unit of information macroprocess attracting
new UP through free Information. Multiple UP triples adjoin hierarchical
network (IN) whose free information produces new UP at higher level node and
encodes triple code logic. Each UP unique position in IN hierarchy defines
location of each code logical structure. The IN node hierarchical level
classifies quality of assembled ORG. Ending IN node enfolds all IN
levels. Multiple INs enclose Observer cognition and intelligence with
consciousness.","Part 1 has studied the conversion of observed random process with its hidden
information to related dynamic process, applying entropy functional measure
(EF) of the random process and path functional information measure (ORG) of the
dynamic conversion process. The variation principle, satisfying the EF-IPF
equivalence along shortest path-trajectory, leads to information dual
complementary maxmin-minimax law, which creates mechanism of arising
information regularities from stochastic process(Lerner DATE). This Part CARDINAL
studies mechanism of cooperation of the observed multiple hidden information
process, which follows from the law and produces cooperative structures,
concurrently assembling in hierarchical information network (IN) and generating
the IN digital genetic code. We analyze the interactive information
contributions, information quality, inner time scale, information geometry of
the cooperative structures, evaluate curvature of these geometrical forms and
their cooperative information complexities. The law information mechanisms
operate in information observer. The observer, acting according the law,
selects random information, converts it in information dynamics, builds the IN
ORG, which generate the genetic code.",1
"The paper addresses design/building frameworks for some kinds of tree-like
and hierarchical structures of systems. The following approaches are examined:
(CARDINAL) expert-based procedures, (CARDINAL) hierarchical clustering; (CARDINAL) spanning problems
(e.g., minimum spanning tree, minimum PERSON tree, maximum leaf spanning tree
problem; (CARDINAL) design of organizational 'optimal' hierarchies; (CARDINAL) design of
multi-layer (e.g., CARDINAL-layer) k-connected network; (CARDINAL) modification of
hierarchies or networks: (i) modification of tree via condensing of neighbor
nodes, (ii) hotlink assignment, (iii) transformation of tree into ORG tree,
(iv) restructuring as modification of an initial structural solution into a
solution that is the most close to a goal solution while taking into account a
cost of the modification. Combinatorial optimization problems are considered as
basic ones (e.g., classification, knapsack problem, multiple choice problem,
assignment problem). Some numerical examples illustrate the suggested problems
and solving frameworks.","This paper describes the ORDINAL-order logical environment PERSON. Institutions
in general, and logical environments in particular, give equivalent
heterogeneous and homogeneous representations for logical systems. As such,
they offer a rigorous and principled approach to distributed interoperable
information systems via system consequence. Since PERSON is a particular logical
environment, this provides a rigorous and principled approach to distributed
interoperable ORDINAL-order information systems. The PERSON represents the
formalism and semantics of ORDINAL-order logic in a classification form. By using
an interpretation form, a companion approach defines the formalism and
semantics of ORDINAL-order logical/relational database systems. In a strict
sense, the CARDINAL forms have transformational passages (generalized inverses)
between one another. The classification form of ORDINAL-order logic in the PRODUCT
corresponds to ideas discussed in ORG (ORG). The
PERSON representation follows a conceptual structures approach, that is
completely compatible with formal concept analysis and information flow.",0
"This paper is an experimental exploration of the relationship between the
runtimes of Turing machines and the length of proofs in formal axiomatic
systems. We compare the number of halting Turing machines of a given size to
the number of provable theorems of ORDINAL-order logic of a given size, and the
runtime of the longest-running Turing machine of a given size to the proof
length of the most-difficult-to-prove theorem of a given size. It is suggested
that theorem provers are subject to the same non-linear tradeoff between time
and size as computer programs are, affording the possibility of determining
optimal timeouts and waiting times in automatic theorem proving. I provide the
statistics for some small choices of parameters for both of these systems.","We consider spacetimes with compact Cauchy hypersurfaces and with NORP
tensor bounded from below on the set of timelike unit vectors, and prove that
the results known for spacetimes satisfying the timelike convergence condition,
namely, foliation by ORG hypersurfaces, are also valid in the present
situation, if corresponding further assumptions are satisfied.
In addition we show that the volume of any sequence of spacelike
hypersurfaces, which run into the future singularity, decays to CARDINAL provided
there exists a time function covering a future end, such that the level
hypersurfaces have non-negative mean curvature and decaying volume.",0
"This work emphasizes that heterogeneity, diversity, discontinuity, and
discreteness in data is to be exploited in classification and regression
problems. A global a priori model may not be desirable. For data analytics in
cosmology, this is motivated by the variety of cosmological objects such as
elliptical, spiral, active, and merging galaxies at a wide range of redshifts.
Our aim is matching and similarity-based analytics that takes account of
discrete relationships in the data. The information structure of the data is
represented by a hierarchy or tree where the branch structure, rather than just
the proximity, is important. The representation is related to p-adic number
theory. The clustering or binning of the data values, related to the precision
of the measurements, has a central role in this methodology. If used for
regression, our approach is a method of cluster-wise regression, generalizing
nearest neighbour regression. Both to exemplify this analytics approach, and to
demonstrate computational benefits, we address the well-known photometric
redshift or `photo-z' problem, seeking to match PERSON Digital Sky Survey (ORG)
spectroscopic and photometric redshifts.","We begin by summarizing the relevance and importance of inductive analytics
based on the geometry and topology of data and information. Contemporary issues
are then discussed. These include how sampling data for representativity is
increasingly to be questioned. While we can always avail of analytics from a
""bag of tools and techniques"", in the application of machine learning and
predictive analytics, nonetheless we present the case for PERSON and
Benz\'ecri-based science of data, as follows. This is to construct bridges
between data sources and position-taking, and decision-making. There is summary
presentation of a few case studies, illustrating and exemplifying application
domains.",1
"PERSON for stable differentiation of piecewise-smooth functions are given.
The data are noisy values of these functions. The locations of discontinuity
points and the sizes of the jumps across these points are not assumed known,
but found stably from the noisy data.","In a geocentric kinematically rotating ecliptical coordinate system in
geodesic motion through the deformed spacetime of the ORG, both the longitude
of the ascending node CARDINAL\Omega$ and the inclination $I$ of an artificial
satellite of the spinning LOC are affected by the NORP
gravitoelectric ORG and gravitomagnetic ORG effects. By
choosing a circular orbit with $I = CARDINAL = CARDINAL for a potential new
spacecraft, which we propose to name ORG, it would be possible to measure
each of the gravitomagnetic precessions separately at a percent level, or,
perhaps, even better depending on the level of accuracy of the current and
future global ocean tide models since the competing classical long-term
perturbations on $I,~\Omega$ due to the even and odd zonal harmonics
$MONEY of the geopotential vanish. Moreover, a suitable
linear combination of $I,~\Omega$ would be able to cancel out the solid and
ocean tidal perturbations induced by the MONEY tide and, at the same time,
enforce the geodetic precessions yielding a secular trend of
$-8.3~\textrm{milliarcseconds~per~year}$, thus strengthening the goal of a
$\simeq CARDINAL test of the ORG effect recently proposed in the
literature in the case of an equatorial coordinate system. Relatively mild
departures $MONEY I = \Delta\Omega\simeq CARDINAL-0.1\deg$ from the ideal orbital
configuration with $I = \Omega = CARDINAL are allowed. [Abridged]",0
"In evolutionary algorithms, the fitness of a population increases with time
by mutating and recombining individuals and by a biased selection of more fit
individuals. The right selection pressure is critical in ensuring sufficient
optimization progress on the one hand and in preserving genetic diversity to be
able to escape from local optima on the other. We propose a new selection
scheme, which is uniform in the fitness values. It generates selection pressure
towards sparsely populated fitness regions, not necessarily towards higher
fitness, as is the case for all other selection schemes. We show that the new
selection scheme can be much more effective than standard selection schemes.","CARDINAL models of computer, a quantum and a classical ""chemical machine"" designed
to compute the relevant part of PERSON's factoring algorithm are discussed. The
comparison shows that the basic quantum features believed to be responsible for
the exponential speed-up of quantum computations possess their classical
counterparts for the hybrid digital-analog computer. It is argued that the
measurement errors which cannot be fully corrected make the computation not
efficient for both models.",0