text1 (string, lengths 4 to 124k) | text2 (string, lengths 3 to 149k) | same (int64, 0 or 1) |
---|---|---|
This paper describes an ORG method (called ORG) to generate predefined arbitrarily shaped CARDINAL-dimensional arrays of
cells by means of evolutionary techniques. It is based on a model of
development, whose key features are: i) the distinction between ``normal'' and
``driver'' cells, the latter being able to receive guidance from the genome;
ii) the implementation of proliferation/apoptosis events in such a way that
many cells are created/deleted at once, in order to speed up the morphogenetic
process; and iii) the presence in driver cells of an epigenetic memory that holds
the position of the cell in the cell lineage tree and represents the source of
differentiation during development. The experiments performed with a number of
100x100 black-and-white and colour target shapes (the horse, the couple, the
hand, the dolphin, the map of GPE, the foot, the frog, the baby, the
stomach, the NORP flag, the head) lead to the conclusion that the method
described is able to generate any target shape, outperforming any other known
method in terms of size and variety of the generated shapes. The interpretation
of the proposed method as a model of embryogenesis and its biological
implications are discussed. | Methods to find correlation among variables are of interest to many
disciplines, including statistics, machine learning, (big) data mining and
neurosciences. Parameters that measure correlation between CARDINAL variables are of
limited utility when used with multiple variables. In this work, I propose a
simple criterion to measure correlation among an arbitrary number of variables,
based on a data set. The central idea is to i) design a function of the
variables that can take different forms depending on a set of parameters, ii)
calculate the difference between a statistic associated with the function,
computed on the data set, and the same statistic computed on a randomised
version of the data set (the "scrambled" data set), and iii) optimise the
parameters to maximise this difference. Many such functions can be organised in
layers, which can in turn be stacked CARDINAL on top of the other, forming a neural
network. The function parameters are searched with an enhanced genetic
algorithm called POET, and the resulting method is tested on a cancer gene data
set. The method may have potential implications for some issues that affect the
field of neural networks, such as overfitting, the need to process huge amounts
of data for training and the presence of "adversarial examples". | 1 |
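The scramble-and-compare idea in the correlation abstract above can be sketched in a few lines. This is a minimal illustration, not the paper's method: the parametric function (a tanh of a linear combination), the chosen statistic (variance), and the synthetic data are all assumptions made for brevity, and the genetic search over parameters is omitted.

```python
import numpy as np

def scramble(data, rng):
    # Shuffle each column independently: marginals are preserved,
    # but all correlations between the variables are destroyed.
    return np.column_stack([rng.permutation(col) for col in data.T])

def criterion(params, data, rng):
    # A parametric function of all variables (illustrative choice).
    f = lambda X: np.tanh(X @ params)
    # Difference between a statistic (variance) of f on the real data and
    # on the scrambled data; large only if f captures joint structure.
    return np.var(f(data)) - np.var(f(scramble(data, rng)))

rng = np.random.default_rng(0)
x = rng.normal(size=1000)
data = np.column_stack([x, x + 0.1 * rng.normal(size=1000)])  # two correlated variables
good = criterion(np.array([1.0, 1.0]), data, rng)  # direction aligned with the correlation
```

Maximising this difference over the parameters (the abstract uses a genetic algorithm for the search) pushes the function toward structure that exists only in the joint distribution.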
PERSON's original definition of default logic allows for the application of a
default that contradicts a previously applied one. We call this condition
failure. The possibility of generating failures has in the past been
considered a semantic problem, and variants have been proposed to solve
it. We show that it is instead a computational feature that is needed to encode
some domains into default logic. | We show that the separability of states in ORG has a close
counterpart in classical physics, and that conditional mutual information
(a.k.a. conditional information transmission) is a very useful quantity in the
study of both quantum and classical separabilities. We also show how to define
entanglement of formation in terms of conditional mutual information. This
paper lays the theoretical foundations for a sequel paper which will present a
computer program that can calculate a decomposition of any separable quantum or
classical state. | 0 |
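Conditional mutual information, the central quantity in the separability abstract above, is straightforward to compute for small classical distributions. A minimal sketch; the distributions and variable sizes are illustrative assumptions:

```python
import numpy as np

def cond_mutual_info(p):
    # I(X;Y|Z) in bits from a joint distribution table p[x, y, z].
    p = p / p.sum()
    pz = p.sum(axis=(0, 1))          # p(z)
    pxz = p.sum(axis=1)              # p(x, z)
    pyz = p.sum(axis=0)              # p(y, z)
    total = 0.0
    for x, y, z in np.ndindex(p.shape):
        if p[x, y, z] > 0:
            total += p[x, y, z] * np.log2(
                p[x, y, z] * pz[z] / (pxz[x, z] * pyz[y, z]))
    return total

# Two uniform bits X, Y with Z = X XOR Y: X and Y are marginally
# independent, yet conditioning on Z makes them fully correlated.
xor = np.zeros((2, 2, 2))
for x in (0, 1):
    for y in (0, 1):
        xor[x, y, (x + y) % 2] = 0.25

# X = Y = Z uniform bit: given Z, both are fixed, so I(X;Y|Z) = 0.
copy_state = np.zeros((2, 2, 2))
copy_state[0, 0, 0] = copy_state[1, 1, 1] = 0.5
```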
The article describes an investigation of the effectiveness of genetic
algorithms for multi-objective combinatorial optimization (MOCO) by presenting
an application for the vehicle routing problem with soft time windows. The work
is motivated by the question of whether and how the problem structure influences the
effectiveness of different configurations of the genetic algorithm.
Computational results are presented for different classes of vehicle routing
problems, varying in their coverage with time windows, time window size,
distribution and number of customers. The results are compared with a simple
but effective local search approach for multi-objective combinatorial
optimization problems. | Non deterministic applications arise in many domains, including stochastic
optimization, multi-objectives optimization, stochastic planning, contingent
stochastic planning, reinforcement learning, reinforcement learning in
partially observable PERSON decision processes, and conditional planning. We
present a logic programming framework called non deterministic logic programs,
along with a declarative semantics and fixpoint semantics, to allow
representing and reasoning about inherently non deterministic real-world
applications. The language of non deterministic logic programs framework is
extended with non-monotonic negation, and CARDINAL alternative semantics are
defined: the stable non deterministic model semantics and the well-founded non
deterministic model semantics; the relationship between them is also studied. These
semantics subsume the deterministic stable model semantics and the
deterministic well-founded semantics of deterministic normal logic programs,
and they reduce to the semantics of deterministic definite logic programs
without negation. We show the application of the non deterministic logic
programs framework to a conditional planning problem. | 0 |
Deep neural networks (DNN) are the state of the art on many engineering
problems such as computer vision and audition. A key factor in the success of
the DNN is scalability - bigger networks work better. However, the reason for
this scalability is not yet well understood. Here, we interpret the DNN as a
discrete system, of ORG filters followed by nonlinear activations, that is
subject to the laws of sampling theory. In this context, we demonstrate that
over-sampled networks are more selective, learn faster and learn more robustly.
Our findings may ultimately generalize to the human brain. | Regularisation of deep neural networks (DNN) during training is critical to
performance. By far the most popular method is known as dropout. Here, cast
through the prism of signal processing theory, we compare and contrast the
regularisation effects of dropout with those of dither. We illustrate some
serious inherent limitations of dropout and demonstrate that dither provides a
more effective regulariser. | 1 |
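The two regularisers contrasted above are easy to state side by side. A minimal sketch, not the paper's experiments; the dropout rate, dither scale, and test signal are arbitrary assumptions:

```python
import numpy as np

def dropout(a, rate, rng):
    # Multiplicative noise: zero a fraction `rate` of activations and
    # rescale the survivors so the expected value is unchanged.
    mask = rng.random(a.shape) >= rate
    return a * mask / (1.0 - rate)

def dither(a, scale, rng):
    # Additive noise: perturb every activation by a small uniform
    # amount, the classic signal-processing regulariser.
    return a + rng.uniform(-scale, scale, size=a.shape)

rng = np.random.default_rng(0)
a = np.ones(10000)
dropped = dropout(a, 0.5, rng)
dithered = dither(a, 0.1, rng)
```

Both perturbations preserve the mean activation, but dropout injects far larger per-unit variance than low-level dither, which is the kind of contrast the signal-processing framing above examines.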
Importance sampling and PERSON sampling (of which GPE sampling
is a special case) are CARDINAL methods commonly used to sample multi-variate
probability distributions (that is, NORP networks). Heretofore, the
sampling of NORP networks has been done on a conventional "classical
computer". In this paper, we propose methods for doing importance sampling and
PERSON sampling of a classical NORP network on a quantum
computer. | In spite of all {\bf no-go} PERSON (e.g., PERSON, GPE and
NORP,..., ORG,...) we constructed a realist basis of quantum mechanics. In
our model, both classical and quantum spaces are rough images of the
fundamental {\bf prespace}. ORG mechanics cannot be reduced to the classical
one. Both classical and quantum representations induce reductions of prespace
information. | 0 |
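The conventional-computer baseline in the sampling abstract above (importance sampling of a NORP network) can be illustrated with likelihood weighting on a toy two-node network. The network, its probabilities, and the sample count are assumptions chosen for illustration:

```python
import random

# Toy two-node network A -> B (probabilities are illustrative):
#   P(A=1) = 0.3;  P(B=1 | A=1) = 0.9;  P(B=1 | A=0) = 0.2.
P_A1 = 0.3
P_B1_GIVEN_A = {1: 0.9, 0: 0.2}

def likelihood_weighting(b_evidence, n, seed=0):
    # Importance sampling with the evidence clamped: sample A from its
    # prior and weight each sample by the likelihood P(B=b_evidence | A).
    rng = random.Random(seed)
    num = den = 0.0
    for _ in range(n):
        a = 1 if rng.random() < P_A1 else 0
        p_b1 = P_B1_GIVEN_A[a]
        w = p_b1 if b_evidence == 1 else 1.0 - p_b1
        num += w * a
        den += w
    return num / den  # estimate of P(A=1 | B=b_evidence)

est = likelihood_weighting(1, 200_000)
# Exact posterior: 0.3*0.9 / (0.3*0.9 + 0.7*0.2) = 27/41, about 0.659
```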
Since PERSON, finite automata theory has been inspired by physics, in
particular by ORG complementarity. We review automaton complementarity,
reversible automata and the connections to generalized urn models. Recent
developments in quantum information theory may have appropriate formalizations
in the GPE context. | Gray (DATE) argued that the GPE paradox (PERSON) is a misnomer, and it is not a
valid paradox. Gray also speculated that the argument was misattributed to
GPE, whose lunchtime remarks did not pertain to the existence of
extraterrestrial intelligence, but to the feasibility of interstellar travel.
Instead, the paradox is ascribed to LAW, and it is further
suggested that the paradox is not a real problem or research subject and should
not be used in debates about ORG projects. The arguments given are
unpersuasive, ahistorical, and, in CARDINAL instance, clearly hinge on a
literalistic and uncharitable reading of evidence. Instead, I argue the
following CARDINAL points: (i) Contrary to Gray's assertion, the historical issue
of naming of ideas or concepts is completely divorced from their epistemic
status. (ii) PERSON is easily and smoothly generalized into EVENT
paradox, so it makes no sense either theoretically or empirically to separate
the CARDINAL. (iii) In sharp contrast to the main implication of PERSON's paper, ORG
has become more aggravated lately due to advances in astrobiology. | 0 |
In the framework of the emergent gravity scenario by PERSON, it was
recently observed by PERSON and PERSON that, among other things, an anomalous
pericenter precession would affect the orbital motion of a test particle
orbiting an isolated central body. Here, it is shown that, if it were real, its
expected magnitude for the inner planets of the Solar System would be at the
same level of the present-day accuracy in constraining any possible deviations
from their standard perihelion precessions as inferred from long data records
spanning DATE. The most favorable situation for testing the
Verlinde-type precession seems to occur for LOC. Indeed, according to recent
versions of the ORG and PERSON planetary ephemerides, non-standard perihelion
precessions, of whatsoever physical origin, which are larger than some $GPE
CARDINAL-0.11$ milliarcseconds per century are not admissible, while the putative
precession predicted by PERSON and PERSON amounts to MONEY milliarcseconds per
century. Other potentially interesting astronomical and astrophysical scenarios
like, e.g., the LOC's LOC artificial satellite, the double pulsar
system PERSON/B and the S-stars orbiting ORG
in GPE A$^\ast$ are, instead, not viable because of the excessive smallness of
the predicted effects for them. | General formulas of entanglement concentration are derived by using an
information-spectrum approach for the i.i.d. sequences and the general
sequences of partially entangled pure states. That is, we derive general
relations between the performance of the entanglement concentration and the
eigenvalues of the partially traced state. The achievable rates with constant
constraints and those with exponential constraints can be calculated from these
formulas. | 0 |
This is a collection of linguistic-mathematical approaches to NORP rebus,
puzzles, poetical and juridical texts, and proposes fancies, recreational math
problems, and paradoxes. We study the frequencies of letters, syllables, vowels
in various poetry, grill definitions in rebus, and rebus rules. We also compare
the scientific language, poetical language, and puzzle language, and compute
the FAC entropy and NORP informational energy. | In order to more accurately situate and fit the neutrosophic logic into the
framework of nonstandard analysis, we present the neutrosophic inequalities,
neutrosophic equality, neutrosophic infimum and supremum, neutrosophic standard
intervals, including the cases when the neutrosophic logic standard and
nonstandard components T, I, F get values outside of the classical real unit
interval [CARDINAL, CARDINAL], and a brief evolution of neutrosophic operators. The paper
intends to answer ORG criticism, which we found beneficial for better
understanding the nonstandard neutrosophic logic, although the nonstandard
neutrosophic logic was never used in practical applications. | 1 |
Whereas the research program of the measurement of scientific communications
emerged in a context where the delineations among academia, government, and
industry were institutionalized, the systemic development of these relations
during DATE has changed the system of reference
for the evaluation of research. In a knowledge-based economy science fulfills
functions that change the definitions of what is considered research, and
globalization has changed the relevance of national systems of reference.
Science, of course, has been internationally oriented from its very beginning,
but the entrainment of the research process in these global developments is
reflected in the research evaluation and the scientometric measurement. In
other words, the systems under study have become more complex. A complex
dynamics can analytically be decomposed in several subdynamics. The evolving
systems and subsystems communicate in different dimensions and the evaluation
has become part of the codification of these communications. | CARDINAL steps aid in the analysis of selection. ORDINAL, describe phenotypes by
their component causes. Components include genes, maternal effects, symbionts,
and any other predictors of phenotype that are of interest. ORDINAL, describe
fitness by its component causes, such as an individual's phenotype, its
neighbors' phenotypes, resource availability, and so on. ORDINAL, put the
predictors of phenotype and fitness into an exact equation for evolutionary
change, providing a complete expression of selection and other evolutionary
processes. The complete expression separates the distinct causal roles of the
various hypothesized components of phenotypes and fitness. Traditionally, those
components are given by the covariance, variance, and regression terms of
evolutionary models. I show how to interpret those statistical expressions with
respect to information theory. The resulting interpretation allows one to read
the fundamental equations of selection and evolution as sentences that express
how various causes lead to the accumulation of information by selection and the
decay of information by other evolutionary processes. The interpretation in
terms of information leads to a deeper understanding of selection and
heritability, and a clearer sense of how to formulate causal hypotheses about
evolutionary process. Kin selection appears as a particular type of causal
analysis that partitions social effects into meaningful components. | 0 |
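The "exact equation for evolutionary change" with covariance and transmission terms described above is the Price equation; a minimal numerical check, with toy fitness and phenotype vectors as assumptions:

```python
import numpy as np

def price_equation(w, z, z_prime):
    # Delta z_bar = Cov(w, z) / w_bar            (selection term)
    #             + E[w * (z' - z)] / w_bar      (transmission/other processes).
    w_bar = w.mean()
    selection = np.cov(w, z, bias=True)[0, 1] / w_bar
    transmission = np.mean(w * (z_prime - z)) / w_bar
    return selection, transmission

w = np.array([1.0, 2.0, 3.0])       # fitness of three parents
z = np.array([0.0, 1.0, 2.0])       # parental phenotype
z_prime = z.copy()                  # offspring phenotype: perfect transmission
sel, trans = price_equation(w, z, z_prime)
```

With perfect transmission the whole change in mean phenotype comes from the covariance (selection) term, matching the fitness-weighted mean computed directly.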
How best to quantify the information of an object, whether natural or
artifact, is a problem of wide interest. A related problem is the computability
of an object. We present practical examples of a new way to address this
problem. By giving an appropriate representation to our objects, based on a
hierarchical coding of information, we exemplify how it is remarkably easy to
compute complex objects. Our algorithmic complexity is related to the length of
the class of objects, rather than to the length of the object. | The concept of {\em complexity} (as a quantity) has been plagued by numerous
contradictory and confusing definitions. By explicitly recognising a role for
the observer of a system, an observer that attaches meaning to data about the
system, these contradictions can be resolved, and the numerous complexity
measures that have been proposed can be seen as cases where different observers
are relevant, and/or being proxy measures that loosely scale with complexity,
but are easy to compute from the available data. Much of the epistemic
confusion in the subject can be squarely placed at science's tradition of
removing the observer from the description in order to guarantee {\em
objectivity}. Explicitly acknowledging the role of the observer helps untangle
other confused subject areas. PERSON is a topic about which much ink
has been spilt, but it can be understood easily as an irreducibility between
description space and meaning space. ORG can also be understood
as a theory of observation. The success in explaining quantum mechanics leads
one to conjecture that all of physics may be reducible to properties of the
observer. And indeed, what are the necessary (as opposed to contingent)
properties of an observer? This requires a full theory of consciousness, which
we are a long way from obtaining. However, where progress does appear to
have been made, e.g. PERSON's {\em PERSON}, a
recurring theme of self-observation is a crucial ingredient. | 0 |
This paper proposes a new mechanism for pruning a search game-tree in
computer chess. The algorithm stores and then reuses chains or sequences of
moves, built up from previous searches. These move sequences have a built-in
forward-pruning mechanism that can radically reduce the search space. A typical
search process might retrieve a move from FAC, where the
decision of what move to retrieve would be based on the position itself. This
algorithm stores move sequences based on what previous sequences were better,
or caused cutoffs. This is therefore position independent and so it could also
be useful in games with imperfect information or uncertainty, where the whole
situation is not known at any CARDINAL time. Over a small set of tests, the
algorithm was shown to clearly out-perform Transposition Tables, both in terms
of search reduction and game-play results. | This paper reconsiders the problem of the absent-minded driver who must
choose between alternatives with different payoff with imperfect recall and
varying degrees of knowledge of the system. The classical absent-minded driver
problem represents the case with limited information and it has bearing on the
general area of communication and learning, social choice, mechanism design,
auctions, theories of knowledge, belief, and rational agency. Within the
framework of extensive games, this problem has applications to many artificial
intelligence scenarios. It is obvious that the performance of the agent
improves as information available increases. It is shown that a non-uniform
assignment strategy for successive choices does better than a fixed probability
strategy. We consider both classical and quantum approaches to the problem. We
argue that the superior performance of quantum decisions with access to
entanglement cannot be fairly compared to a classical algorithm. If the
cognitive systems of agents are taken to have access to quantum resources, or
have a quantum mechanical basis, then that can be leveraged into superior
performance. | 0 |
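The fixed-probability version of the absent-minded driver discussed above can be worked out directly. This sketch assumes the classical payoffs (0 for exiting at the first intersection, 4 for exiting at the second, 1 for continuing past both):

```python
import numpy as np

def expected_payoff(p, exit_first=0.0, exit_second=4.0, continue_both=1.0):
    # Imperfect recall: the driver cannot distinguish the two
    # intersections, so the same continue-probability p applies at both.
    return ((1 - p) * exit_first
            + p * (1 - p) * exit_second
            + p * p * continue_both)

ps = np.linspace(0.0, 1.0, 100_001)
best_p = ps[np.argmax(expected_payoff(ps))]   # maximiser on a fine grid
```

Under these payoffs the optimum is p = 2/3 with expected value 4/3, strictly better than, say, the fifty-fifty strategy; the abstract's point is that richer information or non-uniform choices improve on this baseline.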
This is a review of "WORK_OF_ART", by PERSON. | Causal models defined in terms of a collection of equations, as defined by
PRODUCT, are axiomatized here. Axiomatizations are provided for CARDINAL
successively more general classes of causal models: (CARDINAL) the class of recursive
theories (those without feedback), (CARDINAL) the class of theories where the
solutions to the equations are unique, (CARDINAL) arbitrary theories (where the
equations may not have solutions and, if they do, they are not necessarily
unique). It is shown that to reason about causality in the most general ORDINAL
class, we must extend the language used by GPE and GPE. In addition, the
complexity of the decision procedures is characterized for all the languages
and classes of models considered. | 1 |
This paper discusses some formal properties of ORG devices necessary for the
implementation of a nondeterministic Turing machine. | The essential graph is a distinguished member of a PERSON equivalence class
of ORG chain graphs. However, the directed edges in the essential graph are not
necessarily strong or invariant, i.e. they may not be shared by every member of
the equivalence class. Likewise for the undirected edges. In this paper, we
develop a procedure for identifying which edges in an essential graph are
strong. We also show how this makes it possible to bound some causal effects
when the true chain graph is unknown. | 0 |
This paper presents a theory of error in cross-validation testing of
algorithms for predicting real-valued attributes. The theory justifies the
claim that predicting real-valued attributes requires balancing the conflicting
demands of simplicity and accuracy. Furthermore, the theory indicates precisely
how these conflicting demands must be balanced, in order to minimize
cross-validation error. A general theory is presented, then it is developed in
detail for ORG regression and instance-based learning. | Many ORG researchers and cognitive scientists have argued that analogy is the
core of cognition. The most influential work on computational modeling of
analogy-making is WORK_OF_ART (ORG) and its implementation in the
WORK_OF_ART (ORG). A limitation of ORG is the requirement for
complex hand-coded representations. We introduce ORG (ORG), which combines ideas from ORG and ORG
(ORG) in order to remove the requirement for hand-coded representations. ORG
builds analogical mappings between lists of words, using a large corpus of raw
text to automatically discover the semantic relations among the words. We
evaluate ORG on a set of CARDINAL analogical mapping problems, CARDINAL based on
scientific analogies and CARDINAL based on common metaphors. ORG achieves
human-level performance on the CARDINAL problems. We compare ORG with a variety
of alternative approaches and find that they are not able to reach the same
level of performance. | 1 |
Let $u_t = u_{xx} - q(x) u$, $0 \leq x \leq 1$, $t > 0$, $u(0, t) = 0$, $u(1, t) =
a(t)$, $u(x,0) = 0$, where $a(t)$ is a given function vanishing for $t>T$, $a(t)
\not\equiv 0$, $\int^T_0 a(t) dt < \infty$. Suppose one measures the flux
$u_x(0,t) := b_0 (t)$ for all $t>0$. Does this information determine $q(x)$
uniquely? Do the measurements of the flux $u_x(1,t) := b(t)$ give more
information about $q(x)$ than $b_0(t)$ does?
The above questions are answered in this paper. | A mathematically rigorous inversion method is developed to recover compactly
supported potentials from the fixed-energy scattering data in CARDINAL dimensions.
Error estimates are given for the solution.
An algorithm for inversion of noisy discrete fixed-energy CARDINALD scattering data
is developed and its error estimates are obtained | 1 |
If a system falls through a black hole horizon, then its information is lost
to an observer at infinity. But we argue that the {\it accessible} information
is lost {\it before} the horizon is crossed. The temperature of the hole limits
information carrying signals from a system that has fallen too close to the
horizon. Extremal holes have T=0, but there is a minimum energy required to
emit a quantum in the short proper time left before the horizon is crossed. If
we attempt to bring the system back to infinity for observation, then
acceleration radiation destroys the information. All CARDINAL considerations give
a critical distance from the horizon $d\sim \sqrt{r_H\over \Delta E}$, where
$r_H$ is the horizon radius and $\Delta E$ is the energy scale characterizing
the system. For systems in string theory where we pack information as densely
as possible, this acceleration constraint is found to have a geometric
interpretation. These estimates suggest that in theories of gravity we should
measure information not as a quantity contained inside a given system, but in
terms of how much of that information can be reliably accessed by another
observer. | We argue that bound states of branes have a size that is of the same order as
the horizon radius of the corresponding black hole. Thus the interior of a
black hole is not `empty space with a central singularity', and Hawking
radiation can pick up information from the degrees of freedom of the hole. | 1 |
PERSON equilibrium is the most commonly-used notion of equilibrium in game
theory. However, it suffers from numerous problems. Some are well known in the
game theory community; for example, the ORG equilibrium of repeated prisoner's
dilemma is neither normatively nor descriptively reasonable. However, new
problems arise when considering PERSON equilibrium from a computer science
perspective: for example, PERSON equilibrium is not robust (it does not tolerate
``faulty'' or ``unexpected'' behavior), it does not deal with coalitions, it
does not take computation cost into account, and it does not deal with cases
where players are not aware of all aspects of the game. Solution concepts that
try to address these shortcomings of ORG equilibrium are discussed. | The original ORG definition of causality [Halpern and GPE, DATE]
was updated in the journal version of the paper [PRODUCT and GPE, DATE] to
deal with some problems pointed out by PERSON and PERSON [DATE]. Here the
definition is modified yet again, in a way that (a) leads to a simpler
definition, (b) handles the problems pointed out by PERSON and GPE, and many
others, (c) gives reasonable answers (that agree with those of the original and
updated definition) in the standard problematic examples of causality, and (d)
has lower complexity than either the original or updated definitions. | 1 |
This paper verifies a result of [Shenoy:94] concerning the graphoidal structure
of Shenoy's notion of independence for ORG theory of belief
functions. Shenoy proved that his notion of independence has graphoidal
properties for positive normal valuations.
The requirement of strictly positive normal valuations as a prerequisite for
the application of graphoidal properties excludes a wide class of ORG belief
functions. It excludes especially so-called probabilistic belief functions. It
is demonstrated that the requirement of positiveness of valuation may be
weakened: it may instead be required that the commonality function is non-zero
for singleton sets, and the graphoidal properties for independence of
belief function variables are then preserved. This means especially that
probabilistic belief functions with all singleton sets as focal points possess
graphoidal properties for independence. | In previous papers, we expressed ORG in terms of
ORG (ORG). In this brief paper, we express ORG in terms of ORG. | 0 |
We review the application of the critical point large N_f self-consistency
method to ORG. In particular we derive the O(1/N_f) d-dimensional critical
exponents whose epsilon-expansion determines the perturbative coefficients in
MSbar of the field dimensions, beta-function and various twist-2 operators
which occur in the operator product expansion of deep inelastic scattering. | The leading order coefficients of the beta-function of ORG are computed in a
large N_f expansion. They are in agreement with the CARDINAL loop ORG
calculation. The method involves computing the anomalous dimension of the
operator (G^2_{mu nu})^2 at the d-dimensional fixed point in the NORP
Thirring model, to which ORG is equivalent in this limit. The effect the
O(1/N_f) corrections have on the location of the infrared stable fixed point
for a range of N_f is also examined. | 1 |
Unlike classical information, ORG knowledge is restricted to the outcome
of measurements of maximal observables corresponding to single contexts. | Some physical aspects related to the limit operations of the ORG lamp are
discussed. Regardless of the formally unbounded and even infinite number of
"steps" involved, the physical limit has an operational meaning in agreement
with the PRODUCT sums of infinite series. The formal analogies to accelerated
(hyper-) computers and the recursion theoretic diagonal methods are discussed.
As ORG information is not bound by the mutually exclusive states of
classical bits, it allows a consistent representation of fixed point states of
the diagonal operator. In an effort to reconstruct the self-contradictory
feature of diagonalization, a generalized diagonal method allowing no quantum
fixed points is proposed. | 1 |
It is shown in the ORDINAL part of this paper that a combined model comprising
ordinary and quintessential matter can support a traversable wormhole in
Einstein-Maxwell gravity. Since the solution allows CARDINAL tidal forces, the
wormhole is suitable for a humanoid traveler. The ORDINAL part of the paper
shows that the electric field can be eliminated (Einstein gravity), but only by
tolerating enormous tidal forces. Such a wormhole would still be capable of
transmitting signals. | It has been shown that a noncommutative-geometry background may be able to
support traversable wormholes. This paper discusses the possible detection of
such wormholes in the outer regions of galactic halos by means of gravitational
lensing. The procedure allows a comparison to other models such as the
ORG-Frenk-White model and PERSON modified gravity, and is likely to favor a
model based on noncommutative geometry. | 1 |
A general expression of the axial-vector current is presented, in which both
the effects of the chiral symmetry breaking and the spontaneous chiral symmetry
breaking are included. A new resonance formula of the axial-vector meson is
derived and in the limit of $q^{2}\rightarrow 0$ this formula doesn't go back
to the ``chiral limit''. The studies show that the dominance of the
axial-vector meson is associated with the satisfaction of PCAC. The dominance
of pion exchange is accompanied by the strong anomaly of ORG. | A correction of the low energy theorem of the $\gamma\to3\pi$ GPE has been
found. $A_{3\pi}(0,0,0)$ and the cross section are calculated. Theory agrees with
data. There is no new adjustable parameter. | 1 |
A PERSON simulation based on O(alpha_s) QCD matrix elements matched to
parton showers shows that final-state hadrons in deep inelastic scattering
(ORG) can be used to tag events with a single (anti)quark recoiled against the
proton. The method is particularly suited to study the mean charge of leading
particles, which is sensitive to fragmentation and sea quark contribution to
the proton structure function. We also discuss methods to study the charm
production in ORG using the Breit frame. | We propose a data format for PERSON (MC) events, or any structural data,
including experimental data, in a compact binary form using variable-size
integer encoding as implemented in the Google's Protocol Buffers package. This
approach is implemented in the so-called ProMC library which produces smaller
file sizes for MC records compared to the existing input-output libraries used
in high-energy physics (ORG). Other important features are a separation of
abstract data layouts from concrete programming implementations,
self-description and random access. Data stored in FAC files can be written,
read and manipulated in a number of programming languages, such as C++, PERSON and
Python. | 1 |
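The variable-size integer encoding from Google's Protocol Buffers package mentioned above is the base-128 varint. A minimal standalone sketch of that scheme (not the ProMC library API):

```python
def encode_varint(n):
    # Protocol Buffers base-128 varint: 7 payload bits per byte,
    # high bit set on every byte except the last.
    out = bytearray()
    while True:
        byte = n & 0x7F
        n >>= 7
        if n:
            out.append(byte | 0x80)
        else:
            out.append(byte)
            return bytes(out)

def decode_varint(data):
    # Reassemble 7-bit groups, least-significant group first.
    n = shift = 0
    for byte in data:
        n |= (byte & 0x7F) << shift
        if not byte & 0x80:
            return n
        shift += 7

small = encode_varint(3)      # 1 byte instead of a fixed-width 4 or 8
big = encode_varint(300)      # 2 bytes
```

Small values dominate typical event records, so this encoding is what makes the files compact relative to fixed-width formats.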
Probability answer set programming is a declarative programming paradigm that
has been shown effective for representing and reasoning about a variety of probability
reasoning tasks. However, the lack of probability aggregates, e.g. {\em
expected values}, in the language of disjunctive hybrid probability logic
programs (ORG) disallows the natural and concise representation of many
interesting problems. In this paper, we extend ORG to allow arbitrary
probability aggregates. We introduce CARDINAL types of probability aggregates: a
type that computes the expected value of a classical aggregate, e.g., the
expected value of the minimum, and a type that computes the probability of a
classical aggregate, ORG, the probability of the sum of values. In addition, we
define a probability answer set semantics for ORG with arbitrary probability
aggregates including monotone, antimonotone, and nonmonotone probability
aggregates. We show that the proposed probability answer set semantics of ORG
subsumes both the original probability answer set semantics of ORG and the
classical answer set semantics of classical disjunctive logic programs with
classical aggregates, and consequently subsumes the classical answer set
semantics of the original disjunctive logic programs. We show that the proposed
probability answer sets of ORG with probability aggregates are minimal
probability models and hence incomparable, which is an important property for
nonmonotonic probability reasoning. | Ageing of publications, percentage of self-citations, and impact vary from
journal to journal within fields of science. The assumption that citation and
publication practices are homogenous within specialties and fields of science
is invalid. Furthermore, the delineation of fields and specialties is
fuzzy. Institutional units of analysis and persons may move between fields or
span different specialties. The match between the citation index and
institutional profiles varies among institutional units and nations. The
respective matches may heavily affect the representation of the units. Non-ISI
journals are increasingly cornered into "transdisciplinary" PRODUCT functions
with the exception of specialist journals publishing in languages other than
LANGUAGE. An "externally cited impact factor" can be calculated for these
journals. The citation impact of non-ISI journals will be demonstrated using
WORK_OF_ART as the example. | 0 |
Science and mathematics help people to better understand the world, eliminating
different fallacies and misconceptions. CARDINAL such misconception is related to
arithmetic, which is so important both for science and everyday life. People
think that their counting is governed by the rules of the conventional
arithmetic and that other kinds of arithmetic do not exist and cannot exist. It
is demonstrated in this paper that this popular image of the situation with
integer numbers is incorrect. In many situations, we have to utilize different
rules of counting and operating. This is a consequence of the existing
diversity in nature and society, and to represent this diversity correctly,
people have to utilize different arithmetics. To distinguish them, we call the
conventional arithmetic PERSON, while other arithmetics are called
NORP. Theory of NORP arithmetics is developed in the book
of the author "Non-Diophantine arithmetics or is it possible that CARDINAL + 2 is not
equal to CARDINAL." In this work, some properties of NORP arithmetics are
considered, and their connections to numerical computations and contemporary
physics are explained.
human-machine interaction is developed in the context of formal grammars and
languages. A new type of formal grammars, called grammars with prohibition, is
introduced. Grammars with prohibition provide more powerful tools for natural
language generation and better describe processes of language learning than the
conventional formal grammars. Here we study relations between languages
generated by different grammars with prohibition based on conventional types of
formal grammars such as context-free or context sensitive grammars. Besides, we
compare languages generated by different grammars with prohibition and
languages generated by conventional formal grammars. In particular, it is
demonstrated that the former have essentially higher computational power and
expressive possibilities in comparison with the conventional formal grammars.
Thus, while conventional formal grammars are recursive and subrecursive
algorithms, many classes of grammars with prohibition are superrecursive
algorithms. Results presented in this work are aimed at the development of
human-machine interaction, modeling natural languages, empowerment of
programming languages, computer simulation, better software systems, and theory
of recursion. | 1 |
The "SP theory of intelligence", with its realisation in the "SP computer
model", aims to simplify and integrate observations and concepts across
AI-related fields, with information compression as a unifying theme. This paper
describes how abstract structures and processes in the theory may be realised
in terms of neurons, their interconnections, and the transmission of signals
between neurons. This part of the NORP theory -- "SP-neural" -- is a tentative
and partial model for the representation and processing of knowledge in the
brain. In the NORP theory (apart from NORP-neural), all kinds of knowledge are
represented with "patterns", where a pattern is an array of atomic symbols in
CARDINAL or CARDINAL dimensions. In SP-neural, the concept of a "pattern" is realised as
an array of neurons called a "pattern assembly", similar to Hebb's concept of a
"cell assembly" but with important differences. Central to the processing of
information in the NORP system is the powerful concept of "multiple alignment",
borrowed and adapted from bioinformatics. Processes such as pattern
recognition, reasoning and problem solving are achieved via the building of
multiple alignments, while unsupervised learning -- significantly different
from EVENT -- is achieved by creating patterns from
sensory information and also by creating patterns from multiple alignments in
which there is a partial match between CARDINAL pattern and another. Short-lived
neural structures equivalent to multiple alignments will be created via an
inter-play of excitatory and inhibitory neural signals. The paper discusses
several associated issues, with relevant empirical evidence. | This paper examines common assumptions regarding the decision-making internal
environment for intelligent agents and investigates issues related to
processing of memory and belief states to help obtain better understanding of
the responses. In specific, we consider order effects and discuss both
classical and non-classical explanations for them. We also consider implicit
cognition and explore if certain inaccessible states may be best modeled as
quantum states. We propose that the hypothesis that quantum states are at the
basis of order effects be tested on large databases such as those related to
medical treatment and drug efficacy. A problem involving a maze network is
considered and comparisons made between classical and quantum decision
scenarios for it. | 0 |
We introduce operational semantics into games. And based on the operational
semantics, we establish a full algebra of games, including basic algebra of
games, algebra of concurrent games, recursion and abstraction. The algebra can
be used widely to reason on the behaviors of systems (not only computational
systems) with game theory supported. | ORG to traditionally accurate computing, approximate computing focuses
on the rapidity of the satisfactory solution, but not the unnecessary accuracy
of the solution. Approximate bisimilarity is the approximate counterpart of
traditionally accurate bisimilarity. Based on the work of distances between
basic processes, we propose an algebraic approach for distances between
processes to support a whole process calculus ORG, which contains prefix, sum,
composition, restriction, relabeling and recursion. | 1 |
We study the problem of deciding whether some PSPACE-complete problems have
models of bounded size. Contrary to problems in ORG, models of ORG-complete
problems may be exponentially large. However, such models may take polynomial
space in a succinct representation. For example, the models of a ORG are
explicitly represented by and-or trees (which are always of exponential size)
but can be succinctly represented by circuits (which can be polynomial or
exponential). We investigate the complexity of deciding the existence of such
succinct models when a bound on size is given. | We analyze the computational complexity of problems related to case-based
planning: planning when a plan for a similar instance is known, and planning
from a library of plans. We prove that planning from a single case has the same
complexity than generative planning (i.e., planning "from scratch"); using an
extended definition of cases, complexity is reduced if the domain stored in the
case is similar to the one to search plans for. Planning from a library of
cases is shown to have the same complexity. In both cases, the complexity of
planning remains, in the worst case, PSPACE-complete. | 1 |
I provide an alternative way of seeing quantum computation. ORDINAL, I describe
an idealized classical problem solving machine that, thanks to a many body
interaction, reversibly and nondeterministically produces the solution of the
problem under the simultaneous influence of all the problem constraints. This
requires a perfectly accurate, rigid, and reversible relation between the
coordinates of the machine parts - the machine can be considered the many body
generalization of another perfect machine, the bouncing ball model of
reversible computation. The mathematical description of the machine, as it is,
is applicable to ORG problem solving, an extension of the quantum
algorithms that comprises the physical representation of the problem-solution
interdependence. The perfect relation between the coordinates of the machine
parts is transferred to the populations of the reduced density operators of the
parts of the computer register. The solution of the problem is reversibly and
nondeterministically produced under the simultaneous influence of the state
before measurement and the quantum principle. In the light of the present
notion of simultaneous computation, the quantum speed up turns out to be
"precognition" of the solution, namely the reduction of the initial ignorance
of the solution due to backdating, to before running the algorithm, a
time-symmetric part of the state vector reduction on the solution; as such, it
is bounded by state vector reduction through an entropic inequality. PACS
numbers: CARDINAL, 01.55.+b, 01.70.+w | There exists an increasing evidence supporting the picture of the PERSON
junction (JJ) as a "macroscopic quantum system". On the other hand the
interpretation of experimental data strongly depends on the assumed theoretical
model. We analyse the possible states of a NORP pair box ("charge qubit") for
the CARDINAL types of models: CARDINAL-mode ORG model with its large $MONEY
approximations and the many-body description within the mean-field approximation
(Gross-Pitaevskii equation). While the ORDINAL class of models supports the
picture of JJ being a quantum subsystem of a single degree of freedom, the
ORDINAL approach yields an essentially classical structure of accessible quantum
states which, in particular, implies the absence of entanglement for CARDINAL
coupled JJ's. The arguments in favor of the mean-field theory are presented and
different experimental tests including a new proposal are briefly discussed. | 0 |
Belief integration methods are often aimed at deriving a single and
consistent knowledge base that retains as much as possible of the knowledge
bases to integrate. The rationale behind this approach is the minimal change
principle: the result of the integration process should differ as little as
possible from the knowledge bases to integrate. We show that this principle can
be reformulated in terms of a more general model of belief revision, based on
the assumption that inconsistency is due to the mistakes the knowledge bases
contain. Current belief revision strategies are based on a specific kind of
mistakes, which however does not include all possible ones. Some alternative
possibilities are discussed. | Merging beliefs requires the plausibility of the sources of the information
to be merged. They are typically assumed equally reliable in the absence of hints
indicating otherwise; yet, a recent line of research spun from the idea of
deriving this information from the revision process itself. In particular, the
history of previous revisions and previous merging examples provide information
for performing subsequent mergings.
Yet, no examples or previous revisions may be available. In spite of the
apparent lack of information, something can still be inferred by a
try-and-check approach: a relative reliability ordering is assumed, the merging
process is performed based on it, and the result is compared with the original
information. The outcome of this check may be incoherent with the initial
assumption, as when some of the information provided by a completely reliable
source is rejected. In such cases, the reliability ordering assumed in the
ORDINAL place can be excluded from consideration. The ORDINAL theorem of this
article proves that such a scenario is indeed possible. Other results are
obtained under various definition of reliability and merging. | 1 |
The computer revolution has been driven by a sustained increase of
computational speed of CARDINAL order of magnitude (a factor of CARDINAL)
DATE since DATE. In natural sciences this has led to a
continuous increase of the importance of computer simulations. Major enabling
techniques are PERSON (MCMC) and ORG (GPE)
simulations.
This article deals with the MCMC approach. ORDINAL, basic simulation techniques,
as well as methods for their statistical analysis, are reviewed. Afterwards the
focus is on generalized ensembles and biased updating, CARDINAL advanced techniques,
which are of relevance for simulations of biomolecules, or are expected to
become relevant with that respect. In particular we consider the multicanonical
ensemble and the replica exchange method (also known as parallel tempering or
method of multiple PERSON chains). | Evolution of a physical quantum state vector is described as governed by CARDINAL
distinct physical laws: PERSON, unitary time evolution and a
relativistically covariant reduction process. In previous literature, it was
concluded that a relativistically satisfactory version of the collapse
postulate is in contradiction with physical measurements of a non-local state
history. Here it is shown that such measurements are excluded when reduction is
formulated as a physical process and the measurement devices are included as
part of the state vector. | 1 |
In DATE we witness a dramatic growth of research focused on
semantic image understanding. Indeed, without understanding image content
successful accomplishment of any image-processing task is simply inconceivable. Up
to the recent times, the ultimate need for such understanding has been met by
the knowledge that a domain expert or a vision system supervisor have
contributed to every image-processing application. The advent of the Internet
has drastically changed this situation. Internet sources of visual information
are diffused and dispersed over the whole Web, so the duty of information
content discovery and evaluation must be relegated now to an image
understanding agent (a machine or a computer program) capable to perform image
content assessment at a remote image location. Development of Content Based
Image Retrieval (ORG) techniques was a right move in a right direction,
launched DATE. Unfortunately, very little progress has been made
since then. The reason for this can be seen in a range of long-lasting
misconceptions that ORG designers continue to adhere to. I hope my
arguments will help them to change their minds. | In DATE, we witness a paradigm shift in our nature
studies - from a data-processing based computational approach to an
information-processing based cognitive approach. The process is restricted and
often misguided by the lack of a clear understanding about what information is
and how it should be treated in research applications (in general) and in
biological studies (in particular). The paper intends to provide some remedies
for this bizarre situation. | 1 |
We prove that extreme ORG initial data set is a unique absolute minimum of
the total mass in a (physically relevant) class of vacuum, maximal,
asymptotically flat, axisymmetric data for PERSON equations with fixed
angular momentum. These data represent non-stationary, axially symmetric, black
holes. As a consequence, we obtain that any data in this class satisfy the
inequality $\sqrt{J} \leq m$, where $m$ and $MONEY are the total mass and angular
momentum of the spacetime. | For a given asymptotically flat initial data set for PERSON equations a new
geometric invariant is constructed. This invariant measures the departure of the
data set from the stationary regime; it vanishes if and only if the data is
stationary. In vacuum, it can be interpreted as a measure of the total amount
of radiation contained in the data. | 1 |
Most existing approaches in ORG (CRS) focus on
recommending relevant items to users taking into account contextual
information, such as time, location, or social aspects. However, few of them
have considered the problem of user's content dynamicity. We introduce in this
paper an algorithm that tackles the user's content dynamicity by modeling the
CRS as a contextual bandit algorithm and by including a situation clustering
algorithm to improve the precision of the ORG. Within a deliberately designed
offline simulation framework, we conduct evaluations with real online event log
data. The experimental results and detailed analysis reveal several important
discoveries in context aware recommender system. | We introduce in this paper an algorithm named PERSON that
tackles the dynamicity of the user's content. It is based on dynamic
exploration/exploitation tradeoff and can adaptively balance the CARDINAL aspects by
deciding which situation is most relevant for exploration or exploitation. The
experimental results demonstrate that our algorithm outperforms surveyed
algorithms. | 1 |
The article describes the proposition and application of a local search
metaheuristic for multi-objective optimization problems. It is based on CARDINAL
main principles of heuristic search: intensification through variable
neighborhoods, and diversification through perturbations and successive
iterations in favorable regions of the search space. The concept is
successfully tested on permutation flow shop scheduling problems under multiple
objectives and compared to other local search approaches. While the obtained
results are encouraging in terms of their quality, another positive attribute
of the approach is its simplicity, as it requires the setting of only very
few parameters. | Logitboost is an influential boosting algorithm for classification. In this
paper, we develop robust logitboost to provide an explicit formulation of
tree-split criterion for building weak learners (regression trees) for
logitboost. This formulation leads to a numerically stable implementation of
logitboost. We then propose ORG-logitboost for multi-class classification, by
combining robust logitboost with the prior work of ORG-boost. Previously,
ORG-boost was implemented as ORG using the mart algorithm. Our extensive
experiments on multi-class classification compare CARDINAL algorithms: mart,
abcmart, (robust) logitboost, and ORG-logitboost, and demonstrate the
superiority of ORG-logitboost. Comparisons with other learning methods
including ORG and deep learning are also available through prior publications. | 0 |
"WORK_OF_ART" is the name of a model of cellular development that,
coupled with an evolutionary technique, becomes an evo-devo (or "artificial
embryology", or "computational development") method to generate CARDINAL or CARDINAL sets
of artificial cells arbitrarily shaped. 'In silico' experiments have proved the
effectiveness of the method in devo-evolving any kind of shape, of any
complexity (in terms of number of cells, number of colours, etc.); shape
complexity being a metaphor for organismal complexity, such simulations established
its potential to generate the complexity typical of biological systems.
Moreover, it has also been shown how the underlying model of cellular
development is able to produce the artificial version of key biological
phenomena such as embryogenesis, the presence of "junk DNA", the phenomenon of
ageing and the process of carcinogenesis. The objective of this document is not
to provide new material (most of the material presented here has already been
published elsewhere): rather, it is to provide all details that, for lack of
space, could not be provided in the published papers and in particular to give
all technical details necessary to re-implement the method. | Transposable elements are DNA sequences that can move around to different
positions in the genome. During this process, they can cause mutations, and
lead to an increase in genome size. Despite representing a large genomic
fraction, transposable elements have no clear biological function. This work
builds upon a previous model, to propose a new concept of natural selection
which combines NORP and NORP elements. Transposable elements are
hypothesised to be the vector of a flow of genetic information from soma to
germline, that shapes gene regulatory regions across the genome. The paper
introduces the concept, presents and discusses the body of evidence in support
of this hypothesis, and suggests an experiment to test it. | 1 |
Fluctuations on de Sitter solution of FAC field equations are
obtained in terms of the primordial matter density fluctuations and
spin-torsion density fluctuations obtained from ORG data. The
Einstein-de Sitter solution is shown to be unstable even in the absence of
torsion. The spin-torsion density fluctuation needed to generate a deflationary phase
is computed from the ORG data. | Any regular NORP probability distribution that can be represented by an
ORG chain graph (CG) can be expressed as a system of ORG equations with
correlated errors whose structure depends on the CG. However, the ORG represents
the errors implicitly, as no nodes in the ORG correspond to the errors. We
propose in this paper to add some deterministic nodes to the ORG in order to
represent the errors explicitly. We call the result an ORG CG. We will show
that, as desired, every AMP CG is PERSON equivalent to its corresponding ORG
ORG under marginalization of the error nodes. We will also show that every ORG
ORG under marginalization of the error nodes is PERSON equivalent to some PERSON
under marginalization of the error nodes, and that the latter is PERSON
equivalent to some directed and acyclic graph (ORG) under marginalization of
the error nodes and conditioning on some selection nodes. This is important
because it implies that the independence model represented by an AMP CG can be
accounted for by some data generating process that is partially observed and
has selection bias. Finally, we will show that ORG CGs are closed under
marginalization. This is a desirable feature because it guarantees parsimonious
models under marginalization. | 0 |
We prove new estimates for the volume of a NORP manifold and show
especially that cosmological spacetimes with crushing singularities have finite
volume. | When agents are acting together, they may need a simple mechanism to decide
on joint actions. CARDINAL possibility is to have the agents express their
preferences in the form of a ballot and use a voting rule to decide the winning
action(s). Unfortunately, agents may try to manipulate such an election by
misreporting their preferences. Fortunately, it has been shown that it is
ORG-hard to compute how to manipulate a number of different voting rules.
However, ORG-hardness only bounds the worst-case complexity. Recent theoretical
results suggest that manipulation may often be easy in practice. To address
this issue, I suggest studying empirically if computational complexity is in
practice a barrier to manipulation. The basic tool used in my investigations is
the identification of computational "phase transitions". Such an approach has
been fruitful in identifying hard instances of propositional satisfiability and
other ORG-hard problems. I show that phase transition behaviour gives insight
into the hardness of manipulating voting rules, casting doubt on whether
computational complexity is indeed any sort of barrier. Finally, I look at the
problem of computing manipulation of other, related problems like stable
marriage and tournament problems. | 0 |
I postulate that human or other intelligent agents function or should
function as follows. They store all sensory observations as they come - the
data is holy. At any time, given some agent's current coding capabilities, part
of the data is compressible by a short and hopefully fast program / description
/ explanation / world model. In the agent's subjective eyes, such data is more
regular and more "beautiful" than other data. It is well-known that knowledge
of regularity and repeatability may improve the agent's ability to plan actions
leading to external rewards. In absence of such rewards, however, known beauty
is boring. Then "interestingness" becomes the ORDINAL derivative of subjective
beauty: as the learning agent improves its compression algorithm, formerly
apparently random data parts become subjectively more regular and beautiful.
Such progress in compressibility is measured and maximized by the curiosity
drive: create action sequences that extend the observation history and yield
previously unknown / unpredictable but quickly learnable algorithmic
regularity. We discuss how all of the above can be naturally implemented on
computers, through an extension of passive unsupervised learning to the case of
active data selection: we reward a general reinforcement learner (with access
to the adaptive compressor) for actions that improve the subjective
compressibility of the growing data. An unusually large breakthrough in
compressibility deserves the name "discovery". The "creativity" of artists,
dancers, musicians, pure mathematicians can be viewed as a by-product of this
principle. Several qualitative examples support this hypothesis. | We discuss the problem of gauge invariance of the vector meson
photoproduction at small $x$ within the CARDINAL-gluon exchange model. It is found
that the gauge invariance is fulfilled if one includes the graphs with higher
Fock states in the meson wave function. Obtained results are used to estimate
the amplitudes with longitudinal and transverse photon and vector meson
polarization. | 0 |
The recently proposed "generalized PERSON" (ORG) kernel can be efficiently
linearized, with direct applications in large-scale statistical learning and
fast near neighbor search. The linearized ORG kernel was extensively compared
with the linearized radial basis function (RBF) kernel. On a large number of
classification tasks, the tuning-free ORG kernel performs (surprisingly) well
compared to the best-tuned ORG kernel. Nevertheless, one would naturally expect
that the ORG kernel ought to be further improved if we introduce tuning
parameters.
In this paper, we study CARDINAL simple constructions of tunable ORG kernels:
(i) the exponentiated-GMM (or eGMM) kernel, (ii) the powered-GMM (or pGMM)
kernel, and (iii) the exponentiated-powered-GMM (epGMM) kernel. The pGMM kernel
can still be efficiently linearized by modifying the original hashing procedure
for the ORG kernel. On CARDINAL publicly available classification datasets, we
verify that the proposed tunable ORG kernels typically improve over the
original ORG kernel. On some datasets, the improvements can be astonishingly
significant.
For example, on CARDINAL popular datasets which were used for testing deep learning
algorithms and tree methods, our experiments show that the proposed tunable ORG
kernels are strong competitors to trees and deep nets. The previous studies
developed tree methods including "ORG-robust-logitboost" and demonstrated the
excellent performance on those CARDINAL datasets (and other datasets), by
establishing the ORDINAL-order tree-split formula and new derivatives for
multi-class logistic loss. Compared to tree methods like
"ORG-robust-logitboost" (which are slow and need substantial model sizes), the
tunable ORG kernels produce largely comparable results. | We develop the concept of ORG (PERSON Class PERSON) for
multi-class classification and present ORG, a concrete implementation of
ORG. The original MART (Multiple Additive Regression Trees) algorithm has
been very successful in large-scale applications. For binary classification,
ORG recovers MART. For multi-class classification, ORG considerably
improves MART, as evaluated on several public data sets. | 1 |
The present article is a brief informal survey of computability logic --- the
game-semantically conceived formal theory of computational resources and tasks.
This relatively young nonclassical logic is a conservative extension of
classical ORDINAL order logic but is much more expressive than the latter,
yielding a wide range of new potential application areas. In a reasonable (even
if not strict) sense the same holds for intuitionistic and ORG logics, which
allows us to say that ORG reconciles and unifies the CARDINAL traditions of
logical thought (and beyond) on the basis of its natural and "universal" game
semantics. A comprehensive online survey of the subject can be found at
http://www.csc.villanova.edu/~japaridz/CL/. | Computability logic (CL) (see ORG) is a
research program for redeveloping logic as a formal theory of computability, as
opposed to the formal theory of truth which it has more traditionally been.
PERSON in ORG stand for interactive computational problems, seen as games
between a machine and its environment; logical operators represent operations
on such entities; and "truth" is understood as existence of an effective
solution. The formalism of ORG is open-ended, and may undergo series of
extensions as the studies of the subject advance. So far three -- parallel,
sequential and choice -- sorts of conjunction and disjunction have been
studied. The present paper adds CARDINAL more natural kind to this collection,
termed toggling. The toggling operations can be characterized as lenient
versions of choice operations where choices are retractable, being allowed to
be reconsidered any finite number of times. This way, they model
trial-and-error style decision steps in interactive computation. The main
technical result of this paper is constructing a sound and complete
axiomatization for the propositional fragment of computability logic whose
vocabulary, together with negation, includes all CARDINAL -- parallel, toggling,
sequential and choice -- kinds of conjunction and disjunction. Along with
toggling conjunction and disjunction, the paper also introduces the toggling
versions of quantifiers and recurrence operations. | 1 |
Review of: PERSON and PERSON, Geometric Data Analysis, From
ORG, LOC, Dordrecht, DATE,
PERSON. | A review of some of the author's results in the area of inverse scattering is
given. The following topics are discussed: CARDINAL) Property $MONEY and applications, CARDINAL)
Stable inversion of fixed-energy 3D scattering data and its error estimate, CARDINAL)
Inverse scattering with ``incomplete'' data, CARDINAL) Inverse scattering for
inhomogeneous Schr\"odinger equation, CARDINAL) PERSON's inverse scattering method, CARDINAL)
Invertibility of the steps in ORG, PERSON, and PERSON inversion
methods, 7) The Newton-Sabatier and PERSON procedures are not inversion
methods, WORK_OF_ART: existence, location, perturbation theory, 9) Born
inversion as an ill-posed problem, CARDINAL) Inverse obstacle scattering with
fixed-frequency data, CARDINAL) Inverse scattering with data at a fixed energy and a
fixed incident direction, CARDINAL) Creating materials with a desired refraction
coefficient and wave-focusing properties. | 0 |
We study the problem of estimating the coefficients in linear ordinary
differential equations (ODE's) with a diverging number of variables when the
solutions are observed with noise. The solution trajectories are ORDINAL smoothed
with local polynomial regression and the coefficients are estimated with
nonconcave penalty proposed by \cite{fan01}. Under some regularity and sparsity
conditions, we show that the procedure correctly identifies nonzero coefficients
with probability converging to one and the estimators for nonzero coefficients
have the same asymptotic normal distribution as they would have when the CARDINAL
coefficients are known and the same CARDINAL-step procedure is used. Our asymptotic
results are valid under the misspecified case where linear ODE's are only used
as an approximation to nonlinear ORG's, and the estimates will converge to the
coefficients of the best approximating linear system. From our results, when
the solution trajectories of the ORG's are sufficiently smooth, the parametric
$\sqrt{n}$ rate is achieved even though a nonparametric regression estimator is
used in the ORDINAL step of the procedure. The performance of the CARDINAL-step
procedure is illustrated by a simulation study as well as an application to
yeast cell-cycle data. | We study light hadron leptoproduction at small $x$. PERSON production
is analysed in terms of generalized gluon distributions with taking into
account the transverse quark motion. Within a CARDINAL-gluon model the double spin
asymmetries for longitudinally polarized leptons and transversely polarized
protons in the diffractive $Q \bar Q$ production are investigated. The
predicted $A_{lT}$ asymmetry is large and can be used to obtain information on
the polarized gluon distributions in the proton. | 0 |
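The two-step ODE estimation procedure described in the first abstract above (smooth the noisy trajectories, then fit a penalized linear model to the estimated derivatives) can be illustrated with a toy sketch. All constants and data below are invented for the example, and a simple hard threshold stands in for the nonconcave penalty of \cite{fan01}:

```python
import numpy as np

rng = np.random.default_rng(0)
A = np.array([[-1.0, 0.0], [0.5, -0.5]])        # sparse true coefficient matrix
t = np.linspace(0, 5, 201)
x = np.zeros((len(t), 2)); x[0] = [1.0, 0.0]
for k in range(len(t) - 1):                      # crude Euler integration
    x[k + 1] = x[k] + (t[1] - t[0]) * A @ x[k]
y = x + 0.01 * rng.standard_normal(x.shape)      # noisy observations

def local_poly(t, y, h=10, deg=2):
    """Step 1, local polynomial smoothing: fit a small polynomial in a window
    around each point and read off the fitted value and first derivative."""
    vals, derivs = [], []
    for k in range(len(t)):
        lo, hi = max(0, k - h), min(len(t), k + h + 1)
        c = np.polyfit(t[lo:hi] - t[k], y[lo:hi], deg)
        vals.append(c[-1]); derivs.append(c[-2])
    return np.array(vals), np.array(derivs)

sm = [local_poly(t, y[:, j]) for j in range(2)]
xs = np.column_stack([s[0] for s in sm])         # smoothed trajectories
ds = np.column_stack([s[1] for s in sm])         # estimated derivatives

# Step 2: penalized least squares; hard thresholding stands in for SCAD here.
A_hat = np.linalg.lstsq(xs, ds, rcond=None)[0].T
A_hat[np.abs(A_hat) < 0.05] = 0.0                # zero out small coefficients
```

On this toy system the thresholded estimate recovers the sparsity pattern of `A`, which is the identification property the abstract proves under regularity and sparsity conditions.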
It is demonstrated how linear computational time and storage efficient
approaches can be adopted when analyzing very large data sets. More
importantly, interpretation is aided and furthermore, basic processing is
easily supported. Such basic processing can be the use of supplementary, i.e.
contextual, elements, or particular associations. Furthermore, pixellated grid
cell contents can be utilized as a basic form of imposed clustering. For a
given resolution level, here related to an associated m-adic ($m$ here is a
non-prime integer) or p-adic ($p$ is prime) number system encoding, such
pixellated mapping results in partitioning. The association of a range of
m-adic and p-adic representations leads naturally to an imposed hierarchical
clustering, with partition levels corresponding to the m-adic-based and
p-adic-based representations and displays. In these clustering embedding and
imposed cluster structures, some analytical visualization and search
applications are described. | This paper describes information flow within logical environments. The theory
of information flow, the logic of distributed systems, was first defined by
GPE and ORG.
DATE). Logical environments are a semantic-oriented version of institutions.
The theory of institutions, which was initiated by PERSON and PERSON
(Institutions: Abstract Model Theory for Specification and Programming. DATE),
is abstract model theory. Information flow is the flow of information in
channels over distributed systems. The semantic integration of distributed
systems, be they ontologies, databases or other information resources, can be
defined in terms of the channel theory of information flow. As originally
defined, the theory of information flow uses only a specific logical
environment in order to discuss information flow. This paper shows how
information flow can be defined in an arbitrary logical environment. | 0 |
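The m-adic pixellated encoding described in the first abstract of the pair above partitions the data at each resolution level, and coarser cell codes are prefixes of finer ones, which is exactly the imposed hierarchical clustering. A minimal sketch (the interleaved-digit cell addressing and the sample points are illustrative assumptions, not the paper's implementation):

```python
def cell_code(x, y, m, depth):
    """m-adic cell address of a point in [0,1)^2: base-m digits of x and y,
    paired level by level, down to the given depth (deeper -> finer cells)."""
    code = []
    for _ in range(depth):
        x, y = x * m, y * m
        dx, dy = int(x), int(y)
        code.append((dx, dy))
        x, y = x - dx, y - dy       # keep the fractional remainder
    return tuple(code)

pts = [(0.12, 0.91), (0.13, 0.92), (0.74, 0.30)]
m = 4
# Level-1 cells already separate the two nearby points from the distant one:
level1 = {p: cell_code(*p, m, 1) for p in pts}
assert level1[pts[0]] == level1[pts[1]] != level1[pts[2]]
# Deeper codes refine shallower ones: a depth-2 code's prefix is the depth-1 cell.
for p in pts:
    assert cell_code(*p, m, 2)[:1] == cell_code(*p, m, 1)
```

Grouping points by code prefix at increasing depths yields the nested partitions that the abstract reads as an imposed hierarchy.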
A plethora of natural, artificial and social complex systems exists which
violate the basic hypothesis (e.g., ergodicity) of ORG (GPE)
statistical mechanics. Many of such cases can be satisfactorily handled by
introducing nonadditive entropic functionals, such as $S_q = k\,\frac{1-\sum_{i=1}^W p_i^q}{q-1} \; \Bigl(q \in {\cal R};\ k>0,\ \sum_{i=1}^W
p_i = 1 \Bigr)$, with $S_1=S_{BG}\equiv -k\sum_{i=1}^W p_i \ln p_i$. Each class
of such systems can be characterized by a set of values $\{q\}$, directly
corresponding to its various physical/dynamical/geometrical properties. A most
important subset is usually referred to as the $q$-triplet, namely
$(q_{sensitivity}, q_{relaxation}, q_{stationarity})$, defined in the body
of this paper. In the GPE limit we have
$q_{sensitivity}=q_{relaxation}=q_{stationarity}=1$. For a given class of
complex systems, the set $\{q\}$ contains only a few independent values of $q$,
all the others being functions of those few. An illustration of this structure
was given in DATE [Tsallis, ORG and PERSON, GPE. Natl. Acad. Sc. USA {\bf
CARDINAL}, DATE; TGS]. This illustration enabled a satisfactory analysis of the
PRODUCT data on the solar wind. But the general form of these structures
still is an open question. This is so, for instance, for the challenging
$q$-triplet associated with the edge of chaos of the logistic map. We introduce
here a transformation which sensibly generalizes the ORG one, and which might
constitute an important step towards the general solution. | The so called $q$-triplets were conjectured in DATE and then found in nature
in DATE. A relevant further step was achieved in DATE when the possibility was
advanced that they could reflect an entire infinite algebra based on
combinations of the self-dual relations $q \to 2-q$ ({\it additive duality})
and $q \to 1/q$ ({\it multiplicative duality}). The entire algebra collapses
into the single fixed point $q=1$, corresponding to FAC entropy
and statistical mechanics. For $q \ne 1$, an infinite set of indices $q$ appears, corresponding in principle to an infinite number of physical
properties of a given complex system describable in terms of the so called
$q$-statistics. The basic idea that is put forward is that, for a given
universality class of systems, a small number (typically CARDINAL or CARDINAL) of
independent $q$ indices exist, the infinitely many others being obtained from these
few ones by simply using the relations of the algebra. The $q$-triplets appear
to constitute a few central elements of the algebra. During DATE, an
impressive amount of $q$-triplets have been exhibited in analytical,
computational, experimental and observational results in natural, artificial
and social systems. Some of them do satisfy the available algebra constructed
solely with the additive and multiplicative dualities, but some others seem to
violate it. In the present work we generalize those CARDINAL dualities with the hope
that a wider set of systems can be handled within. The basis of the
generalization is given by the {\it self-dual} relation $q \to q_a(q) \equiv
\frac{(a+2) -aq}{a-(a-2)q} \,\, (a \in {\cal R})$. We verify that $q_a(1)=1$,
and that $q_2(q)=2-q$ and $q_0(q)=1/q$. To physically motivate this
generalization, we briefly review illustrative applications of $q$-statistics,
in order to exhibit possible candidates where the present generalized algebras
could be useful. | 1 |
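The closing identities of the abstract above can be checked numerically. The sketch below assumes only the quoted definition of $q_a$ and verifies, in exact rational arithmetic, that $q_2$ and $q_0$ reduce to the additive and multiplicative dualities, that $q=1$ is the common fixed point, and that each $q_a$ is an involution (self-dual):

```python
from fractions import Fraction

def q_a(a, q):
    # Generalized self-dual map: q_a(q) = ((a+2) - a*q) / (a - (a-2)*q)
    return (Fraction(a + 2) - a * q) / (a - (a - 2) * q)

qs = [Fraction(n, 10) for n in range(1, 20)]
assert all(q_a(2, q) == 2 - q for q in qs)    # additive duality q -> 2 - q
assert all(q_a(0, q) == 1 / q for q in qs)    # multiplicative duality q -> 1/q
for a in range(-5, 6):
    assert q_a(a, Fraction(1)) == 1                            # fixed point q = 1
    assert q_a(a, q_a(a, Fraction(7, 10))) == Fraction(7, 10)  # involution
```

The involution property is what makes each $q_a$ "self-dual": applying it twice returns the original index, just as $2-(2-q)=q$ and $1/(1/q)=q$ do for the two special cases.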
The series solution of the behavior of a finite number of physical bodies and
PERSON's PERSON number share quasi-algorithmic expressions; yet both lack a
computable radius of convergence. | In an independence model, the triplets that represent conditional
independences between singletons are called elementary. It is known that the
elementary triplets represent the independence model unambiguously under some
conditions. In this paper, we show how this representation helps performing
some operations with independence models, such as finding the dominant triplets
or a minimal independence map of an independence model, or computing the union
or intersection of a pair of independence models, or performing causal
reasoning. For the latter, we rephrase in terms of conditional independences
some of LOC's results for computing causal effects. | 0 |
ORG) has recently become a real formal science: the
new millennium brought the ORDINAL mathematically sound, asymptotically optimal,
universal problem solvers, providing a new, rigorous foundation for the
previously largely heuristic field of General PERSON and embedded agents. At the
same time there has been rapid progress in practical methods for learning true
sequence-processing programs, as opposed to traditional methods limited to
stationary pattern association. Here we will briefly review some of the new
results, and speculate about future developments, pointing out that the time
intervals between the most notable events in DATE or CARDINAL^CARDINAL lifetimes
of human history have sped up exponentially, apparently converging to CARDINAL
within DATE. Or is this impression just a by-product of the way
humans allocate memory space to past events? | We present the ORDINAL class of mathematically rigorous, general, fully
self-referential, self-improving, optimally efficient problem solvers. Inspired
by PERSON's celebrated self-referential formulas (DATE), such a problem
solver rewrites any part of its own code as soon as it has found a proof that
the rewrite is useful, where the problem-dependent utility function and the
hardware and the entire initial code are described by axioms encoded in an
initial proof searcher which is also part of the initial code. The searcher
systematically and efficiently tests computable proof techniques (programs
whose outputs are proofs) until it finds a provably useful, computable
self-rewrite. We show that such a self-rewrite is globally optimal - no local
PRODUCT! - since the code first had to prove that it is not useful to continue
the proof search for alternative self-rewrites. Unlike previous
non-self-referential methods based on hardwired proof searchers, ours not only
boasts an optimal order of complexity but can optimally reduce any slowdowns
hidden by the O()-notation, provided the utility of such speed-ups is provable
at all. | 1 |
The emerging Web of Data utilizes the web infrastructure to represent and
interrelate data. The foundational standards of the Web of Data include the
WORK_OF_ART) and ORG (ORG).
URIs are used to identify resources and ORG is used to relate resources. While
ORG has been posited as a logic language designed specifically for knowledge
representation and reasoning, it is more generally useful if it can
conveniently support other models of computing. In order to realize the Web of
ORG as a general-purpose medium for storing and processing the world's data,
it is necessary to separate ORG from its logic language legacy and frame it
simply as a data model. Moreover, there is significant advantage in seeing the
NORP Web as a particular interpretation of the Web of Data that is focused
specifically on knowledge representation and reasoning. By doing so, other
interpretations of the Web of Data are exposed that realize ORG in different
capacities and in support of different computing models. | This paper argues that the operations of a 'Universal Turing Machine' (UTM)
and equivalent mechanisms such as ORG (ORG) - which are
widely accepted as definitions of the concept of `computing' - may be
interpreted as *information compression by multiple alignment, unification and
search* (ICMAUS).
The motivation for this interpretation is that it suggests ways in which the
UTM/PCS model may be augmented in a proposed new computing system designed to
exploit the ICMAUS principles as fully as possible. The provision of a
relatively sophisticated search mechanism in the proposed 'SP' system appears
to open the door to the integration and simplification of a range of functions
including unsupervised inductive learning, best-match pattern recognition and
information retrieval, probabilistic reasoning, planning and problem solving,
and others. Detailed consideration of how the ICMAUS principles may be applied
to these functions is outside the scope of this article but relevant sources
are cited in this article. | 0 |
This paper presents some properties of unary coding of significance for
biological learning and instantaneously trained neural networks. | Sparse methods for supervised learning aim at finding good linear predictors
from as few variables as possible, i.e., with small cardinality of their
supports. This combinatorial selection problem is often turned into a convex
optimization problem by replacing the cardinality function by its convex
envelope (tightest convex lower bound), in this case the CARDINAL-norm. In this
paper, we investigate more general set-functions than the cardinality, that may
incorporate prior knowledge or structural constraints which are common in many
applications: namely, we show that for nondecreasing submodular set-functions,
the corresponding convex envelope can be obtained from its ORG extension, a
common tool in submodular analysis. This defines a family of polyhedral norms,
for which we provide generic algorithmic tools (subgradients and proximal
operators) and theoretical results (conditions for support recovery or
high-dimensional inference). By selecting specific submodular functions, we can
give a new interpretation to known norms, such as those based on
rank-statistics or grouped norms with potentially overlapping groups; we also
define new norms, in particular ones that can be used as non-factorial priors
for supervised learning. | 0 |
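The convex-envelope construction in the abstract above rests on the Lovász extension of a submodular set-function. A minimal sketch of the greedy (Edmonds) formula follows, with two standard nondecreasing submodular examples recovering familiar norms; the function names and test vector are illustrative:

```python
def lovasz_extension(F, w):
    # Greedy formula: visit coordinates in decreasing order of w and charge
    # each one the marginal gain of F on the growing prefix set.
    order = sorted(range(len(w)), key=lambda i: -w[i])
    total, prefix, prev = 0.0, [], 0.0   # assumes F(empty set) = 0
    for i in order:
        prefix.append(i)
        cur = F(frozenset(prefix))
        total += w[i] * (cur - prev)
        prev = cur
    return total

card = lambda S: len(S)          # cardinality       -> envelope is the l1 norm
ind = lambda S: min(len(S), 1)   # 1 if S nonempty   -> envelope is the l-inf norm
aw = [abs(x) for x in (0.5, -2.0, 1.5)]
assert lovasz_extension(card, aw) == sum(aw)   # l1 norm of the vector
assert lovasz_extension(ind, aw) == max(aw)    # l-infinity norm of the vector
```

Evaluating the extension on $|w|$ is how the abstract's family of polyhedral norms is obtained; choosing other nondecreasing submodular $F$ yields the grouped and rank-based norms it mentions.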
Submodular functions are relevant to machine learning for CARDINAL
reasons: (CARDINAL) some problems may be expressed directly as the optimization of
submodular functions and (CARDINAL) the Lovász extension of submodular functions
provides a useful set of regularization functions for supervised and
unsupervised learning. In this monograph, we present the theory of submodular
functions from a convex analysis perspective, presenting tight links between
certain polyhedra, combinatorial optimization and convex optimization problems.
In particular, we show how submodular function minimization is equivalent to
solving a wide variety of convex optimization problems. This allows the
derivation of new efficient algorithms for approximate and exact submodular
function minimization with theoretical guarantees and good practical
performance. By listing many examples of submodular functions, we review
various applications to machine learning, such as clustering, experimental
design, sensor placement, graphical model structure learning or subset
selection, as well as a family of structured sparsity-inducing norms that can
be derived and used from submodular functions. | A unary constraint (on the NORP domain) is a function from {CARDINAL} to the
set of real numbers. A free use of auxiliary unary constraints given besides
input instances has proven to be useful in establishing a complete
classification of the computational complexity of approximately solving
weighted counting NORP constraint satisfaction problems (or #CSPs). In
particular, CARDINAL special constant unary constraints are a key to an arity
reduction of arbitrary constraints, sufficient for the desired classification.
In an exact counting model, both constant unary constraints are always assumed
to be available since they can be eliminated efficiently using an arbitrary
nonempty set of constraints. In contrast, we demonstrate in an approximate
counting model, that CARDINAL of them is efficiently approximated and thus
eliminated approximately by a nonempty constraint set. This fact directly leads
to an efficient construction of polynomial-time randomized
approximation-preserving Turing reductions (or ORG-reductions) from #CSPs with
designated constraints to any given #CSPs composed of symmetric real-valued
constraints of arbitrary arities even in the presence of arbitrary extra unary
constraints. | 0 |
There has been a remarkable increase in work at the interface of computer
science and game theory in DATE. In this article I survey some of
the main themes of work in the area, with a focus on the work in computer
science. Given the length constraints, I make no attempt at being
comprehensive, especially since other surveys are also available, and a
comprehensive survey book will appear shortly. | A general notion of algebraic conditional plausibility measures is defined.
Probability measures, ranking functions, possibility measures, and (under the
appropriate definitions) sets of probability measures can all be viewed as
defining algebraic conditional plausibility measures. It is shown that
algebraic conditional plausibility measures can be represented using NORP
networks. | 1 |
Keyphrases are useful for a variety of purposes, including summarizing,
indexing, labeling, categorizing, clustering, highlighting, browsing, and
searching. The task of automatic keyphrase extraction is to select keyphrases
from within the text of a given document. Automatic keyphrase extraction makes
it feasible to generate keyphrases for the huge number of documents that do not
have manually assigned keyphrases. Good performance on this task has been
obtained by approaching it as a supervised learning problem. An input document
is treated as a set of candidate phrases that must be classified as either
keyphrases or non-keyphrases. To classify a candidate phrase as a keyphrase,
the most important features (attributes) appear to be the frequency and
location of the candidate phrase in the document. Recent work has demonstrated
that it is also useful to know the frequency of the candidate phrase as a
manually assigned keyphrase for other documents in the same domain as the given
document (e.g., the domain of computer science). Unfortunately, this
keyphrase-frequency feature is domain-specific (the learning process must be
repeated for each new domain) and training-intensive (good performance requires
a relatively large number of training documents in the given domain, with
manually assigned keyphrases). The aim of the work described here is to remove
these limitations. In this paper, I introduce new features that are derived by
mining lexical knowledge from a very large collection of unlabeled data,
consisting of CARDINAL Web pages without manually assigned
keyphrases. I present experiments that show that the new features result in
improved keyphrase extraction, although they are neither domain-specific nor
training-intensive. | An inductive learning algorithm takes a set of data as input and generates a
hypothesis as output. A set of data is typically consistent with an infinite
number of hypotheses; therefore, there must be factors other than the data that
determine the output of the learning algorithm. In machine learning, these
other factors are called the bias of the learner. Classical learning algorithms
have a fixed bias, implicit in their design. Recently developed learning
algorithms dynamically adjust their bias as they search for a hypothesis.
Algorithms that shift bias in this manner are not as well understood as
classical algorithms. In this paper, we show that the PERSON effect has
implications for the design and analysis of bias shifting algorithms. The
PERSON effect was proposed in DATE, to explain how phenomena that might appear
to require NORP evolution (inheritance of acquired characteristics) can
arise from purely NORP evolution. GPE and ORG presented a
computational model of the PERSON effect in DATE. We explore a variation on
their model, which we constructed explicitly to illustrate the lessons that the
PERSON effect has for research in bias shifting algorithms. The main lesson is
that it appears that a good strategy for shift of bias in a learning algorithm
is to begin with a weak bias and gradually shift to a strong bias. | 1 |
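The two features singled out in the keyphrase-extraction abstract above — the frequency and the location of a candidate phrase in the document — are straightforward to compute. A toy sketch (the tokenization rule and the tiny example document are invented for illustration, not the actual extractor):

```python
import re

def candidate_features(document, max_len=3):
    # For every candidate n-gram (n <= max_len): frequency in the document and
    # first-occurrence position as a fraction of the document length.
    words = re.findall(r"[a-z]+", document.lower())
    feats = {}
    for n in range(1, max_len + 1):
        for i in range(len(words) - n + 1):
            phrase = " ".join(words[i:i + n])
            if phrase not in feats:
                feats[phrase] = {"freq": 0, "first_pos": i / max(1, len(words))}
            feats[phrase]["freq"] += 1
    return feats

doc = "Keyphrase extraction selects phrases. Keyphrase extraction is supervised."
f = candidate_features(doc)
assert f["keyphrase extraction"]["freq"] == 2
assert f["keyphrase extraction"]["first_pos"] == 0.0
```

Feature vectors of this kind are what the supervised classifier in the abstract consumes; the domain-independent lexical features proposed there would be appended to the same representation.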
We prove that for any vacuum, maximal, asymptotically flat, axisymmetric
initial data for PERSON equations close to extreme ORG data, the inequality
$\sqrt{J} \leq m$ is satisfied, where $m$ and $MONEY are the total mass and
angular momentum of the data. The proof consists in showing that extreme ORG
is a local minimum of the mass. | The aim of this chapter is to present an introduction and also an overview of
some of the most relevant results concerning positivity energy theorems in
General PERSON. These theorems provide the answer to a long standing
problem that has been proved remarkably difficult to solve. They constitute CARDINAL
of the major results in classical General Relativity and they uncover a deep
self-consistence of the theory. | 1 |
This paper deals with a model of cellular growth called "WORK_OF_ART", whose key features are: i) distinction between "normal" and "driver"
cells; ii) presence in driver cells of an epigenetic memory, that holds the
position of the cell in the driver cell lineage tree and represents the source
of differentiation during development. In the ORDINAL part of the paper the model
is proved able to generate arbitrary target shapes of unmatched size and
variety by means of evo-devo techniques, thus being validated as a model of
embryogenesis and cellular differentiation. In the ORDINAL part of the paper it
is shown how the model can produce artificial counterparts for some key aspects
of multicellular biology, such as junk DNA, ageing and carcinogenesis. If
individually each of these topics has been the subject of intense investigation
and modelling effort, to our knowledge no single model or theory seeking to
cover all of them under a unified framework has been put forward as yet: this
work contains such a theory, which makes ORG a potential basis
for a project of ORG. | We report complexity results about redundancy of formulae in CARDINAL form. We
ORDINAL consider the problem of checking redundancy and show some algorithms that
are slightly better than the trivial one. We then analyze problems related to
finding irredundant equivalent subsets (I.E.S.) of a given set. The concept of
cyclicity proved to be relevant to the complexity of these problems. Some
results about LOC formulae are also shown. | 0 |
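Clause redundancy as discussed in the abstract above has a direct brute-force formulation: a clause is redundant iff it is entailed by the remaining clauses, and repeatedly dropping redundant clauses yields one irredundant equivalent subset (I.E.S.). A small sketch over integer literals — exponential in the number of variables, so only for illustration, not the paper's improved algorithms:

```python
from itertools import product

def entails(clauses, clause, variables):
    """Brute-force entailment check: every total assignment satisfying all
    clauses must also satisfy the clause. Literals are integers: v for a
    variable, -v for its negation."""
    for bits in product([False, True], repeat=len(variables)):
        assign = dict(zip(variables, bits))
        sat = lambda c: any(assign[abs(l)] == (l > 0) for l in c)
        if all(sat(c) for c in clauses) and not sat(clause):
            return False
    return True

def irredundant_subset(clauses, variables):
    """Greedily drop clauses entailed by the rest: one I.E.S. of the input."""
    kept = list(clauses)
    for c in list(kept):
        rest = [d for d in kept if d is not c]
        if entails(rest, c, variables):
            kept = rest
    return kept

# (x1) and (x1 or x2): the second clause is entailed by the first.
cnf = [[1], [1, 2]]
assert irredundant_subset(cnf, [1, 2]) == [[1]]
```

The greedy order matters in general — different orders can produce different I.E.S. of the same set — which is one reason the complexity of these problems is subtle.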
The commentators have brought a wealth of new perspectives to the question of
how culture evolves. Each of their diverse disciplines--ranging from psychology
to biology to anthropology to economics to engineering--has a valuable
contribution to make to our understanding of this complex, multifaceted topic.
Though the vast majority of their comments were supportive of my approach, it
is natural that a reply such as this focus on points where my views differ from
that of the commentators. ... I conclude by saying that I am grateful to the
commentators for their diverse perspectives and insights, their overall support
for the project, and provocative ideas for where to go from here. Clearly there
are many fascinating avenues to explore as we move forward on our quest to
understand how culture evolves. | We describe ORG gate response in a mesoscopic ring threaded by a magnetic
flux $MONEY The ring is attached symmetrically to CARDINAL semi-infinite
CARDINAL-dimensional metallic electrodes and CARDINAL gate voltages, viz, $V_a$ and
$PERSON, are applied in CARDINAL arm of the ring which are treated as the inputs of
the ORG gate. The calculations are based on the tight-binding model and the
PERSON's function method, which numerically compute the conductance-energy and
current-voltage characteristics as functions of the ring-to-electrode coupling
strength, magnetic flux and gate voltages. Our theoretical study shows that,
for a particular value of $\phi$ (MONEY) ($\phi_0=ch/e$, the elementary
flux-quantum), a high output current (CARDINAL) (in the logical sense) appears if both
the CARDINAL inputs to the gate are the same, while if one but not both inputs are
high (1), a low output current (0) results. It clearly exhibits the ORG gate
behavior and this aspect may be utilized in designing an electronic logic gate. | 0 |
This paper presents ORG (CNs) framework. CNs are used to
generalize neural and swarm architectures. Artificial neural networks, ant
colony optimization, particle swarm optimization, and realistic biological
models are used as examples of instantiations of CNs. The description of these
architectures as CNs allows their comparison. Their differences and
similarities allow the identification of properties that enable neural and
swarm architectures to perform complex computations and exhibit complex
cognitive abilities. In this context, the most relevant characteristics of CNs
are the existence of multiple dynamical and functional scales. The relationship
between multiple dynamical and functional scales with adaptation, cognition (of
brains and swarms) and computation is discussed. | It is shown that the gauge fixings of the LOC and the W fields and CARDINAL
scalars are via nonconserved axial-vector and charged vector currents of
massive fermions dynamically generated by fermion masses in WORK_OF_ART
interactions. The top quark mass provides enough strength for this chiral
symmetry breaking. The masses of the CARDINAL scalars are determined to be about
$MONEY GeV. These scalars have negative probability. They are the
nonperturbative solutions of the ORG. A new perturbation theory with dynamically
generated and fixed gauge fixings is constructed. The Faddeev-Popov procedure
is not invoked. | 0 |
A mobile ad hoc network (ORG) is a collection of autonomous nodes that
communicate with each other by forming a multi-hop radio network and
maintaining connections in a decentralized manner. Security remains a major
challenge for these networks due to their features of open medium, dynamically
changing topologies, reliance on cooperative algorithms, absence of centralized
monitoring points, and lack of clear lines of defense. Protecting the network
layer of a PRODUCT from malicious attacks is an important and challenging
security issue, since most of the routing protocols for MANETs are vulnerable
to various types of attacks. Ad hoc on-demand distance vector routing (ORG) is
a very popular routing algorithm. However, it is vulnerable to the well-known
black hole attack, where a malicious node falsely advertises good paths to a
destination node during the route discovery process but drops all packets in
the data forwarding phase. This attack becomes more severe when a group of
malicious nodes cooperate with each other. The proposed mechanism does not apply any
cryptographic primitives on the routing messages. Instead, it protects the
network by detecting and reacting to malicious activities of the nodes.
Simulation results show that the scheme has a significantly high detection rate
with moderate network traffic overhead and computation overhead in the nodes. | A mobile ad hoc network (ORG) is a collection of mobile nodes that
communicate with each other by forming a multi-hop radio network. Security
remains a major challenge for these networks due to their features of open
medium, dynamically changing topologies, reliance on cooperative algorithms,
absence of centralized monitoring points, and lack of clear lines of defense.
Design of an efficient and reliable ORG authentication protocol for such
networks is a particularly challenging task since the nodes are battery-driven
and resource constrained. This paper presents a robust and efficient key
exchange protocol for nodes authentication in a ORG based on multi-path
communication. Simulation results demonstrate that the protocol is effective
even in presence of large fraction of malicious nodes in the network. Moreover,
it has a minimal computation and communication overhead that makes it ideally
suitable for MANETs. | 1 |
The partition function of an oscillator disturbed by a set of electron
particle paths has been computed by a path integral method which makes it possible
to evaluate, at any temperature, the relevant cumulant terms in the series
expansion. The time dependent source current peculiar of the semiclassical
PERSON model induces large electron-phonon anharmonicities on the
phonon subsystem. As a main signature of anharmonicity the phonon heat capacity
shows a peak whose temperature location strongly varies with the strength of
the {\it e-ph} coupling. High energy oscillators are less sensitive to
anharmonic perturbations. | We present a study of the CARDINAL dimensional PERSON model
Hamiltonian by a diagrammatic perturbative method in the weak electron-phonon
coupling regime. Exact computation of both the charge carrier effective mass
and the electron spectral function shows that electrons are good quasiparticles
in the adiabatic and antiadiabatic limits but novel features emerge in the
intermediate regime, where the phonon and electron energies become comparable.
Together with a sizeable mass enhancement we observe, in the latter
regime, a spread of the spectral weight (among several transition peaks)
associated with an increased relevance of multiphonons contributions at larger
{\it e-ph} couplings. Accordingly, electrons cease to be good quasiparticles
and the onset of polaron formation is favoured. | 1
We propose a method to organize experimental data from particle collision
experiments in a general format which can enable a simple visualisation and
effective classification of collision data using machine learning techniques.
The method is based on sparse fixed-size matrices with single- and CARDINAL-particle
variables containing information on identified particles and jets. We
illustrate this method using an example of searches for new physics at the LHC
experiments. | Let G be a Lie group and E be a locally convex topological G-module.
If E is sequentially complete, then E and its space of smooth vectors are
modules for the algebra D(G) of compactly supported smooth functions on PERSON. However, the module multiplication need not be continuous. The pathology can be
ruled out if E is (or embeds into) a projective limit of PERSON.
Moreover, in this case the space of analytic vectors is a module for the
algebra A(G) of superdecaying analytic functions introduced by ORG,
GPE and GPE. We prove that the space of analytic vectors is a
topological A(G)-module if E is a GPE space or, more generally, if every
countable set of continuous seminorms on E has an upper bound. The same
conclusion is obtained if G has a compact Lie algebra.
The question of whether D(G) and A(G) are topological algebras is also
addressed. | 0 |
Many mathematical models utilize limit processes. Continuous functions and
the calculus, differential equations and topology, all are based on limits and
continuity. However, when we perform measurements and computations, we can
achieve only approximate results. In some cases, this discrepancy between
theoretical schemes and practical actions drastically changes the outcomes of
research and decision-making, resulting in uncertainty of knowledge. In the
paper, a mathematical approach to such kind of uncertainty, which emerges in
computation and measurement, is suggested on the base of the concept of a fuzzy
limit. A mathematical technique is developed for differential models with
uncertainty. To take into account the intrinsic uncertainty of a model, it is
suggested to use fuzzy derivatives instead of conventional derivatives of
functions in this model. | It is argued that the existing schemes of fault-tolerant quantum computation
designed for discrete-time models and based on quantum error correction fail
for continuous-time NORP models even with NORP noise. | 0 |
This report presents an empirical evaluation of CARDINAL algorithms for
automatically extracting keywords and keyphrases from documents. The CARDINAL
algorithms are compared using CARDINAL different collections of documents. For each
document, we have a target set of keyphrases, which were generated by hand. The
target keyphrases were generated for human readers; they were not tailored for
any of the CARDINAL keyphrase extraction algorithms. Each of the algorithms was
evaluated by the degree to which the algorithm's keyphrases matched the
manually generated keyphrases. The CARDINAL algorithms were (CARDINAL) the ORG
feature in ORG's Word 97, (CARDINAL) an algorithm based on PERSON
part-of-speech tagger, (CARDINAL) the ORG feature in FAC 97, and (CARDINAL)
ORG's ORG algorithm. For all CARDINAL document collections, ORG's Extractor
yields the best match with the manually generated keyphrases. | Recognizing analogies, synonyms, antonyms, and associations appear to be CARDINAL
distinct tasks, requiring distinct ORG algorithms. In the past, the CARDINAL tasks
have been treated independently, using a wide variety of algorithms. These CARDINAL
semantic classes, however, are a tiny sample of the full range of semantic
phenomena, and we cannot afford to create ad hoc algorithms for each semantic
phenomenon; we need to seek a unified approach. We propose to subsume a broad
range of phenomena under analogies. To limit the scope of this paper, we
restrict our attention to the subsumption of synonyms, antonyms, and
associations. We introduce a supervised corpus-based machine learning algorithm
for classifying analogous word pairs, and we show that it can solve
multiple-choice ORG analogy questions, ORG synonym questions, ORG
synonym-antonym questions, and similar-associated-both questions from cognitive
psychology. | 1 |
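The evaluation described in the keyphrase abstract above — scoring an algorithm by the degree to which its keyphrases match the manually generated ones — can be made concrete with a simple precision/recall measure. Exact lower-cased matching is one plausible choice of "degree of match"; stemming or partial matching would be refinements:

```python
def match_score(extracted, target):
    # Precision/recall of extracted keyphrases against hand-assigned targets,
    # matching on case-insensitive exact phrase equality.
    ex = {p.lower() for p in extracted}
    tg = {p.lower() for p in target}
    hits = len(ex & tg)
    precision = hits / len(ex) if ex else 0.0
    recall = hits / len(tg) if tg else 0.0
    return precision, recall

p, r = match_score(["neural networks", "Learning", "data"],
                   ["machine learning", "neural networks"])
assert p == 1 / 3 and r == 1 / 2
```

Averaging such scores over a document collection gives the per-algorithm comparison the abstract reports across its four extractors.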
NORP integral functional measure of entropy-uncertainty (EF) on
trajectories of a NORP multi-dimensional diffusion process is cut off by
interactive impulses (controls). Each cutoff minimax of EF superimposes and
entangles conjugated fractions in microprocess, enclosing the captured entropy
fractions as source of an information unit. The impulse step-up action launches
the unit formation and step-down action finishes it and brings energy from the
interactive jump. This finite jump transfers the entangled entropy from
uncertain Yes-logic to the certain-information No-logic information unit whose
measuring at end of the cut kills final entropy-uncertainty and limits unit.
The Yes-No logic holds Bit Participator creating elementary information
observer without physical pre-law. Cooperating CARDINAL units in doublet and an
opposite directional information unit in triplet forms minimal stable
structure. Information path functional (ORG) integrates multiple hidden
information contributions along the cutting process correlations in information
units of cooperating doublets-triplets, bound by free information, and enfolds
the sequence of enclosing triplet structures in the information network (IN)
that sequentially decreases the entropy and maximizes information. The IN bound
triplets release free information rising information forces enable attracting
new information unit and ordering it. While ORG collects the information units,
the IN performs logical computing using doublet-triplet code. The IN different
levels unite logic of ORG and macro- information processes,
composing quantum and/or classical computation. | Man-in-the-Middle (MM) is not only a ubiquitous attack pattern in security,
but also an important paradigm of network computation and economics.
Recognizing ongoing GPE-attacks is an important security task; modeling
GPE-interactions is an interesting task for semantics of computation. Traced
monoidal categories are a natural framework for GPE-modelling, as the trace
structure provides a tool to hide what happens *in the middle*. An effective
analysis of what has been traced out seems to require an additional property of
traces, called *normality*. We describe a modest model of network computation,
based on partially ordered multisets (pomsets), where basic network
interactions arise from the monoidal trace structure, and a normal trace
structure arises from an iterative, i.e. coalgebraic structure over terms and
messages used in computation and communication. The correspondence is
established using a convenient monadic description of normally traced monoidal
categories. | 0 |
We propose that a general learning system should have CARDINAL kinds of agents
corresponding to sensory, short-term, and long-term memory that implicitly will
facilitate context-free and context-sensitive aspects of learning. These CARDINAL
agents perform mutually complementary functions that capture aspects of the
human cognition system. We investigate the use of ORG networks for use
as models of short-term and sensory memory. | It is generally accepted that machines can replicate cognitive tasks
performed by conscious agents as long as they are not based on the capacity of
awareness. We consider several views on the nature of subjective awareness,
which is fundamental for self-reflection and review, and present reasons why
this property is not computable. We argue that consciousness is more than an
epiphenomenon and assuming it to be a separate category is consistent with both
quantum mechanics and cognitive science. We speak of CARDINAL kinds of
consciousness, little-C and big-C, and discuss the significance of this
classification in analyzing the current academic debates in the field. The
interaction between the system and the measuring apparatus of the experimenter
is examined both from the perspectives of decoherence and the quantum PERSON
effect. These ideas are used as context to address the question of limits to
machine consciousness. | 1 |
Up to now, information and information processes have had no scientific definitions, nor an implicit origin. They emerge in observing multiple impulses' interactive yes-no actions modeling information ORG. Merging action and reaction, joining probabilistic prior and posterior actions on the edge of the observed predictability, begins a microprocess. Its time of entanglement starts a space interval composing CARDINAL qubits, or a Bit, of reversible logic in the emerging information process. The impulse's interacting action curves the impulse geometry, creating an asymmetrical logic Bit as a logical ORG demon. As the probability approaches one, the attracting interaction captures energy, memorizing the asymmetrical logic in a certain information Bit. Such a Bit is naturally extracted at the minimal-quality energy equivalent ln2, working as a PERSON. The memorized impulse Bit and its free information self-organize multiple ORG in triplets, composing a macroprocess. Each memorized piece of information binds the reversible microprocess with the irreversible information macroprocess along the multi-dimensional observing process. The macroprocess's self-forming triplet units attract new UP through free information. Multiple UP adjoin a hierarchical network (IN) whose free information produces a new UP at a higher-level node and encodes triplets in a multi-level hierarchical organization. The interactive information dynamics assemble the geometrical and information structures of cognition and intelligence in a double-spiral rotating code. The ORG path functional integrates the interactive dynamics in bits. | Compressed Counting (ORG) was recently proposed for approximating the
$\alpha$th frequency moments of data streams, for $MONEY \leq CARDINAL$. Under the
relaxed strict-Turnstile model, ORG dramatically improves the standard algorithm
based on symmetric stable random projections}, especially as $\alpha\to 1$. A
direct application of ORG is to estimate the entropy, which is an important
summary statistic in Web/network measurement and often serves a crucial
"feature" for data mining. The R\'enyi entropy and the NORP entropy are
functions of the $\alpha$th frequency moments; and both approach the FAC
entropy as $\alpha\to 1$. A recent theoretical work suggested using the
$\alpha$th frequency moment to approximate the FAC entropy with
$\alpha=1+\delta$ and very small $MONEY$ (e.g., $<10^{-4}$).
In this study, we experiment using ORG to estimate frequency moments, R\'enyi
entropy, Tsallis entropy, and FAC entropy, on real Web crawl data. We
demonstrate the variance-bias trade-off in estimating FAC entropy and
provide practical recommendations. In particular, our experiments enable us to
draw some important conclusions:
(CARDINAL) As $\alpha\to 1$, ORG dramatically improves {\em symmetric stable random
projections} in estimating frequency moments, R\'enyi entropy, Tsallis entropy,
and FAC entropy. The improvements appear to approach "infinity."
(CARDINAL) Using {\em symmetric stable random projections} and $\alpha = CARDINAL$
with very small $MONEY$ does not provide a practical algorithm because the
required sample size is enormous. | 0 |
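The entropies named in the abstract above are simple functions of the αth frequency moments. As a sketch (using exact counts rather than the sketched moment estimates the paper studies), the standard Rényi and Tsallis formulas both approach the Shannon entropy as α → 1, which is the limit the abstract exploits:

```python
import numpy as np

def entropies_from_moments(counts, alpha):
    """Rényi and Tsallis entropies from the alpha-th frequency moment.

    With p_i = n_i / N:
      Rényi:   H_a = log(sum p_i**alpha) / (1 - alpha)
      Tsallis: T_a = (1 - sum p_i**alpha) / (alpha - 1)
    Both tend to the Shannon entropy as alpha -> 1. The paper estimates the
    moment by sketching; exact counts are used here only for illustration.
    """
    p = np.asarray(counts, dtype=float)
    p /= p.sum()
    s = np.sum(p ** alpha)
    renyi = np.log(s) / (1.0 - alpha)
    tsallis = (1.0 - s) / (alpha - 1.0)
    return renyi, tsallis

counts = [5, 3, 2]
shannon = -sum(q * np.log(q) for q in (0.5, 0.3, 0.2))
r, t = entropies_from_moments(counts, alpha=1.0001)  # both close to Shannon
```

The variance-bias trade-off discussed in the abstract arises because, with sketched moment estimates, the estimator's variance blows up as α → 1 while the bias of stopping at α = 1 + δ shrinks.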
Many relativists have been long convinced that black hole evaporation leads
to information loss or remnants. String theorists have however not been too
worried about the issue, largely due to a belief that the NORP argument for
information loss is flawed in its details. A recently derived inequality shows
that the Hawking argument for black holes with horizon can in fact be made
rigorous. What happens instead is that in string theory black hole microstates
have no horizons. Thus the evolution of radiation quanta with E ~ kT is
modified by order unity at the horizon, and we resolve the information paradox.
We discuss how it is still possible for E >> kT objects to see an approximate
black hole like geometry. We also note some possible implications of this
physics for the early ORG. | We consider the problem of nonlinear dimensionality reduction: given a
training set of high-dimensional data whose ``intrinsic'' low dimension is
assumed known, find a feature extraction map to low-dimensional space, a
reconstruction map back to high-dimensional space, and a geometric description
of the dimension-reduced data as a smooth manifold. We introduce a
complexity-regularized quantization approach for fitting a NORP mixture
model to the training set via a ORG algorithm. Complexity regularization
controls the trade-off between adaptation to the local shape of the underlying
manifold and global geometric consistency. The resulting mixture model is used
to design the feature extraction and reconstruction maps and to define a
NORP metric on the low-dimensional data. We also sketch a proof of
consistency of our scheme for the purposes of estimating the unknown underlying
pdf of high-dimensional data. | 0 |
The article presents the results of a preliminary study of solutions to the recently proposed basic thermodynamic equation for equilibrium in chemical systems, with a focus on chaotic behavior. The classical part of that equation was investigated earlier in a series of papers. In this work, a similarity between the CARDINAL-dimensional logistic map and the non-classical (chaotic) term of the equation is discussed to introduce the problem. The results of this work allow us to evaluate the region where open equilibrium belongs to the basin of the regular attractor and leads to trivial solutions with CARDINAL deviation from true thermodynamic equilibrium, and then to find the ORDINAL bifurcation threshold as a limit of open equilibrium and a limit of the classical region as well. Features of the basic equation are discussed with regard to the relative values of the chaotic and thermodynamic temperatures. The obtained results prompt us to consider the basic equation of the new theory to be the general equation of state of chemical systems. | The paper offers a discrete thermodynamic model of lasers. ORG is an open
system; its equilibrium is based on a balance of CARDINAL thermodynamic forces, CARDINAL related to the incoming pumping power and another to the emitted light. The basic expression for such equilibrium is a logistic map, graphical solutions to which are pitchfork bifurcation diagrams. As the pumping force increases, the relative populations on the ground and lasing branches tend to CARDINAL and unity, respectively. An interesting feature of this model is the line spectrum of the up and down transitions between the branches beyond the bifurcation point. Even in the simple case of a CARDINAL-level laser with CARDINAL possible transition types (up and down), the spectra look like sets of line packets, starting well before the population inversion. This effect is an independent confirmation of PERSON's prohibition on the practical realization of a CARDINAL-level laser. Multilevel lasers may be approached by employing the idea of thermodynamic activity for the emitting atoms. Considering the coefficient of thermodynamic activity of the lasing-level atoms to be proportional to the ratio of lifetimes on the upper and lasing (the CARDINAL) levels, one can derive a new basic map for the multilevel laser system. For a modest ratio of only CARDINAL, spontaneous transitions between levels are pushed to the area beyond population inversion, opening a space for the functioning of the laser. | 1
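Both abstracts above lean on the bifurcation behaviour of the one-dimensional logistic map, with the driving parameter playing the role of the pumping/thermodynamic force. A minimal numerical sketch of the first bifurcation:

```python
def logistic_orbit(r, x0=0.5, transient=500, keep=64):
    """Iterate the logistic map x -> r*x*(1-x) and return post-transient samples.

    Used only to illustrate the pitchfork-bifurcation picture appealed to in
    the abstracts above: below r = 3 the attractor is a single fixed point
    1 - 1/r; just above, a stable 2-cycle appears.
    """
    x = x0
    for _ in range(transient):
        x = r * x * (1.0 - x)
    orbit = []
    for _ in range(keep):
        x = r * x * (1.0 - x)
        orbit.append(x)
    return orbit

fixed = logistic_orbit(2.8)  # single fixed point, before the first threshold
cycle = logistic_orbit(3.2)  # 2-cycle, just beyond the first bifurcation
```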
The development of discursive knowledge presumes the communication of meaning
as analytically different from the communication of information. Knowledge can
then be considered as a meaning which makes a difference. Whereas the
communication of information is studied in the information sciences and
scientometrics, the communication of meaning has been central to PERSON's
attempts to make the theory of autopoiesis relevant for sociology. Analytical
techniques such as semantic maps and the simulation of anticipatory systems
enable us to operationalize the distinctions which ORG proposed as relevant
to the elaboration of ORG's "horizons of meaning" in empirical research:
interactions among communications, the organization of meaning in
instantiations, and the self-organization of interhuman communication in terms
of symbolically generalized media such as truth, love, and power. Horizons of
meaning, however, remain uncertain orders of expectations, and one should
caution against reification from the meta-biological perspective of systems
theory. | In requirements specification, software engineers create a textual
description of the envisioned system as well as develop conceptual models using
such tools as ORG (ORG) and System Modeling Language
(ORG). CARDINAL such tool, called FM, has recently been developed as an extension
of the INPUT-PROCESS-OUTPUT (ORG) model. ORG has been used extensively in many
interdisciplinary applications and is described as one of the most fundamental
and important of all descriptive tools. This paper is an attempt to
understand the ORG in ORG. The fundamental way to describe ORG is in
verbs. This use of language has an important implication for systems modeling
since verbs express the vast range of actions and movements of all things. It
is clear that modeling needs to examine verbs. Accordingly, this paper involves
a study of LANGUAGE verbs as a bridge to learn about processes, not as
linguistic analysis but rather to reveal the semantics of processes,
particularly the CARDINAL verbs that form the basis of FM states: create, process,
receive, release, and transfer. The paper focuses on verb classification, and
specifically on how to model the action of verbs diagrammatically. From the
linguistics point of view, according to some researchers, further exploration
of the notion of verb classes is needed for real-world tasks such as machine
translation, language generation, and document classification. Accordingly,
this non-linguistics study may benefit linguistics. | 0 |
Feature Markov Decision Processes (PhiMDPs) are well-suited for learning
agents in general environments. Nevertheless, unstructured (Phi)MDPs are
limited to relatively simple environments. Structured MDPs like ORG (DBNs) are used for large-scale real-world problems. In this
article I extend PhiMDP to PhiDBN. The primary contribution is to derive a cost
criterion that allows one to automatically extract the most relevant features from
the environment, leading to the "best" DBN representation. I discuss all
building blocks required for a complete general learning algorithm. | The provably asymptotically fastest algorithm within a factor of CARDINAL for
formally described problems will be constructed. The main idea is to enumerate
all programs provably equivalent to the original problem by enumerating all
proofs. The algorithm could be interpreted as a generalization and improvement
of PERSON search, which is, within a multiplicative constant, the fastest
algorithm for inverting functions. PERSON's speed-up theorem is avoided by taking
into account only programs for which a correctness proof exists. Furthermore,
it is shown that the fastest program that computes a certain function is also
one of the shortest programs provably computing this function. To quantify this
statement, the definition of NORP complexity is extended, and CARDINAL new
natural measures for the complexity of a function are defined. | 1 |
This paper presents a methodology for encoding information in valuations of a discrete lattice with some translationally invariant constraints in an asymptotically optimal way. The method is based on finding a statistical description of such valuations and turning it into a statistical algorithm, which allows one to deterministically construct a valuation with given statistics. NORP statistics allow generating valuations with a uniform distribution; this way we get the maximum information capacity. It will be shown that we can reach the optimum for CARDINAL-dimensional models using the maximal entropy random walk, and that in the general case we can practically get as close to the capacity of the model as we want (found numerically: CARDINAL bit/node lost for FAC). A simpler alternative to the arithmetic coding method will also be presented, which can be used as a cryptosystem and a data correction method too. | A research programme is set out for developing the use of high-level methods
for quantum computation and information, based on the categorical formulation
of ORG introduced by the author and PERSON. | 0 |
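The maximal entropy random walk invoked in the first abstract above has a closed form on a finite graph: given the adjacency matrix's dominant eigenpair, the transition probabilities maximize the entropy production rate to log of the dominant eigenvalue. A small sketch (the lattice/constraint encoding itself is beyond this snippet):

```python
import numpy as np

def merw_transition_matrix(A):
    """Maximal Entropy Random Walk transition probabilities on a graph.

    For a symmetric adjacency matrix A with dominant eigenvalue lam and
    Perron eigenvector psi, MERW uses
        P[i, j] = A[i, j] * psi[j] / (lam * psi[i]),
    whose entropy rate log(lam) matches the combinatorial capacity of the
    constrained model -- the mechanism the abstract uses for CARDINAL-dimensional
    models.
    """
    A = np.asarray(A, dtype=float)
    vals, vecs = np.linalg.eigh(A)        # ascending eigenvalues; A symmetric
    lam = vals[-1]
    psi = np.abs(vecs[:, -1])             # Perron eigenvector, strictly positive
    P = A * psi[None, :] / (lam * psi[:, None])
    return P, lam

# Path graph on 4 nodes: a toy constrained "lattice"; MERW biases the walk
# toward the middle nodes, unlike the generic (degree-based) random walk.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]])
P, lam = merw_transition_matrix(A)
```

The rows of P sum to one by the eigenvector equation, and for the 4-node path the dominant eigenvalue is the golden ratio.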
We define a physically reasonable mass for an asymptotically ORG
(ORG) manifold which is uniquely defined in the case of a normalized
representation. | The aim is formal principles of the origin of information and of the information process creating an information observer that self-creates information in interactive observations. The interactive phenomenon creates the Yes-No actions of information ORG in its information observer. Information emerges from an interacting random field of NORP probabilities, which links PERSON law probabilities and NORP probabilities observing a PERSON diffusion process by probabilistic CARDINAL-1 impulses. Each No-0 action cuts the maximum of the impulse's minimal entropy, while the following Yes-1 action transfers the maximum between impulses, performing the dual principle of converting process entropy to information. Merging Yes-No actions generates a microprocess within the bordered impulse, producing a Bit with free information when the microprocess probability approaches CARDINAL. Interacting bits memorize free information, which attracts multiple Bits to a moving macroprocess, self-joining triplet macrounits. Memorized information binds the reversible microprocess with the irreversible macroprocess. The observation converts the cutting entropy to information macrounits. The macrounits logically self-organize information networks, encoding the units in geometrical structures enclosing a triplet code. Multiple INs bind their ending triplets, enclosing the observer's information cognition and intelligence. The observer's cognition assembles common units through multiple attractions and resonances in forming the IN triplet hierarchy, which accepts only units that each IN node recognizes. The maximal number of accepted triplet levels in multiple INs measures the observer's maximum comparative information intelligence. The observation process carries probabilistic and certain wave functions, which self-organize the hierarchical space structures. These information regularities create integral logic and intelligence that self-requests needed information. | 0
In this paper we demonstrate that it is possible to manage intelligence in
constant time as a pre-process to information fusion through a series of
processes dealing with issues such as clustering reports, ranking reports with
respect to importance, extraction of prototypes from clusters and immediate
classification of newly arriving intelligence reports. These methods are used
when arriving intelligence reports concern different events that should be
handled independently and it is not known a priori to which event each
intelligence report is related. We use clustering that runs as a back-end
process to partition the intelligence into subsets representing the events, and
in parallel, a fast classification that runs as a front-end process in order to
put the newly arriving intelligence into its correct information fusion
process. | In this paper we develop methods for selection of templates and use these
templates to recluster an already performed ORG clustering taking
into account intelligence to template fit during the reclustering phase. By
this process the risk of erroneous force aggregation based on some misplaced
pieces of evidence from the ORDINAL clustering process is greatly reduced.
Finally, a more reliable force aggregation is performed using the result of the
ORDINAL clustering. These steps are taken in order to maintain most of the
excellent computational performance of ORG clustering, while at the
same time improve on the clustering result by including some higher relations
among intelligence reports described by the templates. The new improved
algorithm has a computational complexity of O(n**3 ORG) compared to PERSON) of standard PERSON clustering using ORG spin mean field
theory. | 1 |
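The constant-time front-end step described in the first abstract above (routing a newly arrived report into the right fusion process) can be sketched as nearest-prototype classification: with a fixed number of prototypes extracted by the back-end clustering, each arriving report is assigned in O(k) time, independent of how many reports were already processed. The feature vectors and prototypes here are hypothetical placeholders.

```python
def nearest_prototype(report_vec, prototypes):
    """Assign a newly arrived report to the closest cluster prototype.

    A hedged sketch of the front-end classification step: prototypes is a
    fixed-size mapping {event_label: prototype_vector}, so the cost per
    report is O(k) regardless of the number of past reports.
    """
    best, best_d = None, float("inf")
    for label, proto in prototypes.items():
        d = sum((a - b) ** 2 for a, b in zip(report_vec, proto))
        if d < best_d:
            best, best_d = label, d
    return best

# hypothetical 2-D feature vectors for two previously clustered events
prototypes = {"event_A": (0.9, 0.1), "event_B": (0.1, 0.8)}
label = nearest_prototype((0.85, 0.2), prototypes)
```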
Many papers have proved the security of quantum key distribution (QKD) systems
in the asymptotic framework, but the degree of security has not been
sufficiently discussed in the finite coding-length framework. However, to
guarantee the security of any implemented QKD system, it is necessary to
evaluate a protocol with a finite coding-length. For this purpose, we derive a tight upper bound on the
eavesdropper's information. This bound is better than existing bounds. We also
obtain the exponential rate of the eavesdropper's information. Further, we
approximate our bound by using the normal distribution. | We discuss secure computation of modular sum when multiple access channel
from distinct players $ORG, \ldots, A_c$ to a ORDINAL party (Receiver) is given.
Then, we define the secure modulo sum capacity as the supremum of the
transmission rate of the modulo sum without leakage of other
information. We derive a useful lower bound on it, which is numerically calculated
under a realistic model that can be realizable as a NORP multiple access
channel. | 1 |
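The goal in the second abstract above (the receiver learns the modular sum but nothing else) is classically achieved by additive masking. A hedged sketch, assuming pre-shared correlated masks that sum to zero mod q (which the abstract's multiple-access channel model would have to provide; the noisy-channel coding itself is not modeled here):

```python
import random

def masked_shares(values, q, seed=0):
    """One-time-pad style masking for a secure modular sum (a sketch).

    Players A_1..A_c hold values x_i and correlated random masks r_i with
    sum(r_i) = 0 mod q. Each transmits x_i + r_i mod q; any single share is
    uniformly distributed, so the receiver learns only the modular sum.
    """
    rng = random.Random(seed)
    masks = [rng.randrange(q) for _ in values[:-1]]
    masks.append((-sum(masks)) % q)          # masks cancel mod q
    return [(x + r) % q for x, r in zip(values, masks)]

q = 97
values = [13, 55, 70]
shares = masked_shares(values, q)
recovered = sum(shares) % q                  # equals sum(values) mod q
```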
CARDINAL of the basic difficulties of machine learning is handling unknown rotations of objects, for example in image recognition. A related problem is evaluating the similarity of shapes, for example of CARDINAL chemical molecules, for which a direct approach requires costly pairwise rotation alignment and comparison. Rotation invariants are useful tools for such purposes, allowing one to extract features describing a shape up to rotation, which can be used, for example, to search for similar rotated patterns, for fast evaluation of shape similarity (e.g. for virtual screening), or for machine learning on features directly describing shape. A standard approach is rotationally invariant cylindrical or spherical harmonics, which can be seen as based on polynomials on a sphere; however, they provide very few invariants, only one per degree of polynomial. We discuss a general approach to constructing arbitrarily large sets of rotation invariants of polynomials, for MONEY in $PERSON up to $O(n^D)$ independent invariants instead of the $O(D)$ offered by standard approaches, possibly also a complete set: providing not only a necessary but also a sufficient condition for differing only by rotation (and reflectional symmetry). | PERSON (ORG) is a promising technique especially for
multimedia data compression, already used in the GPE audio codec and considered for the AV1 video codec. It quantizes vectors from the ORG unit sphere by ORDINAL projecting them to the MONEY-norm unit sphere, then quantizing and encoding them there. This paper shows that the standard radial projection used is suboptimal and proposes to tune its deformations by using a parameterized power projection, $PERSON, instead, where the optimized power $p$ is applied coordinate-wise, usually getting MONEY, a NORP improvement compared to radial projection.
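The coordinate-wise power projection in the second abstract above can be sketched directly; the notation below is ours (the abstract's formula is masked), and the greedy pulse-allocation quantizer is a simplified stand-in for a real PVQ codebook search.

```python
import numpy as np

def power_project_l1(x, p):
    """Coordinate-wise power projection onto the L1 unit sphere.

    Instead of the standard radial projection (p = 1), each coordinate is
    mapped to sign(x_i) * |x_i|**p before L1 normalization -- the tunable
    deformation the abstract proposes (our notation; an assumption).
    """
    y = np.sign(x) * np.abs(x) ** p
    return y / np.abs(y).sum()

def pvq_quantize(u, K):
    """Greedy PVQ-style quantization: place K unit pulses to approximate a
    point u on the L1 sphere with an integer vector k, sum(|k_i|) = K.
    A simplified sketch of the codebook, not a codec implementation."""
    u = np.asarray(u, dtype=float)
    k = np.zeros(u.shape, dtype=int)
    for _ in range(K):
        resid = K * np.abs(u) - np.abs(k)    # where is the gap largest?
        i = int(np.argmax(resid))
        k[i] += int(np.sign(u[i])) if u[i] != 0 else 1
    return k

x = np.array([0.7, -0.2, 0.1])
u = power_project_l1(x, p=0.75)              # deformed projection, |u|_1 = 1
k = pvq_quantize(u, K=8)                     # integer code with 8 pulses
```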
Computer scientists are in the position to create new, free high-quality
journals. So what would it take? | For most of my life, I have earned my living as a computer vision professional busy with image processing tasks and problems. In the computer vision community there is a widespread belief that artificial vision systems faithfully replicate human vision abilities or at least very closely mimic them. It was a great surprise to me when one day I realized that computer and human vision have next to nothing in common. The former is occupied with extensive data processing, carrying out massive pixel-based calculations, while the latter is busy with meaningful information processing, concerned with smart object-based manipulations. And the gap between the CARDINAL is insurmountable. To resolve this confusion, I had to go back and re-evaluate ORDINAL the vision phenomenon itself, and define more carefully what visual information is and how to treat it properly. In this work I have not been, as is usually accepted, biologically inspired. On the contrary, I have drawn my inspiration from a pure mathematical theory, the ORG's complexity theory. The results of my work have already been published elsewhere. So the objective of this paper is to try to apply the insights gained in the course of this enterprise to the more general case of information processing in the human brain and the challenging issue of human intelligence. | 0
This paper is placed at the intersection-point between the study of
theoretical computational models aimed at capturing the essence of genetic
regulatory networks and the field of ORG (or ORG). A model is proposed, with the objective of providing an effective
way to generate arbitrary forms by using evolutionary-developmental techniques.
Preliminary experiments have been performed. | Borderline personality disorder and narcissistic personality disorder are
important nosographic entities and have been the subject of intensive
investigation. The currently prevailing psychodynamic theory for mental
disorders is based on the repertoire of defense mechanisms employed. Another
line of research is concerned with the study of psychological traumas and
dissociation as a defensive response. Both theories can be used to shed light
on some aspects of pathological mental functioning, and have many points of
contact. This work merges these CARDINAL psychological theories, and builds a model
of mental function in a relational context called ORG.
The model, which is enriched with ideas borrowed from the field of computer
science, leads to a new therapeutic proposal for psychological traumas and
personality disorders. | 1 |
Active learning strategies respond to the costly labelling task in
supervised classification by selecting the most useful unlabelled examples for
training a predictive model. Many conventional active learning algorithms focus
on refining the decision boundary, rather than exploring new regions that can
be more informative. In this setting, we propose a sequential algorithm named
ORG that can improve any active learning algorithm by an optimal random
exploration. Experimental results show a statistically significant and
appreciable improvement in the performance of our new approach over the
existing active feedback methods. | The conceptual knowledge framework ORG needs several components for a
successful design. CARDINAL important, but previously overlooked, component is the
central core of ORG. The central core provides a theoretical link between
the ontological specification in ORG and the conceptual knowledge
representation in ORG. This paper discusses the formal semantics and syntactic
styles of the central core, and also the important role it plays in defining
interoperability between ORG, ORG and GPE. | 0 |
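The exploration/refinement trade-off in the active learning abstract above can be sketched with an epsilon-greedy query selector. This is only a stand-in: the abstract's algorithm chooses an *optimal* random exploration schedule, whereas the fixed epsilon here merely illustrates mixing random exploration into uncertainty sampling.

```python
import random

def select_query(pool_probs, epsilon, rng=None):
    """Pick the next unlabelled example to query for a label.

    With probability epsilon, explore a random example from the pool;
    otherwise exploit uncertainty sampling (the example whose predicted
    P(positive) is closest to the 0.5 decision boundary). A hedged sketch,
    not the abstract's algorithm.
    """
    rng = rng or random.Random(0)
    ids = list(pool_probs)
    if rng.random() < epsilon:
        return rng.choice(ids)                       # exploration step
    return min(ids, key=lambda i: abs(pool_probs[i] - 0.5))

pool = {"a": 0.95, "b": 0.52, "c": 0.10}             # hypothetical model scores
choice = select_query(pool, epsilon=0.0)             # pure exploitation
explored = select_query(pool, epsilon=1.0)           # pure exploration
```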
We apply the multilayer bootstrap network (ORG), a recently proposed unsupervised
learning method, to unsupervised speaker recognition. The proposed method ORDINAL
extracts supervectors from an unsupervised universal background model, then
reduces the dimension of the high-dimensional supervectors by multilayer
bootstrap network, and finally conducts unsupervised speaker recognition by
clustering the low-dimensional data. The comparison results with CARDINAL unsupervised
and CARDINAL supervised speaker recognition techniques demonstrate the effectiveness
and robustness of the proposed method. | Multitask clustering tries to improve the clustering performance of multiple
tasks simultaneously by taking their relationship into account. Most existing
multitask clustering algorithms fall into the type of generative clustering,
and none are formulated as convex optimization problems. In this paper, we
propose CARDINAL convex Discriminative Multitask Clustering (DMTC) algorithms to
address the problems. Specifically, we ORDINAL propose a NORP DMTC framework.
Then, we propose CARDINAL convex GPE objectives within the framework. The ORDINAL
one, which can be seen as a technical combination of the convex multitask
feature learning and the convex ORG (M3C),
aims to learn a shared feature representation. The ORDINAL one, which can be
seen as a combination of the convex multitask relationship learning and M3C,
aims to learn the task relationship. The CARDINAL objectives are solved in a uniform
procedure by the efficient cutting-plane algorithm. Experimental results on a
toy problem and CARDINAL benchmark datasets demonstrate the effectiveness of the
proposed algorithms. | 1 |
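The pipeline in the first abstract above (high-dimensional supervectors, then dimension reduction, then clustering) can be illustrated with simple stand-ins: PCA via SVD and plain k-means replace the multilayer bootstrap network and the paper's clustering step. This is an illustrative simplification, not the paper's method.

```python
import numpy as np

def reduce_and_cluster(X, dim, k, iters=50, seed=0):
    """Dimension reduction followed by clustering, mirroring the pipeline
    described above. PCA (via SVD) and plain k-means are stand-ins for the
    multilayer bootstrap network and the final clustering -- an assumption.
    """
    rng = np.random.default_rng(seed)
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    Z = Xc @ Vt[:dim].T                      # low-dimensional embedding
    centers = Z[rng.choice(len(Z), k, replace=False)]
    for _ in range(iters):
        d = ((Z[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
        labels = d.argmin(axis=1)
        for j in range(k):                   # update non-empty clusters
            if np.any(labels == j):
                centers[j] = Z[labels == j].mean(axis=0)
    return labels

# two well-separated synthetic "speakers" in a 5-D feature space
rng = np.random.default_rng(2)
X = np.vstack([rng.normal(0.0, 0.1, (20, 5)),
               rng.normal(3.0, 0.1, (20, 5))])
labels = reduce_and_cluster(X, dim=2, k=2)
```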
In this article, we tentatively assign the $X(3915)$ and $X(4500)$ to be the
ground state and the ORDINAL radial excited state of the
axialvector-diquark-axialvector-antidiquark type scalar $PERSON, respectively, assign the $PERSON to be the ground state
vector-diquark-vector-antidiquark type scalar $cs\bar{c}\bar{s}$ tetraquark
state, and study their masses and pole residues with the ORG sum rules in
detail by calculating the contributions of the vacuum condensates up to
dimension CARDINAL. The numerical results support assigning the $X(3915)$ and
$X(4500)$ to be the ground state and the ORDINAL radial excited state of the
axialvector-diquark-axialvector-antidiquark type scalar $PERSON, respectively, and assigning the $PERSON to be the ground
state vector-diquark-vector-antidiquark type scalar $PERSON. | In this article, we take the point of view that the $D_s(2700)$ is a
tetraquark state, which consists of a scalar diquark and a vector antidiquark,
and calculate its mass with the ORG sum rules. The numerical result indicates
that the mass of the vector charmed ORG state is about
$ORG or $PERSON from different sum
rules, which is MONEYMONEY larger than the experimental data. Such
tetraquark component should be very small in the $PERSON | 1 |
Artificial intelligence has impacted many aspects of human life. This paper
studies the impact of artificial intelligence on economic theory. In particular
we study the impact of artificial intelligence on the theory of bounded
rationality, efficient market hypothesis and prospect theory. | Many academic journals ask their authors to provide a list of CARDINAL to
CARDINAL key words, to appear on the ORDINAL page of each article. Since these key
words are often phrases of CARDINAL or more words, we prefer to call them
keyphrases. There is a surprisingly wide variety of tasks for which keyphrases
are useful, as we discuss in this paper. Recent commercial software, such as
ORG's Word 97 and Verity's Search 97, includes algorithms that
automatically extract keyphrases from documents. In this paper, we approach the
problem of automatically extracting keyphrases from text as a supervised
learning task. We treat a document as a set of phrases, which the learning
algorithm must learn to classify as positive or negative examples of
keyphrases. Our ORDINAL set of experiments applies the C4.5 decision tree
induction algorithm to this learning task. The ORDINAL set of experiments
applies the GenEx algorithm to the task. We developed the ORG algorithm
specifically for this task. The ORDINAL set of experiments examines the
performance of GenEx on the task of metadata generation, relative to the
performance of ORG's Word 97. The ORDINAL and final set of experiments
investigates the performance of GenEx on the task of highlighting, relative to
PERSON's Search 97. The experimental results support the claim that a
specialized learning algorithm (GenEx) can generate better keyphrases than a
general-purpose learning algorithm (C4.5) and the non-learning algorithms that
are used in commercial software (Word CARDINAL and Search 97). | 0 |
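The supervised setup in the second abstract above treats a document as a set of candidate phrases for a learner to classify as keyphrase or non-keyphrase. A hedged sketch of the candidate-generation and feature-extraction step only; the two features here (frequency, relative position of first occurrence) are illustrative stand-ins, as GenEx and C4.5 use richer feature sets.

```python
import re

def candidate_features(text, max_len=3):
    """Turn a document into candidate phrases with simple features.

    Each n-gram (n <= max_len) of the lowercased word stream becomes a
    candidate with its frequency and normalized first-occurrence position --
    the kind of example set a keyphrase classifier is trained on.
    """
    words = re.findall(r"[a-z0-9]+", text.lower())
    feats = {}
    for n in range(1, max_len + 1):
        for i in range(len(words) - n + 1):
            phrase = " ".join(words[i:i + n])
            if phrase not in feats:
                feats[phrase] = {"freq": 0,
                                 "first_pos": i / max(1, len(words))}
            feats[phrase]["freq"] += 1
    return feats

doc = ("Keyphrase extraction is a learning task. "
       "Keyphrase extraction helps indexing.")
feats = candidate_features(doc)
```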
Significant progress has been made in DATE in the study
of combinatorial ORG optimization problems and their associated optimization and
approximate classes, such as ORG, ORG, ORG (or APXP), and ORG. Unfortunately, a
collection of problems that are simply placed inside the P-solvable
optimization class ORG has never been thoroughly analyzed regarding their exact
computational complexity. To improve this situation, the existing framework
based on polynomial-time computability needs to be expanded and further refined
for an insightful analysis of various approximation algorithms targeting
optimization problems within ORG. In particular, we deal with those problems
characterized in terms of logarithmic-space computations and uniform-circuit
computations. We are focused on nondeterministic logarithmic-space (ORG)
optimization problems or ORG problems. Our study covers a wide range of
optimization and approximation classes, dubbed ORG, ORG, ORG, and LSAS, as
well as new classes NORP, ORG, ORG, and DATE, which are founded on uniform
families of NORP circuits. Although many ORG decision problems can be
naturally converted into ORG optimization (ORG) problems, few ORG problems have
been studied vigorously. We thus provide a number of new ORG problems falling
into those low-complexity classes. With the help of NC1 or AC0
approximation-preserving reductions, we also identify the most difficult
problems (known as complete problems) inside those classes. Finally, we
demonstrate a number of collapses and separations among those refined
optimization and approximation classes with or without unproven
complexity-theoretical assumptions. | String theory suggests that black hole microstates are ORG, horizon sized
`fuzzballs', rather than smooth geometries with horizon. Radiation from
fuzzballs can carry information and does not lead to information loss. But if
we let a shell of matter collapse then it creates a horizon, and it seems that
subsequent radiation will lead to information loss. We argue that the
resolution to this problem is that the shell can tunnel to the fuzzball
configurations. The amplitude for tunneling is small because we are relating
CARDINAL macroscopically different configurations, but the number of states that we
can tunnel to, given through the PERSON entropy, is very large. These small
and large numbers can cancel each other, making it possible for the shell to
tunnel into fuzzball states before a significant amount of radiation has been
emitted. This offers a way to resolve the information paradox. | 0 |
The world is witnessing the birth of a revolutionary computing paradigm that
promises to have a profound effect on the way we interact with computers,
devices, physical spaces, and other people. This new technology, called
ubiquitous computing, envisions a world where embedded processors, computers,
sensors, and digital communications are inexpensive commodities that are
available everywhere. This paper presents a comprehensive discussion of the central trends in ubiquitous computing, considering them from technical, social, and economic perspectives. It clearly identifies different application areas
and sectors that will benefit from the potentials of ubiquitous computing. It
also brings forth the challenges of ubiquitous computing that require active
solutions and management. | Merely by existing, all physical systems register information. And by
evolving dynamically in time, they transform and process that information. The
laws of physics determine the amount of information that a physical system can
register (number of bits) and the number of elementary logic operations that a
system can perform (number of ops). The universe is a physical system. This
paper quantifies the amount of information that the universe can register and
the number of elementary operations that it can have performed over its
history. The universe can have performed MONEY ops on
$MONEY bits. | 0 |
Short philosophical essay | Deep neural networks are usually trained with stochastic gradient descent (SGD), which minimizes the objective function using very rough approximations of the gradient, which only average to the real gradient. ORG approaches like momentum or ORG consider only a single direction and do not try to model the distance from an extremum, neglecting valuable information in the calculated sequence of gradients and often stagnating on some suboptimal plateau. ORDINAL order methods could exploit these missed opportunities; however, besides suffering from very large cost and numerical instabilities, many of them are attracted to suboptimal points like saddles due to neglecting the signs of curvatures (the eigenvalues of the GPE).
The saddle-free ORG method (SFN)~\cite{SFN} is a rare example of addressing this issue: it changes saddle attraction into repulsion, and was shown to provide an essential improvement in the final value this way. However, it neglects noise while modelling ORDINAL order behavior, focuses on a PERSON subspace for numerical reasons, and requires a costly eigendecomposition.
Maintaining the advantages of SFN, inexpensive ways of exploiting these opportunities are proposed here. ORDINAL order behavior means linear dependence of the ORDINAL derivative: we can optimally estimate it from a sequence of noisy gradients with least-squares linear regression, here in an online setting with weakening weights on old gradients. A statistically relevant subspace is suggested by ORG of recent noisy gradients; in an online setting it can be maintained by slowly rotating the considered directions toward new gradients, gradually replacing old directions with recent statistically relevant ones. The eigendecomposition can also be performed online, with a regularly performed step of the QR method to maintain a diagonal PERSON. Outside the ORDINAL order modeled subspace we can
simultaneously perform gradient descent. | 0 |
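One ingredient of the approach described above is estimating curvature by online least-squares regression of noisy gradients with weakening weights on old ones. A hedged one-dimensional sketch (the forgetting factor, learning rate, and noise level are illustrative assumptions, not values from the text):

```python
# Online estimate of the second derivative of a 1D objective: regress noisy
# gradients g on parameters theta with exponentially decayed statistics.
import random

random.seed(0)
beta = 0.99                       # forgetting factor for old gradients
Sw = Sx = Sg = Sxx = Sxg = 0.0    # exponentially decayed sufficient statistics

true_lambda = 3.0                 # f(theta) = 0.5 * true_lambda * theta**2
theta = 1.0
for _ in range(100):
    g = true_lambda * theta + random.gauss(0.0, 0.1)   # noisy gradient
    # decay old statistics, then accumulate the new (theta, g) pair
    Sw, Sx, Sg = beta * Sw + 1.0, beta * Sx + theta, beta * Sg + g
    Sxx, Sxg = beta * Sxx + theta * theta, beta * Sxg + theta * g
    theta -= 0.005 * g            # ordinary SGD step alongside the estimation

# weighted least-squares slope = online curvature estimate
denom = Sw * Sxx - Sx * Sx
est_lambda = (Sw * Sxg - Sx * Sg) / denom
print(est_lambda)                 # should land close to true_lambda = 3.0
```

In more dimensions the same regression would be carried out per direction of the slowly rotated subspace, as the abstract suggests.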
In DATE, the notion of quantum polynomial-time computability
has been modeled by ORG machines as well as quantum circuits. Here,
we seek a ORDINAL model, which is a quantum analogue of the schematic (inductive
or constructive) definition of (primitive) recursive functions. For quantum
functions, which map finite-dimensional PERSON spaces to themselves, we
present such a schematic definition, composed of a small set of initial quantum
functions and a few construction rules that dictate how to build a new quantum
function from the existing quantum functions. We prove that our schematic
definition precisely characterizes all functions that can be computable with
high success probabilities on well-formed quantum Turing machines in polynomial
time or equivalently, uniform families of polynomial-size quantum circuits. Our
new, schematic definition is quite simple and intuitive and, more importantly,
it avoids the cumbersome introduction of the well-formedness condition imposed
on a quantum PRODUCT machine model as well as of the uniformity condition
necessary for a quantum circuit model. Our new approach can further open a door
to the descriptional complexity of other functions and to the theory of
higher-type quantum functionals. | Wiring diagrams are given for a quantum algorithm processor in ORG to
compute, in parallel, all divisors of an n-bit integer. Lines required in a
wiring diagram are proportional to ORG, while time is proportional to the
square of n. | 0 |
This article is a semitutorial-style survey of computability logic. An
extended online version of it is maintained at
http://www.csc.villanova.edu/~japaridz/CL/ . | This paper presents a simple unsupervised learning algorithm for classifying
reviews as recommended (thumbs up) or not recommended (thumbs down). The
classification of a review is predicted by the average semantic orientation of
the phrases in the review that contain adjectives or adverbs. A phrase has a
positive semantic orientation when it has good associations (e.g., "subtle
nuances") and a negative semantic orientation when it has bad associations
(e.g., "very cavalier"). In this paper, the semantic orientation of a phrase is
calculated as the mutual information between the given phrase and the word
"excellent" minus the mutual information between the given phrase and the word
"poor". A review is classified as recommended if the average semantic
orientation of its phrases is positive. The algorithm achieves an average
accuracy of PERCENT when evaluated on CARDINAL reviews from PRODUCT, sampled from CARDINAL
different domains (reviews of automobiles, banks, movies, and travel
destinations). The accuracy ranges from PERCENT for automobile reviews to PERCENT for
movie reviews. | 0 |
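The semantic-orientation criterion stated in this abstract can be sketched directly: SO(phrase) is the PMI with "excellent" minus the PMI with "poor". The co-occurrence counts below are invented for illustration; in the paper they come from search-engine hit counts:

```python
# Semantic orientation of a phrase as a PMI difference (toy counts).
import math

count = {"excellent": 120, "poor": 100}          # anchor-word counts (assumed)
near = {("subtle nuances", "excellent"): 12, ("subtle nuances", "poor"): 1,
        ("very cavalier", "excellent"): 1, ("very cavalier", "poor"): 9}

def semantic_orientation(phrase):
    # PMI(p, "excellent") - PMI(p, "poor"); the P(phrase) terms cancel,
    # leaving a single log ratio of counts.
    num = near[(phrase, "excellent")] * count["poor"]
    den = near[(phrase, "poor")] * count["excellent"]
    return math.log2(num / den)

def classify(phrases):
    """Recommended (thumbs up) iff the average semantic orientation > 0."""
    avg = sum(semantic_orientation(p) for p in phrases) / len(phrases)
    return "recommended" if avg > 0 else "not recommended"

print(semantic_orientation("subtle nuances"))    # positive
print(semantic_orientation("very cavalier"))     # negative
print(classify(["subtle nuances", "very cavalier"]))
```

Averaging the orientation over all adjective/adverb phrases of a review, and thresholding at zero, is exactly the review-level decision rule the abstract describes.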
In a Web Advertising Traffic Operation it is necessary to manage the day-to-day trafficking, pacing, and optimization of digital and paid social campaigns. The data analyst on ORG can not only quickly provide answers but also speak the language of the ORG Manager and visually display the discovered process problems. In order to solve a growing number of
complaints in the customer service process, the weaknesses in the process
itself must be identified and communicated to the department. With the help of
WORK_OF_ART data it is possible to identify unwanted loops and
delays in the process. In this paper we propose a process discovery approach based on the ORG technique to automatically discover variations and detect at
ORDINAL glance what the problem is, and undertake corrective measures. | This work attempts to unify CARDINAL domains: ORG for cooperative
control systems and ORG, under the umbrella of
crowdsourcing for information gain on ORG related to different devices (such as PC, PERSON, GPE, ...). This paper proposes a framework for adapting ORG object components for a disaggregated system, which dynamically
composes web pages for different kind of devices including ubiquitous/pervasive
computing systems. It introduces the notion of responsive web design for non-cooperative ORG equilibrium, proposing an algorithm (RE-SAUI) for the
dynamic interface based on the game theory. | 1 |
ORG is a powerful application package for doing mathematics and is
used almost in all branches of science. It has widespread applications ranging
from quantum computation, statistical analysis, number theory, zoology,
astronomy, and many more. PERSON gives a rich set of programming
extensions to its end-user language, and it permits us to write programs in
procedural, functional, or logic (rule-based) style, or a mixture of all CARDINAL.
For tasks requiring interfaces to the external environment, PERSON provides mathlink, which allows mathematica programs to communicate with
external programs written in C, PERSON, ORG, ORG, ORG, PERSON, or other languages.
It has also extensive capabilities for editing graphics, equations, text, etc.
In this article, we explore the basic mechanisms of parallelization of a
mathematica program by sharing different parts of the program into all other
computers available in the network. Doing the parallelization, we can perform
large computational operations within a very short period of time, and
therefore, the efficiency of the numerical works can be achieved. Parallel
computation supports any version of GPE and it also works as well even
if different versions of mathematica are installed in different computers. The
whole operation can run under any supported operating system like Unix,
ORG, ORG, etc. Here we focus our study only for the Unix based
operating system, but this method works as well for all other cases. | We explore electron transport properties in honeycomb lattice ribbons with
zigzag edges coupled to CARDINAL semi-infinite CARDINAL-dimensional metallic electrodes.
The calculations are based on the tight-binding model and the ORG's function
method, which numerically compute the conductance-energy and current-voltage
characteristics as functions of the lengths and widths of the ribbons. Our
numerical results predict that for such a ribbon an energy gap always appears
in the conductance spectrum across the energy E=0. With the increase of the
size of the ribbon, the gap gradually decreases but it never vanishes. This
clearly manifests that a honeycomb lattice ribbon with zigzag edges always
exhibits the semiconducting behavior, and it becomes much more clearly visible
from our presented current-voltage characteristics. | 1 |
We examine the effect of previous history on starting a computation on a
ORG computer. Specifically, we assume that the quantum register has some
unknown state on it, and it is required that this state be cleared and replaced
by a specific superposition state without any phase uncertainty, as needed by
ORG algorithms. We show that, in general, this task is computationally
impossible. | This article presents new properties of the mesh array for matrix
multiplication. In contrast to the standard array that requires 3n-2 steps to
complete its computation, the mesh array requires only 2n-1 steps. Symmetries
of the mesh array computed values are presented which enhance the efficiency of
the array for specific applications. In multiplying symmetric matrices, the
results are obtained in DATE steps. The mesh array is examined for its
application as a scrambling system. | 1 |
Necessary and sufficient conditions are given for the construction of a
hybrid ORG computer that operates on both continuous and discrete quantum
variables. Such hybrid computers are shown to be more efficient than
conventional quantum computers for performing a variety of ORG algorithms,
such as computing eigenvectors and eigenvalues. | A ORG's demon is a device that gets information and trades it in for
thermodynamic advantage, in apparent (but not actual) contradiction to the
ORDINAL law of thermodynamics. ORG-mechanical versions of ORG's demon
exhibit features that classical versions do not: in particular, a device that
gets information about a quantum system disturbs it in the process. In
addition, the information produced by ORG measurement acts as an additional
source of thermodynamic inefficiency. This paper investigates the properties of
quantum-mechanical ORG's demons, and proposes experimentally realizable
models of such devices. | 1 |
Evolvability is the capacity to evolve. This paper introduces a simple
computational model of evolvability and demonstrates that, under certain
conditions, evolvability can increase indefinitely, even when there is no
direct selection for evolvability. The model shows that increasing evolvability
implies an accelerating evolutionary pace. It is suggested that the conditions
for indefinitely increasing evolvability are satisfied in biological and
cultural evolution. We claim that increasing evolvability is a large-scale
trend in evolution. This hypothesis leads to testable predictions about
biological and cultural evolution. | A plasmodium of GPE polycephalum is a very large cell visible to the unaided eye. The plasmodium is capable of distributed sensing, parallel information
processing, and decentralized optimization. It is an ideal substrate for future
and emerging bio-computing devices. We study the space-time dynamics of plasmodium reaction to localised illumination, and provide analogies between propagating
plasmodium and travelling wave-fragments in excitable media. We show how
plasmodium-based computing devices can be precisely controlled and shaped by
planar domains of illumination. | 0 |
A computer program is a set of electronic instructions executed from within
the computer memory by the computer central processing unit. Its purpose is to
control the functionalities of the computer allowing it to perform various
tasks. Basically, a computer program is written by humans using a programming
language. A programming language is the set of grammatical rules and vocabulary
that governs the correct writing of a computer program. In practice, the
majority of the existing programming languages are written in LANGUAGE-speaking
countries and thus they all use the LANGUAGE language to express their syntax
and vocabulary. However, many other programming languages were written in
NORP languages, for instance, the NORP BASIC, ORG, the
PERSON, and the Arabic Loughaty. This paper discusses the design and
implementation of a new programming language, called GPE. It is a
General-Purpose, High-Level, Imperative, Object-Oriented, and Compiled LANGUAGE
programming language that uses the LANGUAGE language as syntax and vocabulary.
The core of GPE is a compiler system made up of CARDINAL components: the LOC, the scanner, the parser, the semantic analyzer, the code generator, and the linker. The experiments conducted have illustrated the
several powerful features of the GPE language including functions,
while-loop, and arithmetic operations. As future work, more advanced features
are to be developed including inheritance, polymorphism, file processing,
graphical user interface, and networking. | This is brief and hopefully friendly, with basic notions, a few different
perspectives, and references with more information in various directions. | 0 |
We discuss the computation of the CARDINAL loop anomalous dimensions for various
operators used in deep inelastic scattering in the ORG and ORG' schemes. In
particular the results for the n = CARDINAL and CARDINAL ORG operators in arbitrary linear
covariant gauge in the RI' scheme are new. | We renormalize various scalar field theories with a $\phi^n$ self interaction
such as $n$ $=MONEY, MONEY in their respective critical dimensions which
are non-integer. The renormalization group functions for the $MONEY symmetric
extensions are also computed. | 1 |
PERSON has developed a general set of evolutionary statistics that quantify
the adaptive component of evolutionary processes. On the basis of these
measures, he has proposed a set of CARDINAL classes of evolutionary system. All artificial life systems examined so far fall into the ORDINAL CARDINAL classes, whereas the biosphere, and possibly the human economy, belongs to the ORDINAL class. The challenge to the artificial life community is to identify exactly what the difference is between these natural evolutionary systems and existing artificial life systems. At ALife VII, I presented a study using an artificial
evolutionary ecology called \EcoLab. PERSON's statistics captured the
qualitative behaviour of the model. \EcoLab{} exhibited behaviour from the
ORDINAL CARDINAL classes, but not class CARDINAL, which is characterised by unbounded growth in
diversity. \EcoLab{} exhibits a critical surface given by an inverse
relationship between connectivity and diversity, above which the model cannot
tarry long. Thus in order to get unbounded diversity increase, there needs to
be a corresponding connectivity reducing (or food web pruning) process. This
paper reexamines this question in light of CARDINAL possible processes that reduce
ecosystem connectivity: a tendency for specialisation and increase in
biogeographic zones through ORG drift. | In an earlier article PERSON, On nonspecific evidence, PERSON. PERSON.
ORG. CARDINAL), CARDINAL-725 (DATE)] we established within ORG theory a
criterion function called the metaconflict function. With this criterion we can
partition into subsets a set of several pieces of evidence with propositions
that are weakly specified in the sense that it may be uncertain to which event
a proposition is referring. Each subset in the partitioning is representing a
separate event. The metaconflict function was derived as the plausibility that
the partitioning is correct when viewing the conflict in PERSON's rule within
each subset as a newly constructed piece of metalevel evidence with a
proposition giving support against the entire partitioning. In this article we
extend the results of the previous article. We will not only find the most
plausible subset for each piece of evidence as was done in the earlier article.
In addition we will specify each piece of nonspecific evidence, in the sense that we find to which events the proposition might be referring, by finding, for every subset, the plausibility that this piece of evidence belongs to that subset. In doing this we will automatically receive an indication that some evidence might
be false. We will then develop a new methodology to exploit these newly
specified pieces of evidence in a subsequent reasoning process. This will
include methods to discount evidence based on their degree of falsity and on
their degree of credibility due to a partial specification of affiliation, as
well as a refined method to infer the event of each subset. | 0 |
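The metaconflict construction in this abstract builds on the conflict term of Dempster's combination rule. A minimal sketch of that underlying operation, combining two basic belief assignments and reading off the conflict mass k (toy numbers; this is not the paper's metaconflict function itself):

```python
# Dempster's rule of combination over frozenset focal elements.
def combine(m1, m2):
    """Return (combined mass function, conflict k) for two mass functions."""
    conflict = 0.0
    combined = {}
    for a, w1 in m1.items():
        for b, w2 in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + w1 * w2
            else:
                conflict += w1 * w2          # mass on empty intersections
    # normalize the non-conflicting mass by 1 - k
    return {s: w / (1.0 - conflict) for s, w in combined.items()}, conflict

A, B = frozenset("A"), frozenset("B")
m1 = {A: 0.8, A | B: 0.2}
m2 = {B: 0.6, A | B: 0.4}
m, k = combine(m1, m2)
print(k)   # conflict mass between the two pieces of evidence
print(m)
```

In the partitioning scheme the abstract describes, k within a subset is reinterpreted as metalevel evidence against the partitioning rather than simply normalized away.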
The meson fields are simulated by quark operators and an effective chiral
theory of mesons is presented. There are spontaneous chiral symmetry breaking
and dynamical chiral symmetry breaking. Theoretical results agree well with the data. | A $U(2)_{L}\times U(2)_{R}$ chiral theory of pseudoscalar, vector, and
axial-vector mesons has been proposed. ORG has been revealed from this theory.
The physical processes of normal parity and abnormal parity have been studied
by using the same lagrangian and the universality of coupling has been
revealed. CARDINAL new mass relations between vector and axial-vector mesons have
been found. PERSON's ORDINAL sum rule and new relations about the amplitude of
$a_{1}$ PERSON are satisfied. KSFR sum rule is satisfied pretty well. The $\rho$
pole in pion form factor has been achieved. The theoretical results of
$MONEY, $\omega\rightarrow ORG, $MONEY
and $\pi\gamma$, $MONEY \rho\nu$, $MONEY DATE,
$\pi^{0}\rightarrow \gamma\gamma$, $PERSON,
$MONEY, $MONEY} GPE,
$f_{1}\rightarrow\eta\pi\pi$, $\rho\rightarrow\eta\gamma$, $MONEY
\rightarrow\eta\gamma$ are in good agreement with data. PERSON's $PERSON
scattering lengths and slopes and $a^{0}_{2}$, $a^{2}_{2}$, and $b^{1}_{1}$
have been obtained. Especially, the $\rho$ resonance in the amplitude
$MONEY of $PERSON scattering has been revealed from this theory. CARDINAL
coefficients of chiral perturbation theory have been determined and they are
close to the values used by chiral perturbation theory. | 1 |
We mathematically model PERSON's principles of symmetric and asymmetric being through the use of an ultrametric topology. We use for this the highly regarded DATE book of this NORP psychiatrist and psychoanalyst (born DATE, died DATE). Such an ultrametric model corresponds to hierarchical
clustering in the empirical data, e.g. text. We show how an ultrametric
topology can be used as a mathematical model for the structure of the logic
that reflects or expresses ORG symmetric being, and hence of the
reasoning and thought processes involved in conscious reasoning or in reasoning
that is lacking, perhaps entirely, in consciousness or awareness of itself. In
a companion paper we study how symmetric (in the sense of ORG)
reasoning can be demarcated in a context of symmetric and asymmetric reasoning
provided by narrative text. | We consider decision problems of rating alternatives based on their pairwise
comparisons according to CARDINAL criteria. Given pairwise comparison matrices for
each criterion, the problem is to find the overall scores of the alternatives.
We offer a solution that involves the minimax approximation of the comparison
matrices by a common consistent matrix of unit rank in terms of the ORG
metric in logarithmic scale. The approximation problem reduces to a
bi-objective optimization problem to minimize the approximation errors
simultaneously for both comparison matrices. We formulate the problem in terms
of tropical (idempotent) mathematics, which focuses on the theory and
applications of algebraic systems with idempotent addition. To solve the
optimization problem obtained, we apply methods and results of tropical
optimization to derive a complete PERSON-optimal solution in a direct explicit
form ready for further analysis and straightforward computation. We then
exploit this result to solve the bi-criteria decision problem of interest. As
illustrations, we present examples of the solution of CARDINAL-dimensional
optimization problems in general form, and of a decision problem with CARDINAL
alternatives in numerical form. | 0 |
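The abstract's criterion, the Chebyshev distance between log-comparisons and log-score differences, is easy to evaluate numerically. A hedged sketch that scores a toy matrix with the row geometric-mean heuristic (used here only as an illustrative candidate, not the paper's tropical derivation) and measures the log-Chebyshev error:

```python
# Log-Chebyshev approximation error of a pairwise comparison matrix by a
# consistent matrix x_i / x_j, with geometric-mean candidate scores.
import math

A = [[1.0, 2.0, 4.0],
     [0.5, 1.0, 2.0],
     [0.25, 0.5, 1.0]]           # a perfectly consistent toy matrix

n = len(A)
# candidate scores: geometric mean of each row
x = [math.prod(row) ** (1.0 / n) for row in A]

def cheb_log_error(A, x):
    """max_{ij} |log a_ij - (log x_i - log x_j)|: the paper's metric."""
    return max(abs(math.log(A[i][j]) - (math.log(x[i]) - math.log(x[j])))
               for i in range(n) for j in range(n))

print(x)
print(cheb_log_error(A, x))      # ~0 for a consistent matrix
```

For the bi-objective setting of the abstract, the same error would be evaluated against both criteria's comparison matrices simultaneously and minimized over the scores x.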
Constraint satisfaction problems (or CSPs) have been extensively studied in,
for instance, artificial intelligence, database theory, graph theory, and
statistical physics. From a practical viewpoint, it is beneficial to
approximately solve those CSPs. When CARDINAL tries to approximate the total number
of truth assignments that satisfy all NORP-valued constraints for
(unweighted) NORP CSPs, there is a known trichotomy theorem by which all
such counting problems are neatly classified into exactly CARDINAL categories
under polynomial-time (randomized) approximation-preserving reductions. In
contrast, we obtain a dichotomy theorem of approximate counting for
complex-weighted NORP CSPs, provided that all complex-valued unary
constraints are freely available to use. It is the expressive power of free
unary constraints that enables us to prove such a stronger, complete
classification theorem. This discovery makes a step forward in the quest for
the approximation-complexity classification of all counting CSPs. To deal with
complex weights, we employ proof techniques of factorization and arity
reduction along the line of solving PERSON problems. Moreover, we introduce a
novel notion of T-constructibility that naturally induces
approximation-preserving reducibility. Our result also gives an approximation
analogue of the dichotomy theorem on the complexity of exact counting for
complex-weighted NORP CSPs. | The article presents a study of rather simple local search heuristics for the
single machine total weighted tardiness problem (ORG), namely hillclimbing
and WORK_OF_ART. In particular, we revisit these approaches
for the ORG as there appears to be a lack of appropriate/challenging
benchmark instances in this case. The obtained results are impressive indeed.
Only a few instances remain unsolved, and even those are approximated within PERCENT
of the optimal/best known solutions. Our experiments support the claim that
metaheuristics for the ORG are very likely to lead to good results, and
that, before refining search strategies, more work must be done with regard to
the proposition of benchmark data. Some recommendations for the construction of
such data sets are derived from our investigations. | 0 |
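The simpler of the two heuristics studied in this abstract can be sketched in a few lines: hillclimbing over job sequences with an adjacent-swap neighborhood, minimizing total weighted tardiness. The instance data below is invented; the paper's benchmark sets and refined strategies are not modeled:

```python
# Hillclimbing for the single machine total weighted tardiness problem.
def twt(seq, p, w, d):
    """Total weighted tardiness of a job sequence."""
    t, total = 0, 0
    for j in seq:
        t += p[j]                           # completion time of job j
        total += w[j] * max(0, t - d[j])    # weighted tardiness
    return total

def hillclimb(seq, p, w, d):
    """Accept strictly improving adjacent swaps until a local optimum."""
    seq = list(seq)
    improved = True
    while improved:
        improved = False
        for i in range(len(seq) - 1):
            cand = seq[:]
            cand[i], cand[i + 1] = cand[i + 1], cand[i]
            if twt(cand, p, w, d) < twt(seq, p, w, d):
                seq, improved = cand, True
    return seq

p = [3, 2, 4, 1]    # processing times
w = [2, 1, 3, 2]    # weights
d = [4, 2, 7, 3]    # due dates
start = [0, 1, 2, 3]
best = hillclimb(start, p, w, d)
print(best, twt(best, p, w, d), twt(start, p, w, d))
```

Simulated annealing, the other heuristic mentioned, differs only in occasionally accepting worsening swaps with a temperature-controlled probability.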
In this article, we take the $Z_c(3900)$ and $PERSON as the ground state
and the ORDINAL radial excited state of the axial-vector tetraquark states with
$PERSON, respectively, and study their masses and pole residues with
the ORG sum rules by calculating the contributions of the vacuum condensates up
to dimension-10 in a consistent way in the operator product expansion. The
numerical result favors assigning the $Z_c(3900)$ and $PERSON as the ground
state and ORDINAL radial excited state of the axial-vector tetraquark states,
respectively. | In this article, we study the MONEYCARDINAL type and $MONEY
CARDINAL type scalar $cs\bar{c}\bar{s}$ tetraquark states with the ORG sum rules by
calculating the contributions of the vacuum condensates up to dimension CARDINAL in a
consistent way. The ground state masses $M_{C\gamma_5\otimes \gamma_5C}=3.89\pm 0.05\,\rm{GeV}$ and $M_{C\otimes C}=5.48\pm0.10\,\rm{GeV}$ support assigning
the $X(3915)$ to be the ground state $C\gamma_5\otimes \gamma_5C$ type
tetraquark state with $PERSON, but do not support assigning the
$PERSON to be the ground state $C\otimes C$ type $PERSON state with $PERSON Then we tentatively assign the $X(3915)$
and $X(4500)$ to be the GPE and MONEY type scalar
$cs\bar{c}\bar{s}$ tetraquark states respectively, and obtain the CARDINAL mass
$M_{\rm CARDINAL and CARDINAL mass $M_{\rm
CARDINAL from the ORG sum rules, which support
assigning the $PERSON to be the GPE $C\gamma_5\otimes \gamma_5C$ type
tetraquark state, but do not support assigning the $X(4500)$ to be the CARDINAL
MONEYC$ type tetraquark state. | 1 |
The problem of statistical learning is to construct an accurate predictor of
a random variable as a function of a correlated random variable on the basis of
an i.i.d. training sample from their joint distribution. Allowable predictors
are constrained to lie in some specified class, and the goal is to approach
asymptotically the performance of the best predictor in the class. We consider
CARDINAL settings in which the learning agent only has access to rate-limited
descriptions of the training data, and present information-theoretic bounds on
the predictor performance achievable in the presence of these communication
constraints. Our proofs do not assume any separation structure between
compression and learning and rely on a new class of operational criteria
specifically tailored to joint design of encoders and learning algorithms in
rate-constrained settings. | We present a CARDINAL-stage quantum cryptographic protocol guaranteeing security
in which each party uses its own secret key. Unlike the BB84 protocol, where
the qubits are transmitted in CARDINAL direction and classical information
exchanged thereafter, the communication in the proposed protocol remains
quantum in each stage. A related system of key distribution is also described. | 0 |
It is hypothesized by some thinkers that benign-looking ORG objectives may
result in powerful ORG drives that may pose an existential risk to human
society. We analyze this scenario and find the underlying assumptions to be
unlikely. We examine the alternative scenario of what happens when universal
goals that are not human-centric are used for designing ORG agents. We follow a
design approach that tries to exclude malevolent motivations from ORG agents,
however, we see that objectives that seem benevolent may pose significant risk.
We consider the following meta-rules: preserve and pervade life and culture,
maximize the number of free minds, maximize intelligence, maximize wisdom,
maximize energy production, behave like human, seek pleasure, accelerate
evolution, survive, maximize control, and maximize capital. We also discuss
various solution approaches for benevolent behavior including selfless goals,
hybrid designs, NORP, universal constraints, semi-autonomy, and
generalization of robot laws. A "prime directive" for ORG may help in
formulating an encompassing constraint for avoiding malicious behavior. We
hypothesize that social instincts for autonomous robots may be effective such
as attachment learning. We mention multiple beneficial scenarios for an
advanced semi-autonomous ORG agent in the near future including space
exploration, automation of industries, state functions, and cities. We conclude
that a beneficial ORG agent with intelligence beyond human-level is possible and
has many practical use cases. | We explore the relations between the zeta distribution and algorithmic
information theory via a new model of the transfer learning problem. The
program distribution is approximated by a zeta distribution with parameter near
$MONEY. We model the training sequence as a stochastic process. We analyze the
upper temporal bound for learning a training sequence and its entropy rates,
assuming an oracle for the transfer learning problem. We argue from empirical
evidence that power-law models are suitable for natural processes. CARDINAL sequence models are proposed. The random typing model is like no-free-lunch, where transfer learning does not work. The ORG process independently samples programs from the zeta distribution. A model of common sub-programs inspired by genetics
uses a database of sub-programs. An evolutionary zeta process samples mutations
from ORG distribution. The analysis of stochastic processes inspired by evolution suggests that ORG may be feasible in nature, countering no-free-lunch
sort of arguments. | 1 |
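The zeta (Zipf) program distribution at the center of this abstract is simple to construct. A hedged sketch with a truncated support and an exponent near 1, both illustrative assumptions rather than values fixed by the text:

```python
# Truncated zeta "program distribution": p(k) proportional to k**(-s).
import random

def zeta_pmf(s, kmax):
    """Normalized probabilities p(k) for program indices k = 1..kmax."""
    weights = [k ** (-s) for k in range(1, kmax + 1)]
    z = sum(weights)
    return [w / z for w in weights]

s, kmax = 1.1, 1000
p = zeta_pmf(s, kmax)

# power-law signature: the successive ratio p(1)/p(2) equals 2**s
print(p[0] / p[1])
# drawing program indices from the distribution (short programs dominate)
print(random.choices(range(1, kmax + 1), weights=p, k=5))
```

Sampling programs this way gives a concrete version of the abstract's "zeta process", with the heavy tail responsible for the power-law behavior it discusses.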
In this paper we examine the possibility of testing the equivalence
principle, in its weak form, by analyzing the orbital motion of a pair of
artificial satellites of different composition moving along orbits of identical
shape and size in the gravitational field of LOC. It turns out that the
obtainable level of accuracy is, realistically, of the order of CARDINAL^-10 or
slightly better. It is limited mainly by the fact that, due to the unavoidable
orbital injection errors, it would not be possible to insert the satellites in
orbits with exactly the same radius and that such difference could be known
only with a finite precision. The present-day level of accuracy, obtained with
torsion balance LOC-based measurements and the analysis of LOC-Moon motion
in the gravitational field of LOC with ORG technique, is of
the order of CARDINAL^-13. The proposed space-based missions ORG, \muSCOPE, GG and
SEE aim to reach a CARDINAL^-15-10^-18 precision level. | Long-range constraints on the GPE radius of curvature L in the
ORG (ORG) braneworld model are inferred from orbital motions of well
known artificial and natural bodies. Thus, they do not rely upon more or less
speculative and untested theoretical assumptions, contrary to other long-range
ORG tests proposed in astrophysical scenarios in which many of the phenomena
adopted may depend on the system's composition, formation and dynamical history
as well. The perihelion precession of ORG and its radiotechnical ranging
from the LOC yield L <= QUANTITY. Tighter bounds come from the perigee
precession of the PERSON, from which it can be inferred L <= CARDINAL m. The best
constraints (L <= CARDINAL m) come from the Satellite-to-Satellite Tracking (SST)
range of the GRACE A/B spacecraft orbiting the LOC: a proposed follow-on of
such a mission, implying a sub-nm s^-1 range-rate accuracy, may constrain L at
the \sim QUANTITY level. Weaker constraints come from the double pulsar system (L <=
QUANTITY) and from the main sequence star CARDINAL orbiting the compact object in
ORG* (L <= CARDINAL - 8.8 AU). Such bounds on the length L, which need not
be identified with the GPE radius of curvature of the ORG model,
naturally translate into constraints on an, e.g., universal coupling parameter
K of the r^-3 interaction. GRACE yields K <= CARDINAL^CARDINAL m^5 s^-2. | 1 |
PERSON has written a wonderful book about visualization that makes our
field of scientometrics accessible to much larger audiences. The book is to be
read in relation to the ongoing series of exhibitions entitled "Places &
Spaces: Mapping Science" currently touring the world. The book also provides
the scholarly background to the exhibitions. It celebrates scientometrics as
the discipline in the background that enables us to visualize the evolution of
knowledge as the acumen of human civilization. | The Actor Network represents heterogeneous entities as actants (GPE et
GPE, DATE; DATE). Although computer programs for the visualization of social
networks increasingly allow us to represent heterogeneity in a network using
different shapes and colors for the visualization, hitherto this possibility
has scarcely been exploited (NORP and GPE, DATE). In this contribution to
the PERSON, I study the question of what heterogeneity can add
specifically to the visualization of a network. How does an integrated network
improve on the CARDINAL-dimensional ones (such as co-word and co-author maps)? The
oeuvre of PERSON is used as the case materials, that is, his CARDINAL papers
which can be retrieved from the (Social) Science Citation Index since DATE. | 1 |