zhaoxiang committed on
Commit
588adcc
1 Parent(s): aa7ef71

update examples

Files changed (1)
app.py +3 -7
app.py CHANGED
@@ -3,18 +3,14 @@ import gradio as gr
 from ez_cite import ez_cite
 
 
 
 
 
-example1 = r"""Instead of measuring physical systems and then processing the classical measurement outcomes to infer properties of the physical systems, quantum sensors will eventually be able to transduce quantum information in physical systems directly to a quantum memory, where it can be processed by a quantum computer."""
+example1 = r"""In the current Noisy Intermediate-Scale Quantum (NISQ) era, a few methods have been proposed to construct useful quantum algorithms that are compatible with mild hardware restrictions. Most of these methods involve the specification of a quantum circuit Ansatz, optimized in a classical fashion to solve specific computational tasks.
 
-example2 = r"""Recurrent neural networks, long short-term memory and gated recurrent neural networks in particular, have been firmly established as state of the art approaches in sequence modeling and transduction problems such as language modeling and machine translation. Numerous efforts have since continued to push the boundaries of recurrent language models and encoder-decoder architectures.
+Next to variational quantum eigensolvers in chemistry and variants of the quantum approximate optimization algorithm, machine learning approaches based on such parametrized quantum circuits stand as some of the most promising practical applications to yield quantum advantages."""
 
-Recurrent models typically factor computation along the symbol positions of the input and output sequences. Aligning the positions to steps in computation time, they generate a sequence of hidden states $h_t$, as a function of the previous hidden state $h_{t-1}$ and the input for position $t$. This inherently sequential nature precludes parallelization within training examples, which becomes critical at longer sequence lengths, as memory constraints limit batching across examples. Recent work has achieved significant improvements in computational efficiency through factorization tricks and conditional computation, while also improving model performance in case of the latter. The fundamental constraint of sequential computation, however, remains.
-
-Attention mechanisms have become an integral part of compelling sequence modeling and transduction models in various tasks, allowing modeling of dependencies without regard to their distance in the input or output sequences. In all but a few cases, however, such attention mechanisms are used in conjunction with a recurrent network.
-
-In this work we propose the Transformer, a model architecture eschewing recurrence and instead relying entirely on an attention mechanism to draw global dependencies between input and output. The Transformer allows for significantly more parallelization and can reach a new state of the art in translation quality after being trained for as little as twelve hours on eight P100 GPUs."""
+example2 = r"""Instead of measuring physical systems and then processing the classical measurement outcomes to infer properties of the physical systems, quantum sensors will eventually be able to transduce quantum information in physical systems directly to a quantum memory, where it can be processed by a quantum computer."""
 
 
 
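For context, a minimal sketch of how these example strings might be wired into the Gradio demo. The rest of app.py is not shown in this commit, so the gr.Interface wiring below and the ez_cite signature (plain text in, cited text out) are assumptions, not the repository's actual code; the example strings are abbreviated here.

import gradio as gr

from ez_cite import ez_cite  # assumed signature: ez_cite(text: str) -> str

# Abbreviated stand-ins for the example strings added in this commit.
example1 = r"""In the current Noisy Intermediate-Scale Quantum (NISQ) era, ..."""
example2 = r"""Instead of measuring physical systems ..."""

# Hypothetical wiring: expose ez_cite as a text-to-text demo, with the
# two snippets preloaded as clickable examples in the UI.
demo = gr.Interface(
    fn=ez_cite,
    inputs=gr.Textbox(lines=12, label="Draft text"),
    outputs=gr.Textbox(label="Text with citations"),
    examples=[[example1], [example2]],
)

if __name__ == "__main__":
    demo.launch()

Under this reading, the commit simply swaps the preloaded demo inputs: the Transformer abstract is dropped, the quantum-sensing passage moves from example1 to example2, and a NISQ/parametrized-quantum-circuits passage becomes the new example1.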