<TeXmacs|1.99.16>

<project|rehash.tm>

<style|<tuple|tmmanual|british>>

<\body>
  <section|Maintaining information on active downloads>

  This module is defined in <scm|(remirror download-manager)>. Some
  information on terminology: a download (requested by some client) can lead
  to several subdownloads (initiated by <samp|remirror>) each trying to
  download the requested resource from a different source or with a different
  method. A sidedownload is remirror trying to download something else to aid
  the download itself \U for example, discovering GNUnet FS URIs through
  <samp|rehash>.

  <subsection|Why?>

  If a download location from a suspicious source (e.g. <samp|rehash>) is
  used, it is quite possible some jerk (or a buggy implementation) has
  inserted a download source that doesn't point to the right content, so the
  download may need to be retried with a more reliable source (though
  possibly one that is more centralised, has no automatic CDN of sorts, or is
  disliked for some other reason). Also, it might not be obvious whether a
  download via a decentralised system will succeed in reasonable time if the
  original uploader is down, in which case a fallback may be useful.

  <subsection|Creating a download context>

  When creating a download context, it needs to be specified when the
  downloaded file should be considered correct. These constructors don't
  include any subdownloads or sidedownloads by default, as that is a matter
  of policy, not mechanism.

  <\explain>
    <scm|(make-download-context/hash <scm-arg|hash>)><explain-synopsis|Download
    by hash>
  <|explain>
    Make a download context that considers downloaded resources matching the
    hash <var|hash> acceptable. <var|hash> will be added to the list of known
    hashes.
  </explain>

  <\explain>
    <scm|(make-download-context/nar <scm-arg|arg>
    ...)><explain-synopsis|Download a nar>
  <|explain>
    The arguments <scm|arg ...> must be the four-tuple computed by
    <scm|recognise-url?> for Nix substitute URLs. Any hashes specified in the
    corresponding narinfo will be added as known hashes.

    <todo|adjust recognise-url.scm to include information on the location of
    the Nar (including the hostname)>
  </explain>

  <\explain>
    <scm|(make-download-context/url <scm-arg|url>)><explain-synopsis|Download
    from a URL>
  </explain|Consider any successful download from <var|url> to be
  authoritative. <var|url> will be added to the list of known URLs.>
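  As an illustration, a context for a hash-verified download could be created
  as follows. This is only a sketch: <scm|some-narhash> is a hypothetical
  variable assumed to be bound to a hash in whatever representation the
  module expects.

  <\scm-code>
    (use-modules (remirror download-manager))

    ;; Accept any downloaded resource whose contents match some-narhash;
    ;; the hash is also added to the context's list of known hashes.
    (define ctx (make-download-context/hash some-narhash))
  </scm-code>

  Remember that, as a matter of policy, no subdownloads or sidedownloads are
  attached yet at this point.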

  <subsection|The context fiber>

  Practically all requests are handled by communicating with a fiber. The
  fiber of a context can be started with <scm|(start-download-fiber!
  <scm-arg|ctx>)> and stopped with <scm|(stop-download-fiber!
  <scm-arg|ctx>)>. It is currently not possible to restart the context fiber.
  Unless specified otherwise, all operations on a context are synchronous,
  i.e. they require the fiber to be active. In the Guile implementation,
  forgotten contexts are deleted after garbage collection via guardians, but
  this shouldn't be relied upon.

  <scm|stop-download-fiber!> can be used outside a fiber context, and will
  not block. Stopping a context fiber twice has no additional effect.
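  A minimal sketch of the fiber life cycle, assuming <scm|ctx> was created
  with one of the constructors above:

  <\scm-code>
    ;; Start the context fiber; synchronous operations on ctx
    ;; require the fiber to be active.
    (start-download-fiber! ctx)

    ;; ... interact with the context ...

    ;; Stop the fiber. This is safe outside a fiber context, never
    ;; blocks, and is idempotent; the fiber cannot be restarted.
    (stop-download-fiber! ctx)
  </scm-code>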

  <subsection|Default sub- and sidedownloads>

  All operations defined here are asynchronous, and performing them multiple
  times has no additional effect \U they are implemented with condition
  variables under the hood.

  <\explain>
    <scm|(try/rehash! <scm-arg|ctx>)>
  </explain|Try to discover a hash suitable for GNUnet FS, when a hash
  unsuitable for GNUnet FS is known and the GNUnet FS URI isn't known yet.>

  <\explain>
    <scm|(try/gnunet! <scm-arg|ctx>)>
  <|explain>
    When a hash suitable for GNUnet FS becomes known (possibly from an
    unreliable source), try to download from that. Downloads from reliable
    sources are preferred.
  </explain>

  <\explain>
    <scm|(try/http-etc! <scm-arg|ctx>)>
  </explain|When a URL for a client-server download becomes known, try to
  download from that. This includes the authoritative URL from
  <scm|make-download-context/url>.>

  <\explain>
    <scm|(try/narinfo-lookup! <scm-arg|ctx>)>
  <|explain>
    Try to download the narinfo corresponding to <var|ctx> (possibly cached),
    then add all relevant hashes to the download context.
  </explain>
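  As these operations are asynchronous and idempotent, a policy layer can
  simply request all of them it cares about; the order should not matter. A
  plausible (but not prescribed) combination for a Nix substitute:

  <\scm-code>
    ;; Look up the narinfo and record its hashes ...
    (try/narinfo-lookup! ctx)
    ;; ... ask rehash peers for a GNUnet FS URI matching a known hash ...
    (try/rehash! ctx)
    ;; ... download over GNUnet FS once a suitable URI is known ...
    (try/gnunet! ctx)
    ;; ... and fall back to plain client-server downloads.
    (try/http-etc! ctx)
  </scm-code>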

  <subsection|Additional URLs>

  These operations are asynchronous as well. They have no effect if the
  context is closed.

  <\explain>
    <scm|(suggest-urls! <scm-arg|ctx> <scm-arg|url>
    ...)><explain-synopsis|Suggest some URLs to download from>
  </explain|Add <var|url> <text-dots> to the list of known URLs. This can
  wake up <scm|try/http-etc!>, but otherwise doesn't initiate a subdownload.
  The URLs won't be sniffed for hashes; that's something the caller should
  do.>
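  For example, a caller that discovered mirror locations elsewhere might
  register them like this (the URLs are purely illustrative):

  <\scm-code>
    ;; Register two hypothetical mirrors; this may wake up
    ;; try/http-etc!, but does not itself start a subdownload.
    (suggest-urls! ctx
                   "https://mirror.example.org/foo.nar.xz"
                   "https://other.example.net/foo.nar.xz")
  </scm-code>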

  <subsection|Priorities, time-out and performance policies>

  <todo|TODO>

  <subsection|Download methods>

  <samp|remirror> can download from a few different sources, each
  fundamentally different in some way. Below is a list of the methods as used
  by <scm|(remirror download-manager)> and their particularities.

  <\description>
    <item*|url>For downloading over client-server connections, with protocols
    like <samp|http>, <samp|https> <text-dots> If we're asked to download
    over an <samp|http> URL, in practice the resource at the URL will either
    have the correct hash, or not be found at all, in which case this is
    communicated to the client. However, if the hash doesn't check out, we
    have no idea where the fault lies, so the download should be restarted
    (presumably with a different URL).

    <item*|gnunet>Downloading over GNUnet (the logic would be mostly the same
    for BitTorrent). There isn't an obvious way to know whether the resource
    is inaccessible, but some heuristics are possible (timeouts, minimal
    bytes/second <text-dots>). In contrast to <samp|http>, it is feasible to
    check whether a part of the download is correct \U in fact, this is a
    design consideration of file-sharing systems like BitTorrent and GNUnet,
    where not all peers are reliable. It's also trivial to resume
    half-finished downloads.

    <item*|rehash>This method doesn't really download a resource per se;
    rather, it can try to map an arbitrary hash (the NarHash in the narinfo)
    to the corresponding GNUnet FS URI (or in principle, a suitable string
    for IPFS or BitTorrent).

    As discussed earlier in various places, this is theoretically impossible
    to compute without having the full resource in advance, so we ask other
    peers whether they have a mapping and publish such a mapping ourselves
    when we finish a download. Hopefully peers will be honest in practice.

    If they are not, then including a GNUnet FS URI in the narinfo becomes a
    priority.
  </description>

  <subsection|Download progress>

  <scm|(resume-download! ctx method)>

  <scm|(pause-download! ctx method)>

  <scm|(kill-download! ctx method)>

  If <scm|method> is <scm|#t>, all methods are resumed, paused or killed,
  respectively. If a download via a method is killed, that part of the
  download is considered failed and won't be retried automatically (not even
  by <scm|resume-download!>).

  <scm|(method-info ctx method)>

  This returns an alist. The value at key \<#2018\><scm|status>\<#2019\> is:

  <\description>
    <item*|<scm|need-info>>For the <samp|rehash> method: no authoritative
    hash is known, so figuring out the GNUnet FS URI will have to wait for
    now. For the <samp|url> method: no possible URLs are known at the moment.
    For the <samp|gnunet> method: no GNUnet FS URI is known.

    <item*|<scm|authoritative-failed>>A download from some download location
    that was thought to be authoritative actually failed \U not a simple 404
    not found or an internal server error, but the hash didn't check out. In
    this case, the subdownload won't continue.

    <item*|<scm|in-progress>>Busy downloading from a source.

    <item*|<scm|verifying>>Busy verifying whether the subdownload matches the
    originally requested hash.

    <item*|<scm|completed>>Subdownload is completed and verified.

    <item*|something else>To be defined later.
  </description>
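  As a sketch, a client could inspect the state of a subdownload as follows.
  This assumes methods are named by symbols such as <scm|gnunet> and that the
  alist may contain further keys, to be defined later:

  <\scm-code>
    ;; Coarsely classify the gnunet subdownload by its status.
    (let ((info (method-info ctx 'gnunet)))
      (case (assq-ref info 'status)
        ((need-info) 'waiting-for-uri)
        ((in-progress verifying) 'busy)
        ((completed) 'done)
        (else 'other)))
  </scm-code>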

  <todo|graph of possible transitions>

  <todo|more info>

  <subsection|Implementing new subdownloads and sidedownloads>

  <todo|todo>
</body>

<\initial>
  <\collection>
    <associate|save-aux|false>
  </collection>
</initial>