<TeXmacs|1.99.16>

<project|rehash.tm>

<style|tmmanual>

<\body>
  Presuming everything goes according to plan:

  <paragraph|Source tarballs>Relevant source tarballs would become available
  in GNUnet FS over time, which should justify including GNUnet FS URIs in
  the origin of pre-existing Guix packages (in addition to more centralised
  <abbr|http> URIs). As GNUnet becomes widely installed on GNUish systems,
  and as websites and pinning over GNUnet FS à la IPFS become popular (with
  some gateways on the \<#2018\>classical\<#2019\> web for convenience),
  maintainers can consider publishing the source code of their software on
  GNUnet FS.

  If a packager is worried about source code disappearing from GNUnet FS
  when no one is pinning it anymore, there is a simple solution: software
  archives! Plenty of software is mirrored and archived to some extent
  today (think Software Heritage, <slink|https://ftpmirror.gnu.org>). If
  the code cannot be downloaded from GNUnet FS, a reliable centralised
  archive such as Software Heritage or GNU's mirrors makes a fine
  fall-back.
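  That fall-back order can be sketched as follows. This is an illustrative
  sketch only; <samp|fetch_with_fallback> and <samp|DownloadError> are
  hypothetical names, not an actual Guix or GNUnet API.

  ```python
  class DownloadError(Exception):
      """Raised when a source cannot be retrieved."""

  def fetch_with_fallback(fetchers):
      """Try each zero-argument fetcher in order of preference, e.g.
      GNUnet FS first, then Software Heritage, then a GNU mirror.
      Return the result of the first one that succeeds."""
      failures = []
      for fetch in fetchers:
          try:
              return fetch()
          except DownloadError as error:
              failures.append(error)
      raise DownloadError(f"all {len(failures)} sources failed")
  ```

  The point of the sketch: the P2P source is merely the <em|preferred>
  source, and losing it degrades availability gracefully rather than
  fatally.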

  Where does rehash-remirror fit into this? In the transitional phase,
  rehash-remirror can act as a proxy to Guix, transparently downloading
  tarballs via GNUnet FS instead of <samp|http> when possible, and
  downloading via <samp|http> and inserting the result into GNUnet+rehash
  when not. rehash itself is used for converting between hash types.
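  As a rough illustration of that transitional behaviour (every helper
  callable here is a hypothetical stand-in injected as a parameter; the
  real rehash-remirror interface may look quite different):

  ```python
  def proxy_fetch(http_uri, sha256, *, lookup_fs_uri, download_fs,
                  download_http, insert_fs):
      """Serve a tarball, preferring GNUnet FS over plain http.

      lookup_fs_uri: rehash-style mapping from a sha256 to a GNUnet FS
                     URI, or None if no mapping is known yet.
      download_fs:   fetch bytes from GNUnet FS, or None on failure.
      download_http: fetch bytes over http.
      insert_fs:     publish bytes on GNUnet FS and record the mapping.
      """
      fs_uri = lookup_fs_uri(sha256)
      if fs_uri is not None:
          data = download_fs(fs_uri)
          if data is not None:
              return data                  # served via GNUnet FS
      data = download_http(http_uri)       # centralised fall-back
      insert_fs(data)                      # seed the P2P network
      return data
  ```

  Note that the http fall-back path also <em|feeds> the P2P network, so
  each cache miss makes the next download more likely to succeed over
  GNUnet FS.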

  <paragraph|Binaries>What about binaries? Building from source can take
  plenty of time, memory and disk space in some cases (e.g.
  <samp|ungoogled-chromium>), so a user can choose to rely on (a set of)
  trusted substitute servers. Is there room for decentralisation here?
  Perhaps not in the build servers themselves as not all are necessarily
  trustworthy, but perhaps in the distribution of binaries?

  Guix already supports this, albeit on a fairly local scale: <samp|guix
  publish> can advertise itself on the local network with Avahi and
  <samp|guix substitute> can automatically use substitute servers on the
  local network, as long as the hashes of the substitutes from
  autodiscovered servers match hashes signed by a trusted build server.
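  The acceptance criterion boils down to a hash comparison: the content
  may come from an untrusted peer, but the <em|hash> comes from a trusted
  signer. A minimal sketch (using a hex-encoded SHA-256 for simplicity,
  whereas Guix actually uses Nix-style base32-encoded hashes):

  ```python
  import hashlib

  def acceptable_substitute(nar_bytes, signed_nar_hash):
      """Accept a nar from an untrusted, autodiscovered server only if
      its hash matches the hash signed by a trusted build server."""
      return hashlib.sha256(nar_bytes).hexdigest() == signed_nar_hash
  ```

  This separation of <em|authorisation> (signed hash) from
  <em|distribution> (whichever peer happens to have the bytes) is exactly
  what makes decentralised binary distribution safe.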

  Replacing \<#2018\>local network\<#2019\> with \<#2018\>global
  Internet\<#2019\> won't work: Internet-wide multicast (many-to-many,
  unfiltered) is unavailable there, and consulting each advertising
  machine in turn until a suitable substitute is found would be
  impractical. A different approach is required. Our approach is to
  download the narinfo as usual<\footnote>
    Removing the need for a (set of) central servers serving narinfos
    (e.g. by inserting the narinfos into a GNS zone or an FS namespace)
    could lessen the load on the build farm infrastructure further, but
    this is largely orthogonal to downloading substitutes in a P2P
    fashion.
  </footnote>, which contains the hash of the nar itself, but downloading
  the substitute via GNUnet FS when possible (as with downloading
  tarballs).
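  A narinfo is a simple text of <samp|Key: value> lines; the fields of
  interest here are <samp|NarHash> (for verifying whatever the P2P
  download returns) and <samp|URL> (the centralised fall-back). A minimal
  parser sketch (not the actual <samp|guix substitute> code):

  ```python
  def parse_narinfo(text):
      """Parse the 'Key: value' lines of a narinfo into a dict."""
      info = {}
      for line in text.splitlines():
          key, sep, value = line.partition(": ")
          if sep:
              info[key] = value
      return info
  ```

  Given the <samp|NarHash>, a client can attempt a P2P download first and
  verify the result against the signed hash before falling back to the
  <samp|URL> field.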

  Ideally speaking, the narinfo would include the GNUnet FS URI and the build
  farm would insert the substitute into GNUnet FS itself. However, it doesn't
  seem like this will happen in the near future, <small|and the author is
  impatient>, so in the meantime we need to work around not knowing the right
  GNUnet FS URI. But if GNUnet is popular enough, build farms can consider
  including the GNUnet FS URI, to skip the unreliable \<#2018\>hash
  conversion\<#2019\> process.
</body>

<\initial>
  <\collection>
    <associate|save-aux|false>
  </collection>
</initial>