<?php
/**
 * <https://y.st./>
 * Copyright © 2015 Alex Yst <mailto:copyright@y.st>
 * 
 * This program is free software: you can redistribute it and/or modify
 * it under the terms of the GNU General Public License as published by
 * the Free Software Foundation, either version 3 of the License, or
 * (at your option) any later version.
 * 
 * This program is distributed in the hope that it will be useful,
 * but WITHOUT ANY WARRANTY; without even the implied warranty of
 * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
 * GNU General Public License for more details.
 * 
 * You should have received a copy of the GNU General Public License
 * along with this program. If not, see <https://www.gnu.org./licenses/>.
**/

$xhtml = array(
	'title' => 'A flawed trust model',
	'body' => <<<END
<p>
	I tried setting up a hidden service on my mobile, but I could not figure out how to make it work.
	Orbot has an option to set up a hidden service.
	It says that it will automatically generate an onion address when doing so, then tell you what address it came up with.
	I figured that once it had done that, I could find the file containing the onion key and replace it with the key that my hidden service here uses.
	No dice.
	Orbot silently failed and did not generate an onion address.
	Next, I tried using the option that allows you to add configuration lines to the torrc file.
	Again, no dice.
	Like before, no error messages were output, but the <code>HiddenServiceDir</code> and <code>HiddenServicePort</code> lines seemed to have been ignored.
	I could not find any information on how to make it work on the Web, though <a href="https://wowana.me/">wowaname</a> said that it did not matter anyway; Orbot keeps the onion key in its monolithic Android application data file, so the key cannot be swapped out for the preferred one without special knowledge.
	I was hoping to be able to move the onion half of this website to my mobile so I wouldn&apos;t have to leave my laptop on at all times any more, but it looks like that is not going to work.
</p>
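<p>
	For reference, the two torrc lines in question would look something like the following sketch (the directory path here is a hypothetical example, not the path Orbot actually uses):
</p>

```
HiddenServiceDir /data/local/tor/hidden_service/
HiddenServicePort 80 127.0.0.1:8080
```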
<p>
	I have not finished reading all the email from my overly full inbox, but I did finish cleaning it out.
	I&apos;ve moved everything that still needs processing into a subdirectory so that I can actually see incoming mail.
	Most of the yet-to-be-processed mail is political.
	Everything else has been deleted, answered, and/or otherwise acted upon.
</p>
<p>
	It seems that Marc With A C has released a <a href="https://marcwithac.bandcamp.com/album/the-carpet-crawlers-from-nerdy-shows-call-of-cthulhu">new album</a>, albeit a small one.
	Josh Woodward seems to have cleaned up his website quite a bit, too.
	His website used to look professional on the surface, but if you navigated around a bit, it seemed to have multiple layers.
	It felt as if a newer version of the website had been partially built on top of an older one, which may well have been the case.
	Now, however, the website seems uniform and consistent.
</p>
<p>
	I found an issue with my bifurcated website setup.
	Due to the centralized security model used with $a[TLS], the certificates used by the two websites are considered somehow less trustworthy than neglecting to use any encryption at all.
	You can usually tell your Web browser to bypass the flawed trust model on a per-website basis, but that doesn&apos;t fully solve the problem with cross-site $a[CSS] of the kind I am using to make the links to the onion half of the website visible when the page is viewed over $a[Tor].
	The $a[CSS] file from the site you are viewing loads properly, but the $a[CSS] file from the other website is blocked unless both websites have been viewed separately and their certificates accepted by the viewer.
	On the clearnet website, this results in the links to the onion website remaining invisible even if the Web browser has been configured to use $a[Tor].
	On the onion website, this results in the website being mostly unstyled.
	This could be fixed by paying exorbitant fees to a &quot;certificate authority&quot;, but I refuse to do that as long as they continue to be as expensive as they are.
	For a basic certificate with wildcard subdomain support, it would cost about \$80 $a[USD] per year, and that would just be for the clearnet certificate.
	It&apos;s more difficult to get &quot;certificate authorities&quot; to issue certificates for onion domains, and from what I hear, they charge even higher rates to do so.
	Assuming <a href="https://letsencrypt.org/">Let&apos;s Encrypt</a> will issue certificates for onion-based websites, they may be the solution, though right now, their services are still in a limited beta.
</p>
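<p>
	The cross-site stylesheet setup described above amounts to two links in each page&apos;s head, something like the following sketch (the paths and the onion address are hypothetical placeholders):
</p>

```
<link rel="stylesheet" type="text/css" href="/css/main.css"/>
<link rel="stylesheet" type="text/css" href="https://example.onion/css/onion-links.css"/>
```

<p>
	If the browser can fetch the second stylesheet, it must be connecting over $a[Tor], so that stylesheet can safely reveal the onion links; when the certificate is rejected, the stylesheet is silently dropped and the links stay hidden.
</p>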
<p>
	I&apos;ve decided not to purchase any domains for the time being.
	I was going to wait until <code>//y.st.</code> came up for renewal so that the renewal dates would synchronize and I would have only one date to remember.
	However, I think it would be a better idea to just use my <code>//ystyst.mp.</code> domain for beginning my potentially-collaborative projects.
	It&apos;s a dumb domain for sure, but I&apos;d rather not purchase another domain that I&apos;m not even completely sure that I will get much use out of.
	The unfortunate side effect of this decision is that I&apos;ll have revealed the name of the group, Thorn, before making the domain purchase.
	The chance is slim, but someone may scoop up the domain before I am able to buy it myself.
</p>
<p>
	I&apos;ve finally uploaded this website&apos;s code to <a href="https://notabug.org/y.st.">NotABug.org</a>.
	Due to the privacy issues that caused this website&apos;s bifurcation, I had to make the repository private.
	If not logged in, the repository&apos;s webpage even returns a 404 error.
	This means that while this repository functions as a backup copy in case I lose the website due to hard drive failure again, I can only retrieve the backup if I have not lost my KeePassX database.
	Keeping the KeePassX database&apos;s local backup up to date is a lot easier than keeping the local backup of the website up to date, though.
	The website is much larger, so I back it up to an external hard drive.
	Setting up the hard drive is inconvenient, so I don&apos;t do it every day.
	On the other hand, I usually back my KeePassX database up to my mobile every day that I&apos;ve made a change to it, in addition to backing it up on the external hard drive when I back up the website and my personal files.
	For that reason, I usually have an up-to-date local backup of the KeePassX database but not an up-to-date backup of the website.
	A Git-based remote backup, though, is easy to update daily.
	I&apos;ve also removed the outdated copy of the website from GitHub.
</p>
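<p>
	As a sketch, a daily push of the sort described above could be automated with a cron entry along these lines (the repository path, remote name, and branch are hypothetical examples):
</p>

```
30 2 * * * cd /home/user/website && git add --all && git commit -m "daily backup" && git push origin master
```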
<p>
	My <a href="/a/canary.txt">canary</a> still sings the tune of freedom and transparency.
</p>
END
);
