<?php
/**
 * <https://y.st./>
 * Copyright © 2015 Alex Yst <mailto:copyright@y.st>
 * 
 * This program is free software: you can redistribute it and/or modify
 * it under the terms of the GNU General Public License as published by
 * the Free Software Foundation, either version 3 of the License, or
 * (at your option) any later version.
 * 
 * This program is distributed in the hope that it will be useful,
 * but WITHOUT ANY WARRANTY; without even the implied warranty of
 * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
 * GNU General Public License for more details.
 * 
 * You should have received a copy of the GNU General Public License
 * along with this program. If not, see <https://www.gnu.org./licenses/>.
**/

$xhtml = array(
	'<{title}>' => 'Difficulties in automating website compilation',
	'<{body}>' => <<<END
<p>
	I tried to rewrite my website&apos;s compile code to check for updates and recompile individual pages only when updates are present.
	However, consistency dictates that the code also check for the presence of pages that should be removed.
	There are at least four different places from which a page might be compiled, all of which produce equally valid pages.
	I can rule one method out by giving the &quot;/*/weblog/&quot; directory special treatment, but that still leaves three sources of pages.
	This doesn&apos;t even take into account that each of the four compile methods will individually need to be refactored to check for updates.
	This is too big a job to deal with right now; I&apos;ll have to come back to it later.
</p>
<p>
	My <a href="/a/canary.txt">canary</a> still sings the tune of freedom and transparency.
</p>
END
);
