I’ve been maintaining several blogs for about 10 years, and I’ve come up with a few practices that guide how I keep them going.
Hand-coded HTML to WordPress to Pelican
When I first started making websites, I did it the old-fashioned way: one HTML tag at a time. Of course, I was young and green, and besides some brief encounters with Dreamweaver and the Homestead Site Builder (yuck!), this was about all I cared to do as far as making websites went. Automating the process simply hadn’t occurred to me (as I said - young!). But then I found WordPress, and suddenly I had oodles of time to spend writing content!
WordPress was super at first. But then the comment spam started, and soon I was spending at least 30 minutes a week caring for my site. That sounds like nothing, but for websites where I just wanted to post pretty pictures and review local restaurants, it added up to over a day a year spent keeping the thing running instead of writing or doing anything else.
I don’t remember who introduced the idea to me, or exactly when (my git commit logs say around 2013), but eventually I decided I’d had enough of the maintenance grind and looked for alternatives. The idea of returning to static sites appealed to me (for reasons I’ll get into in a bit), and after a bit of searching I settled on Pelican.
Why static websites?
Here’s the secret: it doesn’t have to be Pelican, or any particular static site generator. Ultimately you want something that takes plaintext markup, runs it through a few templates to generate the HTML, and you’re done.
No database or PHP to worry about, nothing to keep on top of except your HTTP server. Whole classes of vulnerabilities simply don’t exist when it’s just static pages.
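The whole idea is small enough to sketch in a few lines. This toy example is mine, not Pelican’s internals: it uses Python’s built-in string.Template in place of a real template engine and skips markup parsing entirely, but the core transformation is the same - plaintext content goes in, a static HTML page comes out.

```python
from string import Template

# One template shared by every page. Real generators use a proper
# template engine (Jinja2, in Pelican's case) and parse reST/Markdown
# into the body, but the shape of the pipeline is the same.
PAGE = Template(
    "<html><head><title>$title</title></head>"
    "<body><h1>$title</h1>\n$body\n</body></html>"
)

def render(title, body):
    """Fill the shared template with one page's title and body HTML."""
    return PAGE.substitute(title=title, body=body)
```

From there, a generator is just a loop: render every source file and write the results into an output directory your HTTP server points at.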
Moving to a new provider is only a matter of updating your deployment script. This came in handy when I decided to leave GoDaddy many years ago because of their SOPA support.
I started using Pelican when I was heavy into Python work, so it was a low-barrier way to jump into static sites. I’ve also used Octopress and Middleman since I started, but over time I’ve standardized on Pelican out of simplicity (i.e. I can copy configurations from one site to another).
I’ve come to prefer the reStructuredText format over Markdown. It’s a little more complex, but also more predictable output-wise, and the formatting is a bit easier for me to read.
I use virtualenv to make building the deployment environment easier, and use Fabric for doing the deployment itself (which is essentially a pelican run followed by rsync). This keeps the whole process to something simple that I can run on a cruddy little netbook with a crappy network connection.
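Stripped of the Fabric plumbing, the deploy step boils down to two shell commands run in order. A sketch of that pipeline (the output directory, settings file, and remote host here are placeholders, not my real paths):

```python
# What "a pelican run followed by rsync" amounts to: build the site,
# then mirror the build output to the host. Returned as command
# strings so the two steps are easy to see side by side.
def deploy_commands(settings="pelicanconf.py", output="output",
                    remote="user@example.com:/var/www/site/"):
    """Return the build and upload commands the deploy runs, in order."""
    return [
        f"pelican content -o {output} -s {settings}",
        f"rsync -avz --delete {output}/ {remote}",
    ]
```

Fabric (or a plain shell script) just executes these one after the other; rsync’s --delete flag keeps the remote copy an exact mirror of whatever the latest build produced.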
The sites are kept in git, so it’s easy to keep my work synchronized across machines. From a fresh git clone, I can have my site ready to upload to a host in a few minutes. In time I’ll upload the sites to GitHub (still some loose ends to clear up).
Making things fast
Speed is important. If a page doesn’t load in about a second, people will close the tab. I don’t use a dialup connection anymore, but I’ve been on public wifi in spots where the speed is comparable, and pages that are slow out of needless inefficiency are a bummer. I use WebPageTest to look for bottlenecks and other quick wins, and try to keep up with guidelines like Yahoo!’s and Google’s performance best practices. These things change over time, but checking every two or three years is fine.
Making things secure
Making sites secure has never been easier. I use Let’s Encrypt for my TLS certificates, and Mozilla Observatory to make sure I’m on top of the necessary HTTP headers, configuration, etc. This takes a bit of time to get right (a few evenings for me), but again, for the most part you’re keeping an eye out for changes once a year or so.
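For a sense of what “the necessary HTTP headers” means in practice, Observatory mostly nudges you toward a handful of response headers. An illustrative nginx fragment - the directives are real, but the policy values are examples, not a recommendation for your site:

```nginx
# Illustrative security headers; tune the values for your own site.
add_header Strict-Transport-Security "max-age=63072000" always;
add_header X-Content-Type-Options "nosniff" always;
add_header X-Frame-Options "DENY" always;
add_header Content-Security-Policy "default-src 'self'" always;
```

The Content-Security-Policy line is the one that takes those few evenings to get right; the others rarely need adjusting once set.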