My Jekyll Setup
I'm not doing anything revolutionary with Jekyll: a couple of plugins for the sitemap and category pages, a simple script for the atom feed, and a bash script for deploying to Amazon S3. I aim to keep things simple; after all, that's much of the point.
For my reference, and for anyone else coming here looking to set up Jekyll, here's what I needed to get going (on Ubuntu 10.10):
```shell
# ruby, rubygems and jekyll itself
sudo apt-get install ruby ruby-dev rubygems
sudo gem install jekyll

# we need this for syntax highlighting
sudo apt-get install python-pygments

# I'll also be using Juicer to minify css and js (well, there's actually no js as yet)
sudo apt-get install libxslt-dev libxml2-dev
sudo gem install juicer
juicer install yui_compressor
juicer install jslint
```
Setting up the blog files
Absolutely nothing new here: the regular `_layouts`, `_plugins` and `_posts` folders, with the files you'd expect after reading the guide.
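If it helps to visualise, the skeleton looks roughly like this. This is a hypothetical minimal sketch, not my exact tree; the folder names are Jekyll's standard conventions, and `static/css` matches the paths my build scripts use:

```shell
# minimal Jekyll skeleton (run in an empty directory)
mkdir -p _layouts _plugins _posts static/css
touch _config.yml index.html atom.xml
touch static/css/master.css
```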
There's an atom.xml in the root that I stole from here (though I'm sure you'll find it all over the place).

For plugins I'm using just two: the generate_categories plugin, from here, and the sitemap_generator plugin, from here.
I'm using s3cmd to sync my blog to S3 (see my post on using s3cmd for backups). I have two bash scripts: one to run the Jekyll server locally (for previewing the site), and one to deploy live. They look like this:
```shell
#!/bin/bash

echo ==================
echo Minify css
echo ==================
juicer merge -i --force -o static/css/min.css static/css/master.css

echo ==================
echo Start jekyll server
echo ==================
jekyll --server

exit 0
```
```shell
#!/bin/bash

echo ==================
echo Minify css
echo ==================
juicer merge -i --force -o static/css/min.css static/css/master.css

echo ==================
echo Building site
echo ==================
jekyll

echo ==================
echo Syncing to S3
echo ==================
s3cmd sync --progress -M --acl-public _site/ s3://myblogwebsitebucket/ \
    --exclude '*.sh' --exclude 'static/*'
s3cmd sync --progress -M --acl-public \
    --add-header 'Cache-Control: max-age=31449600' \
    _site/static/ s3://mystaticfilesbucket/

echo ==================
echo Backing up source
echo ==================
s3cmd sync --progress ./ s3://mybackupbucket/ --exclude '_site/*'

exit 0
```
One thing perhaps left to do is gzipping files before deploying, setting a `Content-Encoding: gzip` header on the S3 objects. The downside is that this excludes user agents that don't accept gzipped content. But who, or what (in the case of bots and such), would one actually be excluding by doing this? Thoughts?
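If I do go that route, I imagine it would look something like the following. This is an untested sketch, not part of my current deploy script; the bucket name and key are placeholders, and it assumes s3cmd's `put` command with the same `--add-header` option the deploy script already uses for `sync`:

```shell
# Hypothetical: pre-compress the stylesheet and upload it under its
# original key, marked as gzip-encoded so browsers decompress it.
gzip -9 -c _site/static/css/min.css > /tmp/min.css.gz
s3cmd put --acl-public \
    --add-header 'Content-Encoding: gzip' \
    --add-header 'Cache-Control: max-age=31449600' \
    /tmp/min.css.gz s3://mystaticfilesbucket/css/min.css
```

Any client that sends `Accept-Encoding: gzip` (which is virtually every browser) would be fine; the open question above is whether the remainder matters.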
Now, to run ./deploy.sh and go to bed...