Why learn something new, when you could learn everything new? Titanic is a hobby/learning-project for me: a simple web app and corresponding Linux command line tool designed to keep host name aliases and Bash shortcuts synchronised across all of my Linux devices and servers.
The catch? I decided to build the tool from entirely new-to-me technologies, giving myself a "jump in at the deep end" learning opportunity.
There are perhaps a dozen Linux-powered devices that I interact with on a regular basis: my laptop, desktop and smartphone, a few RPis, a handful of servers, etc. They often need to talk to each other, so I use host name aliases (Chuck character names, actually) instead of remembering a list of IP addresses. This means maintaining an
/etc/hosts file on each machine, and updating all of them every time a new device enters the network -- such a hassle, right?
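For illustration only - these names and addresses are made up - a synchronised /etc/hosts might carry entries like:

```
# Hypothetical /etc/hosts entries of the kind Titanic keeps in sync.
192.168.1.10   casey
192.168.1.11   morgan
192.168.1.12   sarah
```

With entries like these on every machine, `ssh morgan` works the same everywhere, regardless of which box you're sitting at.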
A similar problem arises with Bash shortcuts. I have a collection of shortcuts on my machines, ranging from aliases for common typos (
git puhs, anyone?) to one-command reductions of common tasks. Maintaining these by hand across every device would be impractical, and reaching for a shortcut that doesn't exist on the current device yet is frustrating.
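The shortcuts in question look something like this - these particular aliases are examples of the kind of thing I keep, not my actual list:

```shell
# Illustrative Bash shortcuts of the sort Titanic would manage.

# Catch the classic typo:
alias puhs='git push'

# Shorten a very common command:
alias gs='git status'

# Reduce a common task to one word:
alias serve='python3 -m http.server 8080'
```

Each device gets its assigned subset of shortcuts sourced into its shell, so the same muscle memory works everywhere.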
Enter Titanic: an over-engineered answer to a low-key problem, aptly named because it syncs everything.
Jumping in at the Deep End
I've always learned best by doing: I could watch videos or read books on a new language - and I sometimes do - but nothing beats getting my hands dirty, building things, and breaking things. That's what I embraced with Titanic: I started it at a hackathon, with the objective of using solely new-to-me technologies.
All in all, the project has exposed me to eight new technologies so far: Node.js, Express.js, MongoDB & Mongoose, Jade, CoffeeScript, SCSS,
GitLab (although I eventually switched to GitHub), and Bash scripting. I made only a few small concessions: I stuck with Git for version control, I used good ol' Bootstrap, I kept jQuery, and I didn't swap out my IDE.
The first few hours followed a very steep learning curve (which I'm sure made for great entertainment for the teams nearby!), but after that things smoothed out and I started picking up new concepts across the board at a rate I was very happy with.
The Code: Web App
The web side of the project is straightforward: a relatively simple interface to view and manage the entities and relationships controlled by the system. That includes devices, the host name aliases that connect them, Bash shortcuts, and their device assignments.
Really, the front end is nothing more than a few related models and their CRUD actions. The fun part was the journey that led to its construction. I started with a blank Ubuntu VPS, then got down to installing Node.js, NPM, PM2, Nginx, etc. Once I had everything running smoothly I used another server to install GitLab - a bit of setup later, I had a blank repository and was ready to start building!
Having decided to use Express.js as the powerhouse with Jade and SCSS to build the front-end, I set about knocking it together. I've never really been much of a designer, and it's not the family of skills this project was targeting, so I stuck to Bootstrap and jQuery, both of which I already had plenty of experience with.
Using MongoDB and Mongoose, the models and data-access layer came together quite quickly, and from there it was just a matter of setting up the appropriate views and controllers to orchestrate everything. I wrote all of the front-end scripting in CoffeeScript too, the same language I was already using for my back-end Node.js code.
The Code: API
The API is exactly what you'd expect it to be: a bog-standard RESTful HTTP API that exposes the entities and relationships managed by the web app. It does nothing particularly exciting - it exists solely as a data source for the command line tool.
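To sketch the shape of it - the base URL and route names below are my assumptions for illustration, not documented routes - the CLI would consume resources with plain GETs:

```shell
# Hypothetical resource layout for the Titanic API.
# The host and paths here are illustrative assumptions.
API="https://titanic.example.com/api"

devices_url="$API/devices"                     # GET -> all registered devices
aliases_url="$API/devices/morgan/aliases"      # GET -> host aliases assigned to one device
shortcuts_url="$API/devices/morgan/shortcuts"  # GET -> Bash shortcuts assigned to one device

# A consumer such as the CLI would then simply:
#   curl -s "$aliases_url"
```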
The Code: Command Line Tool
The command line tool is the last piece of the puzzle that needs to be built. It's a skeleton of a program right now, but it's in my pipeline for development over the next few months (unfortunately it takes a backseat to my real work). Once finished, the tool will be blissfully simple to use: a single command of
titanic sync should update the machine with its specific set of host name aliases and Bash shortcuts.
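To give a flavour of the hosts half of that job, here's a rough sketch of the kind of merge the tool could perform. The marker format and the merge strategy are my assumptions, not the tool's actual design:

```shell
#!/usr/bin/env bash
# Sketch: keep a Titanic-managed block inside a hosts file, leaving
# everything outside the markers untouched. Markers and strategy are
# illustrative assumptions, not the real tool's behaviour.
set -euo pipefail

BEGIN="# BEGIN titanic"
END="# END titanic"

# Replace the managed block in the given file with fresh entries from stdin.
sync_hosts() {
  local file="$1" tmp
  tmp="$(mktemp)"
  # Keep everything outside the managed markers...
  sed "/^${BEGIN}$/,/^${END}$/d" "$file" > "$tmp"
  # ...then append an up-to-date managed block.
  { echo "$BEGIN"; cat; echo "$END"; } >> "$tmp"
  mv "$tmp" "$file"
}
```

The real tool would fetch this device's entries from the API and pipe them in, along the lines of `curl -s "$aliases_url" | sync_hosts /etc/hosts`; re-running it simply replaces the managed block, so the operation stays idempotent.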
First, I need to finish the basic tool. But after that, I have a few ideas:
- I'd like to see what else I can synchronise across my machines. The current feature set offers a lot of benefits to me, but I'm sure there's more I could unify. Settings perhaps? Users, even?
- I'd also like to launch this as a publicly-accessible service. Right now it manages my devices and settings, but has no concept of different users managing their data. Adding this to the tool would take time, but it would mean that other people could use it to keep a family of Linux machines in check!