After last week’s post, Making Wiki, you may have a TWiki installation on your computer. But what if you have more than one computer and you want to use the wiki on all of them?
This is something I am still working on, especially since one of my machines has lost its wiki install after an OS upgrade.
Let’s look at how you access the wiki: through a web browser, pointing at localhost. “localhost” is the shortcut address for “the machine this application is running on.”
The most reliable way of sharing the wiki between machines is to have one machine host the wiki, then others on the network access it by entering the address of the hosting machine into a web browser.
So if I had a machine ithaqua hosting the wiki, on another machine I would navigate to “http://ithaqua/twiki/bin/view/Main”. Or, if I don’t have name resolution set up, I might need to enter ithaqua’s IP address instead.
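If you need the IP address, the hosting machine can report it itself. A minimal sketch, assuming a Linux host (the example address in the comment is made up):

```shell
# Assuming a Linux host: print the address(es) other machines on the
# local network can use to reach this machine when name resolution
# isn't set up.
hostname -I

# From another machine, you would then browse (or curl) something like:
#   http://192.168.1.20/twiki/bin/view/Main   (example address only)
```

`hostname -I` prints all configured addresses; on a machine with several interfaces you may need to pick the one on the shared network.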
This method only works if the accessing machine can reach the hosting machine, which is only likely to be true when they are on the same network*. The accessing machine also needs network access whenever it reads the wiki. Since much of my wiki reading happens on the bus, where I have no network access, I don’t use this solution.
The second way to share the wiki is to share the wiki data files.
Two approaches are viable here: putting the files into a revision control system, and using a cloud drive to back up the files.
Revision control systems are tools for keeping a history of a group of files. They are used extensively in software development, but they are relevant to any task where many files change frequently. There are many such tools, but the preferred one for this application is git**.
TWiki has a plugin for storing its pages in git. Using git is preferable to other systems because it automates so much of the merge drama.
git needs a repository to hold the page store, which could be either on the host machine or on a separate server somewhere.
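The shared-repository setup would look something like the sketch below. The paths and the commit are placeholders, and it runs against temporary directories so it is safe to try; in practice the bare repository would live on the host machine or server, and the TWiki git plugin would be pointed at the working copy.

```shell
# Hypothetical sketch: a bare repository that several machines share.
# Paths are temp dirs here; in real use the bare repo would sit on the
# host machine or a server reachable from every machine.
REPO=$(mktemp -d)/twiki-pages.git
git init --bare "$REPO"

# On each machine, clone the shared repository into a working copy.
WORK=$(mktemp -d)
git clone "$REPO" "$WORK/pages"
cd "$WORK/pages"
git config user.email "you@example.com"   # placeholder identity
git config user.name "Wiki User"

# Save a page edit and push it so the other machines can pull it.
echo "welcome" > WebHome.txt
git add WebHome.txt
git commit -m "Edit WebHome"
git push origin HEAD
```

Each machine then pulls before editing and pushes afterwards; git handles merging the page files when edits overlap.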
At this point I would usually launch into instructions for this approach, but I have had very poor luck getting it to work. The problem seems to be matching the git version installed on one machine with the version installed on another. If you can get it working, it should be largely seamless, and it would allow editing offline.
What I think is the most likely approach is to modify the wiki config to store its data on a cloud drive, such as Dropbox or Google Drive. In this case I suspect there are some permissions issues to address (i.e., the web server must be given write access to the user’s data directory), but this seems likely to be easier to set up than git.
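One way this could work is to move the wiki’s data directory into the synced folder and symlink it back, so TWiki’s configured paths don’t change. This is a sketch under assumptions, not tested against a live install: the paths are stand-ins (it runs in temp dirs so it is safe to try), and the `www-data` user name varies by distribution.

```shell
# Hypothetical sketch: relocate TWiki's data directory into a
# cloud-synced folder via a symlink. Temp dirs stand in for the real
# paths (e.g. /var/www/twiki and ~/Dropbox) so this is safe to run.
TWIKI_ROOT=$(mktemp -d)        # stand-in for the TWiki install dir
CLOUD=$(mktemp -d)             # stand-in for the cloud drive folder

# 1. Move the live data into the cloud folder.
mkdir -p "$TWIKI_ROOT/data"
echo "page content" > "$TWIKI_ROOT/data/WebHome.txt"
mv "$TWIKI_ROOT/data" "$CLOUD/twiki-data"

# 2. Symlink it back so the wiki's configured paths keep working.
ln -s "$CLOUD/twiki-data" "$TWIKI_ROOT/data"

# 3. The web server user (often www-data) needs write access, e.g.:
#    sudo chown -R www-data "$CLOUD/twiki-data"   # needs root; not run here

# The wiki still sees its pages through the symlink.
cat "$TWIKI_ROOT/data/WebHome.txt"
```

The cloud client then syncs the real directory between machines while the web server keeps reading and writing through the link. Whether a given sync client follows symlinks reliably is worth checking first.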
However, this portion is still a work in progress since, as I mentioned at the top, I only have one machine with a working TWiki install on it, and that’s the one without most of the data (the data is safe).
Which Is Best?
Each approach has its pros and cons.
Publishing the TWiki from one central machine using the web server is the best option for ensuring wiki integrity. Indeed, with more than one user I would say it’s the only viable option, because it’s the only way to be sure editing conflicts are handled. But this approach utterly fails when you consider network visibility and offline editing.
My heart tells me to use the git approach, especially since I have the benefit of a public server I can use as a repository, but I have not been able to set it up successfully on either machine, so I can’t recommend it.
That leaves the cloud drive, which won’t handle merge conflicts as well as git would, but has the benefit of using network storage that is already configured.
Now I just have to get both wikis working at the same time!
[*] This is less of a concern if the host is on the public internet, although that introduces many other challenges.
[**] I admire Linus Torvalds immensely, but I really dislike this name.