I have been experimenting with Quickly recently. My plan was to develop an ebook packaging tool that would allow users to package ebooks for Software Centre. I’ve been sidetracked from that development as I had an idea – could I use Quickly to create an ebook template?
The theory goes that yes – it should be possible to create an ebook template for Quickly that authors could use to test their books, and then upload their sources to Launchpad. This ties into my idea that content should be as revered as code.
Here’s where I’ve got to so far…
A new ebook can be created using the command:
quickly create ubuntu-ebook testbook
This creates a structure to store the ebook content.
I have been working on the quickly run command – which currently errors out – and I have written a packaging script.
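For the curious, the first job of that packaging script is simply bundling the book sources into a Debian-style orig tarball. A minimal sketch (the names and layout here are my placeholders, not the final template):

```python
import os
import tarfile

def make_orig_tarball(book_dir, name, version, out_dir="."):
    """Bundle the ebook sources into <name>_<version>.orig.tar.gz,
    the naming convention Debian source packages expect."""
    tarball = os.path.join(out_dir, f"{name}_{version}.orig.tar.gz")
    with tarfile.open(tarball, "w:gz") as tar:
        # Store everything under a single top-level directory,
        # as debhelper tooling expects.
        tar.add(book_dir, arcname=f"{name}-{version}")
    return tarball
```

From there dh_make and the usual debhelper tooling take over, exactly as they would for an application package.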
Anyway, I had a go at following Didier Roche’s blog post instructions, but I think they’re somewhat out of date.
I’ll keep plugging away at it, and update when I can.
Today, I listened with interest to FLOSS Weekly episode 207 – Aaron Newcomb and Dan Lynch were interviewing Denis Defreyne about his project nanoc. I started thinking – about two things, actually (which for me is quite good).
- Compiling static HTML from markup? That sounds a lot like dexy – I wonder if the two could be used together to document, for example, an API, and include the results (via dexy), automatically uploading that data through file sync to a web host?
- I wonder if you could use that to compile things other than websites?
Aaron also seemed to have the same idea, and Denis mentioned that he had used nanoc to compile some C programs, and had heard of some example where files were being compiled to PDF.
But that only stirred more questions. I meandered to the coffee machine and ordered a plastic cup of fresh-brewed coffee. I had been wondering if nanoc could be used to package a manuscript into an EPUB file, and I believe it might be possible – the question became one of whether it would add any benefit.
I’m going to put nanoc to one side for the moment, and introduce something new to this blog post – the “novel publishing process”.
Regular readers may notice that the tag cloud for this website shows quite a large preoccupation with content packaging – I’ll admit it: I’m passionate about content packaging, and have been working (for a little while now) on a pipeline, or process, to assist the publishers of ebooks. This pipeline basically describes tools that can be used to fulfil some of the requirements of authors, and at the end I’ll examine whether there is a place for nanoc within this process.
It goes without saying – when it comes to being creative on a computer, you’re going to need some form of version control. During the 2011 NaNoWriMo event, Fab (of Linux Outlaws fame) used Gitorious to store his manuscript as he was working on it.
Git, Bzr, SVN – I’m not going to pick a favourite. All I would suggest is that you play with the version control systems available out there, get used to them, and then pick one and use it. Equally important is backups. The internet is full of cloudy backup stores – from Dropbox to Ubuntu One – so think about how you would recover your novel in the event of an emergency.
Here’s where the pipeline gets interesting. You could just start writing a document using gedit – or Emacs, or whatever your favourite text editor is – however, I would suggest that something that lets you keep track of your chapters, characters and so on is important to the aspiring author, and I think I’ve found one.
Storybook is an open source, Java-based environment for managing your novel. You can write up all your character bios, add research notes, Google Maps locations, images – whatever supports you as a writer. It has a timeline, so you can plot your character states over time, and it even lets you track which characters are dead and which are alive. The data is stored in an SQLite database file, so your data is never trapped there.
Layout and design
Once you have finished your masterpiece, you need to take the text and any other assets and put them into Sigil.
Publishing will produce an EPUB file – this can now be tested on an ereader of your choice. Ubuntu offers an ereader product (other e-reader products are available).
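As an aside, an EPUB file is really just a ZIP archive with a strict layout: an uncompressed mimetype entry first, then the container and content files. If you wanted to script the packaging step yourself (rather than letting Sigil do it), a minimal sketch might look like this:

```python
import os
import zipfile

def build_epub(content_dir, epub_path):
    """Zip content_dir into an EPUB. The EPUB spec requires the
    mimetype entry to come first and to be stored uncompressed."""
    with zipfile.ZipFile(epub_path, "w") as z:
        z.writestr(zipfile.ZipInfo("mimetype"),
                   "application/epub+zip",
                   compress_type=zipfile.ZIP_STORED)
        # Everything else (META-INF/, OEBPS/ etc.) can be deflated.
        for root, _dirs, files in os.walk(content_dir):
            for fname in files:
                full = os.path.join(root, fname)
                arcname = os.path.relpath(full, content_dir)
                z.write(full, arcname,
                        compress_type=zipfile.ZIP_DEFLATED)
    return epub_path
```

This is only a sketch of the container format – a real book still needs a valid container.xml and OPF package document inside, which is exactly what Sigil produces for you.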
Amazon do make a Kindle Previewer application available for Windows and Macintosh users – I have managed to get it working under Wine. The following hints should help:
- Download the Amazon Windows installer from here
- Run the installer under wine
- Set the default Windows version to Windows XP
- It may complain that SWT has caused an issue – if so, install the Visual C++ 2008 runtime from the terminal: winetricks vcrun2008
- It may crash when loading an EPUB file – if so, try deleting the ~/.wine/drive_c/windows/winsxs/manifests/*.vc90.*_deadbeef.manifest files (based on a post on the WineHQ forum) – this may be because my Wine session was set to Windows 7
Today Google launched their rebranded Android Market as Google Play. This is interesting for me as I have an Android phone and had wondered what this new Play malarky was all about. According to The Register, the new store puts applications fourth on the list, below music, books and movies.
Back when I did the OggCamp content packaging presentation, I tried to highlight the advantages that packaging your content would provide to both consumers and producers, and it looks like Google have started to embrace it. What is equally interesting is the dropping of Android from the store name. Could we see a generic Google store selling content soon? After all, Android applications are essentially Java applications, and it doesn’t seem beyond the realms of possibility that the virtual machine could be made to work on Linux…
Given Ubuntu for Android and Ubuntu TV, I think we seriously need to look again at packaging content in addition to applications if Ubuntu wants to compete with Google here.
Back at OggCamp, I gave a brief presentation about what I saw as the dangers of storing content with cloud providers. My arguments were based on the fact that your provider may go out of business, putting your content at risk.
And now we have the Megaupload case. If you used cyberlocker services such as Megaupload, how safe should you be feeling right now?
The safest way is to host your content somewhere that you control, right? I would suggest that if you want to be more sure of safely storing your content, you consider the possibility of storing it in more than one place. What I would like to investigate is some publishing mechanism that automatically pushes updates to sites.
Imagine you have written a cool book – you host your content in Launchpad and an Ubuntu package is built whenever you update. But perhaps you also want to make it available on an FTP site. What if Launchpad could push your content to an FTP address once complete? I plan to investigate this possibility.
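Launchpad has no such push hook that I know of, so for now the pushing would have to happen at our end. A sketch of what that could look like using Python’s ftplib – the host, credentials and paths here are all placeholders:

```python
import os
from ftplib import FTP

def remote_name(local_path):
    """The name the artefact will have on the mirror."""
    return os.path.basename(local_path)

def push_build(local_path, host, user, password, remote_dir="/uploads"):
    """Upload a finished build artefact (e.g. a .deb) to an FTP mirror.
    Host, credentials and remote_dir are placeholders for illustration."""
    with FTP(host) as ftp:
        ftp.login(user, password)
        ftp.cwd(remote_dir)
        with open(local_path, "rb") as fh:
            ftp.storbinary(f"STOR {remote_name(local_path)}", fh)
```

Something like this could run from a cron job or a script that watches the PPA for new builds.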
When Launchpad builds successfully, a binary .deb file is created. What I want to do is write something that will allow the server to pull a specific file out of a .deb file.
I have a PPA which I have used to store the content of the Linux Reality podcast – you can find it here.
So – let’s take episode 100 as an example:
You can find the Deb file here.
What I am working on is a plugin which allows the user to add a link to a page or post like so:
Download something from a debian file file = something...deb = something else
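Under the hood, a .deb is an ar archive containing control.tar.gz and data.tar.gz. Python’s standard library has no ar reader, but the format is simple enough to parse by hand, so the server-side extraction could be sketched like this (newer packages may carry data.tar.xz instead, which tarfile handles transparently):

```python
import io
import tarfile

def ar_members(path):
    """Iterate (name, data) pairs from an ar archive such as a .deb."""
    with open(path, "rb") as f:
        assert f.read(8) == b"!<arch>\n", "not an ar archive"
        while True:
            header = f.read(60)
            if len(header) < 60:
                break
            name = header[0:16].decode("ascii").strip().rstrip("/")
            size = int(header[48:58])      # decimal size field
            data = f.read(size)
            if size % 2:                   # members are 2-byte aligned
                f.read(1)
            yield name, data

def extract_from_deb(deb_path, wanted):
    """Pull a single file (e.g. an .ogg) out of a .deb's data tarball."""
    for name, data in ar_members(deb_path):
        if name.startswith("data.tar"):
            with tarfile.open(fileobj=io.BytesIO(data)) as tar:
                try:
                    member = tar.extractfile(wanted)
                except KeyError:
                    return None
                return member.read() if member else None
    return None
```

On a server with dpkg available, `dpkg-deb --fsys-tarfile` does the same job; the pure-Python version just avoids that dependency.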
I have a radio podcast out on HPR.
HPR0855: Packaging for your distro,
I recently spoke to Gordon Sinclair on Hacker Public Radio Episode 811 speaking about his idea for a creative commons tracker site.
Gordon – or @thistleweb – wants to create a moderated tracker. The CCTracker contains moderated torrent information. Because these torrents are moderated, they can be downloaded by users safe in the knowledge that the content can be downloaded legally, that it truly is the content it claims to be, and that it doesn’t contain any trojans or other malicious ‘stuff’.
I think this is a great idea – my opinion is that the term ‘torrent’ has been tarred with the brush of illegal downloading, and that CCTracker offers an opportunity to ‘take it back’.
Here’s what I wrote :
Hi Gordon, hope I’ve got your email right… I really enjoyed your interview with Klaatu – I thought I’d share my thoughts with you about what I think regarding the importance of content. Firstly, can I say that I think your idea to provide certified “safe” torrents is brilliant. I think that torrents have suffered (as a technology) from the bad press that the illegal use of the technology has generated. I think this is an opportunity to bring it back, and make it a force for good again. I’m an Ubuntu user – and I’m writing a series of articles for my blog which discuss an idea that I’m trying to champion: packaging content for inclusion within the distribution. It seems to me that certified torrents (safe torrents) could be a fantastic solution for larger files.
When we download a distro, we get access to the repositories, and we get all kinds of choice. Ubuntu supplies an element of infrastructure around applications in the form of the repositories, but when you want to read an ebook, or listen to some music, you’re pretty much out on your own. The (as yet unnamed) CCTracker provides a mechanism by which distros can start to offer content alongside applications. I think this is an opportunity for distros to compete not only in terms of applications, but in terms of content – and when you see how popular tablets are, supplying content could make one distro start to look better than another.
Imagine a Mythbuntu distro that also has video content – that content could be supplied through Software Centre (but actually delivered by a CCTracker torrent). As regards machines hosting Canonical torrents, could this be accomplished using Amazon’s compute cloud? Could we use the power of BOINC to create a distributed torrent stream?
Here’s what Gordon replied…
Hi Mike, yeah you got the email right. I should have thought of the show notes for the episode but forgot, so they were hastily drawn up at the last minute, leaving out my email and the tracker info page. Thanks for the feedback. The idea of being able to use the tracker as an efficient backend for content is a step further than I’d thought of, but I like it. This is why a community of people is better than one individual. Both cobra2 and myself thought it’d be great to be able to subscribe to a torrent feed for new episodes, instead of a direct download feed. That was as far as we got on that. I like the idea of having an iTunes (for want of a better name) for free & legal content with the tracker as the delivery mechanism, with a few alterations. I want everything originating from this project to be as non-discriminatory as possible, as well as being standards based, with a policy of not reinventing the wheel.
The Ubuntu Software Centre is pretty decent, but it’s done in a way that’s not exactly easy to port to other distros, especially those not based on Ubuntu or Debian. It’s also a standalone application.
I think a standalone iTunes type of application may be overkill, although I’m open to being convinced. My initial instincts are that it’d be better as APIs allowing existing projects like Miro to add that functionality in. That way people can use the applications they like using, and get that functionality. This also allows devs on projects for Windows and OSX to make the content available to their users.
This is just my initial thoughts on an excellent suggestion. It’s something for later on. The plan is to hold the naming suggestions open for about one month after the HPR interview was released, narrow down and choose names. At that point we move on to getting domain names and setting up the site, including forums. I’d suggest at that point you could join the forums and lay out your suggestions for the others to comment on, build upon and refine.
I want to break down some of Gordon’s points when it comes to content packaging…
- Software Centre is not easy to port to other distros.
Software Centre is just a front end to the packaging system provided by the distribution, so if you were running a Fedora-based distro, the alternative is PackageKit – the point is that there isn’t a standard iTunes-style application across the distributions.
- Standalone iTunes may be overkill
I agree – if we’re talking about writing a new iTunes application then this is indeed overkill, plus I think it dilutes the advantage that an iTunes-style application provides.
- My initial instincts are that it’d be better as APIs
The packaging systems for the distributions already have an API – but I think we could write an abstraction layer, allowing Miro or other projects to install content regardless of the packaging back end (APT / RPM).
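To make that concrete, here’s a rough sketch of such an abstraction layer. The backend command names are the standard ones; everything else here is hypothetical:

```python
import shutil
import subprocess

# Map each packaging backend to its install command template.
BACKENDS = {
    "apt-get": ["apt-get", "install", "-y"],   # Debian / Ubuntu
    "dnf":     ["dnf", "install", "-y"],       # Fedora (RPM)
    "zypper":  ["zypper", "install", "-y"],    # openSUSE (RPM)
}

def install_command(package, which=shutil.which):
    """Return the install command for whichever backend is present."""
    for tool, template in BACKENDS.items():
        if which(tool):
            return template + [package]
    raise RuntimeError("no known packaging backend found")

def install_content(package):
    """What a front end like Miro could call to install a content package."""
    subprocess.check_call(install_command(package))
```

A project like Miro would only ever call install_content, and the same call would work whether the user is on Ubuntu or Fedora.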
I’m looking forward to seeing what the CCTracker project accomplishes.
The Linux Reality podcast produced a little over 100 episodes during its two-year existence. The podcast featured content covering topics as diverse as servers and desktops, and everything in between.
The episodes exist on Archive.org, but the plan for Whobuntu is to create a package of linux reality content with descriptions covering the content.
Software Centre provides a mechanism by which this content can be found. If a user were to search in Software Centre for dnsmasq, they would be able to find the package – but, equally importantly, they would also be able to locate the episode of Linux Reality with a review and discussion of dnsmasq and some of the alternatives.
Using Software Centre you would be able to locate the package Gnumeric, but you would also be able to locate Episode 23 of LR, where Chess Griffin talked about Gnumeric.
How to package static content.
To start with, the Linux Reality content was downloaded from archive.org. To make this easy, a script was developed which automatically downloaded all the episodes via wget.
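The getlr script itself isn’t reproduced here, but it amounts to something like the following sketch – note that the archive.org URL pattern shown is an assumption for illustration, not the real layout:

```python
import subprocess

# NOTE: this URL pattern is an assumption for illustration; the real
# getlr script would use whatever layout archive.org actually serves.
BASE = "https://archive.org/download/linuxreality"

def episode_url(number):
    """Build the download URL for a given episode number."""
    return f"{BASE}/linuxreality{number:03d}.ogg"

def fetch_all(last_episode):
    """Fetch every episode with wget; -nc skips files already present,
    so the script can be re-run safely."""
    for n in range(1, last_episode + 1):
        subprocess.check_call(["wget", "-nc", episode_url(n)])
```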
The process to then package a podcast is as follows:
- Execute the getlr script to retrieve all episodes of Linux reality.
- Create a tarball of all the episodes by executing tar -czvf linuxreality_0.1.tar.gz *.ogg (the -z flag is needed so the .tar.gz really is gzip-compressed)
This gave me a tar.gz file containing the episodes. I then created a linuxreality folder, named to match the tarball, and from there ran dh_make, which set up a standard package folder for me. The longest, most time-consuming job in this process was indexing all the episodes. Luckily archive.org had all the original notes, which I was able to cut and paste into the control file.
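Indexing is also the step most worth automating. Debian control files need the long description laid out in a particular way – every continuation line indented by one space – so a small helper (the episode data here is illustrative) can generate it from the notes:

```python
def control_description(summary, episodes):
    """Format a package long description per Debian control-file rules:
    the first line is the synopsis, and every continuation line is
    indented by one space."""
    lines = [f"Description: {summary}"]
    for number, title in episodes:
        lines.append(f" Episode {number}: {title}")
    return "\n".join(lines)
```

Feeding it a list of (number, title) pairs scraped from the archive.org notes would save all that cutting and pasting.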
The next question was whether a package is installed for everyone, or just the current user. Daniel Holbach confirmed that a package is installed at the system level. This makes supporting a multi-user system more complicated, as linuxreality would show as installed for everyone. The next question was whether new users should have it installed as part of the new-user process. My thoughts are that we should only provide what is asked for: if I install linuxreality then I shouldn’t mess with other users’ music.
To that end I created a postinst script. This executes once the package is installed. The purpose of this script is to create a link between /etc/share/linuxreality and ~/Music/LinuxReality.
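A real postinst is a shell script, but the logic it needs is small. Sketched in Python (paths as described above; the awkward bit in practice is that postinst runs as root, not as each user):

```python
import os

def link_episodes(system_dir, music_dir):
    """Create <music_dir>/LinuxReality as a symlink to the system-wide
    copy, so a user gets the episodes without duplicating the files."""
    target = os.path.join(music_dir, "LinuxReality")
    if not os.path.lexists(target):
        os.makedirs(music_dir, exist_ok=True)
        os.symlink(system_dir, target)
    return target
```

Because it checks for an existing link first, it is safe to run on every package upgrade.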
When installing an application from Software Centre, Software Centre does a fairly good job of presenting a very professional-looking user interface, but for applications stored in a PPA, the interface isn’t quite in the same league as for applications provided by the Ubuntu repositories.
So what can we do about it? How can we make the packages in our PPA look more like the packages in the main repository?
There are two parts of the interface which need more investigation if we want our app to look and feel like an Ubuntu application: the package icon and the package screenshot.
The package screenshot is downloaded by Software Centre from screenshots.ubuntu.com. Debian has a similar screenshots server. Debshots works by allowing users to upload screenshots, but screenshots are only allowed for packages within the repositories configured within debshots. Our choices are either to get our PPA added to the list of acceptable PPAs, or to host our own server.
In order to make this work we’ll need to modify Software Centre to make it look at a new server. To accomplish this I packaged a modified Software Centre which replaces the Ubuntu one.
When Software Centre loads a package it checks in /usr/share/app-install/icons for an icon – any file matching the package name. Typically these icons are added by Ubuntu. The whobuntu.applicationicons package adds a series of Whobuntu icons to the folder, and as it’s a package, we can push updates before we publish any new packages to the PPA.
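The lookup itself is easy to mimic: match any file in the icon directory whose base name equals the package name. This sketch is my approximation, not Software Centre’s actual code:

```python
import os

def find_package_icon(package, icon_dir="/usr/share/app-install/icons"):
    """Return the first icon file whose base name matches the package
    name, regardless of extension (.png, .svg, .xpm...)."""
    for fname in sorted(os.listdir(icon_dir)):
        stem, _ext = os.path.splitext(fname)
        if stem == package:
            return os.path.join(icon_dir, fname)
    return None
```

So as long as our icons package drops files named exactly after the packages, Software Centre should pick them up.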
When I speak about content packaging to Dave, he gets bored – as far as he is concerned I’m talking about a process. What he gets more excited about is the idea of producing a library of content.
We feel that this is like moving into a new house – friends and family sometimes throw a housewarming party and give housewarming gifts. Such gifts aren’t meant to furnish the whole house – in a way, we get that pot plant in the form of the Ubuntu community content that typically ships on the CD.
Imagine a distribution where the CD, as it installs, provides some sample content, but where you could go to Software Centre and find more content available to download and install directly. For example, what if you could download EPUB books of the complete works of Shakespeare, or the works of Charles Dickens?
So – what would a road map look like for the content packaging?
Initially I would like to see an Ubuntu Classics library – containing those works, and possibly others – perhaps a reference library.
So yesterday I tried my hand at public speaking. For a first attempt, I feel there are many lessons to learn. I think the biggest lesson from my attempt was that the pressure and nervousness are palpable – in many ways I have a renewed respect for people who make it look so easy, people like Simon Phipps or Lorna Mitchell.
I also learned that going to the community with the slimmest concept of an idea is not enough. I think I have the skeleton of an idea, and I still feel strongly about the concepts I was trying to talk about.
So I’m going to write more articles – learning the lessons from Lorna – and really flesh out these ideas.
There were interesting points raised – and I’m also going to address these:
- How about using Lenses in Unity
- What about if the user doesn’t have a music folder
- How do we enforce copyright / licence issues
- This only works for debian(?)
- Mature Content
My plan is to really complete this and to present not only a concept, but an implemented idea. All the articles will be tagged ContentPackaging.