It has been brought to my attention that the initial version of this post incorrectly identified the Titanium Bunker department responsible for the development of the version 1 selfietorium enclosure as Titanium Shed. This was obviously incorrect, as Titanium Shed was the original project code for the department now known as Titanium Workshop. Accordingly, the credit for this work should go to Titanium Workshop.
Archive for the ‘ python ’ Category
Titanium Workshop has now produced an initial version of the selfietorium enclosure.
I’ve been focusing my efforts on the selfietorium lately, and in particular on how to combine all the various support systems with GitHub, where the source is stored. This blog post details making the magic work: getting Continuous Integration, builds, packaging and release uploads working.
Continuous Integration is the foundation on which the other support services will be built. There’s no point in performing code analysis on code that doesn’t build or pass its tests. So, let’s get started.
Selfietorium is a Raspberry Pi based Python project, and there is great support in Travis-CI for Python – some languages such as C# are not 100% supported, so Travis-CI may not be suitable for all uses. Before you start looking at using Travis-CI for your solution, you should probably check that your language is supported by taking a look at the getting started page in the Travis-CI docs.
Techies amongst you might be thinking
Mike – what are you going to build? Python is an interpreted language – there is no compiler for Python
And that’s true enough. I aim to use the Travis-CI build system to run my unit tests (when I write some) and package my Python code into a Debian .deb file to allow easy installation onto a Raspberry Pi.
So let’s get cracking
To start with, you’ll need an account on Travis-CI. Travis-CI uses GitHub for authentication, so that’s not too difficult to set up – just sign in with your GitHub account.
Now that you have an account, what do you do next? There are two things you need to do to make your project build: create your project within Travis-CI, and create a .travis.yml file.
The .travis.yml file contains all of the steps to build and process your project, and it can be somewhat complicated. What is amazingly simple though is setting up a GitHub repository to build. Travis-CI presents me with all of the repositories that are capable of being built. From here I picked the TitaniumBunker/Selfietorium repository, and that was pretty much it.
Once your repository is set up it needs to be configured – the docs are an absolute must here. There is no IDE to manage your configuration – all that stands between build success and multiple frustrating build failures is you and your ability to write a decent .travis.yml file.
Nothing will build until you next push something to your GitHub repository. Push something to your repository and Travis-CI will spring into life, and potentially fail with an error, probably looking something like this:
```
Worker information
hostname: ip-10-12-2-57:94955ffd-d111-46f9-ae1e-934bb94a5b20
version: v2.5.0-8-g19ea9c2 https://github.com/travis-ci/worker/tree/19ea9c20425c78100500c7cc935892b47024922c
instance: ad8e75d:travis:ruby
startup: 653.84368ms
Could not find .travis.yml, using standard configuration.

Build system information
Build language: ruby
Build group: stable
Build dist: precise
Build id: 185930222
Job id: 185930223
travis-build version: 7cac7d393

Build image provisioning date and time
Thu Feb 5 15:09:33 UTC 2015

Operating System Details
Distributor ID: Ubuntu
Description:    Ubuntu 12.04.5 LTS
Release:        12.04
Codename:       precise
...
```
There’s a lot of cruft in there – but the lines that are interesting are:
- version – The version line hints that the Travis-CI worker code is on GitHub. It is.
- Could not find .travis.yml, using standard configuration. – The build fails to find a .travis.yml file and defaults to building a Ruby project.
- Description: Ubuntu 12.04.5 LTS – the build workers seem to be Ubuntu based.
- Cookbooks Version a68419e – Travis cookbooks are used with Chef to set up the workers.
The .travis.yml file is effectively a script that executes as the build life cycle runs. The Customizing the Build page says that a build is made up of two main steps:
- install: install any dependencies required
- script: run the build script
These two sections are the essential core of any .travis.yml file. There can be more than just these two sections, and the Customizing the Build page details a whole bunch of extra steps that can be added to your .travis.yml file.
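As a minimal sketch built from just those two steps (the requirements file and test command here are placeholders, not selfietorium's – its real file follows), a Python project's .travis.yml could be as small as this:

```yaml
# Minimal sketch of the two core lifecycle steps for a Python project.
# requirements.txt and the test layout are placeholder assumptions.
language: python
install:
  - pip install -r requirements.txt   # install: install any dependencies
script:
  - python -m unittest discover       # script: run the build/tests
```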
```yaml
dist: trusty
sudo: required
addons:
  sonarqube:
    token:
      secure: '$SONARQUBE_API_KEY'
language: python
python:
  - "2.7"
before_install:
  - sudo apt-get -qq update
  - sudo apt-get install -y build-essential devscripts ubuntu-dev-tools debhelper dh-make diffutils patch cdbs
  - sudo apt-get install -y dh-python python-all python-setuptools python3-all python3-setuptools
  - sudo apt-get install -y python-cairo python-lxml python-rsvg python-twitter
install: true
script:
  - sonar-scanner
  - sudo dpkg-buildpackage -us -uc
before_deploy: cp ../python-selfietorium_1_all.deb python-selfietorium_1_all.deb
deploy:
  provider: releases
  api_key: '$GITHUB_API_KEY'
  file: 'python-selfietorium_1_all.deb'
  skip_cleanup: true
  on:
    branch: master
    tags: true
```
So let’s break down what this script does.
```yaml
dist: trusty
sudo: required
addons:
  sonarqube:
    token:
      secure: '$SONARQUBE_API_KEY'
language: python
python:
  - "2.7"
```
This section is the prerequisites section. It tells Travis-CI that the worker that runs this script should be an Ubuntu 14.04 LTS (Trusty Tahr) based machine. Travis-CI builds in either a virtual machine environment (with sudo enabled) or a container, which I believe is based on Docker. The issue with Docker is that while it takes seconds to provision a container-based environment, sudo is currently not available in it, meaning that activities requiring sudo (for example, installing build dependencies) are not possible in a container-based environment. The Travis blog does state that:
If you require sudo, for instance to install Ubuntu packages, a workaround is to use precompiled binaries, uploading them to S3 and downloading them as part of your build, installing them into a non-root directory.
Now, I still have some work to do around dependency resolution – I think it is possible to trim the number of dependencies right down. At the moment the build system installs all of the runtime dependencies, which is potentially overkill for packaging – however, they might still be needed for unit testing; further work is required to look into that. If these dependencies can be removed, the build could potentially run in a container, speeding up the whole process. I can almost hear the other fellow techies…
But Mike, why don’t you use Python distutils, and use PyPI to install your dependencies?
A fair question. Using PyPI would mean that I could potentially install the dependencies without needing sudo access – the issue is that python-rsvg doesn’t seem to be available on PyPI, and only seems to be available as a Linux package.
In this section I’m also telling Travis-CI that I would like to use SonarQube to perform analysis on the solution, and that the solution language is Python 2.7. I think the general opinion of developers out there is:
Use Python 3 – because Python 2 is the old way, and Python 3 is the newest
I’d like to use the new and shiny Python 3, but I fear that there may be libraries that I am using that have no Python 3 implementation – and that fear has led me back into the warm embrace that is Python 2.7. I plan to perform an audit and determine whether the project can be ported to Python 3.
```yaml
before_install:
  - sudo apt-get -qq update
  - sudo apt-get install -y build-essential devscripts ubuntu-dev-tools debhelper dh-make diffutils patch cdbs
  - sudo apt-get install -y dh-python python-all python-setuptools python3-all python3-setuptools
  - sudo apt-get install -y python-cairo python-lxml python-rsvg python-twitter
```
In this section I am installing the various Linux packages required to perform the build. These are standard commands for installing packages onto an environment.
And here the install step does nothing – install: true is effectively a no-op. As I said at the top of this article, there is no build for Python programs.
```yaml
script:
  - sonar-scanner
  - sudo dpkg-buildpackage -us -uc
```
Right about here is where I would run any unit tests – but I don’t have any yet. This script sends the code to SonarQube – a topic for a future post – and then calls dpkg-buildpackage to create the binary package. At the end of this step we have a .deb file that could potentially be deployed.
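When those unit tests do arrive, the script step could run them with something like python -m unittest discover. Here is a hedged sketch of what one might look like – frame_filename is an invented helper for illustration, not actual selfietorium code:

```python
# Hypothetical example: a unit test Travis-CI could run during the
# script step. frame_filename() is an invented helper, not real
# selfietorium code.
import unittest

def frame_filename(index):
    """Name a captured photo with zero-padded numbering."""
    return "selfie_%04d.jpg" % index

class TestFrameFilename(unittest.TestCase):
    def test_zero_padding(self):
        self.assertEqual(frame_filename(7), "selfie_0007.jpg")

    def test_larger_index(self):
        self.assertEqual(frame_filename(123), "selfie_0123.jpg")
```

A failing test would then fail the Travis-CI build, which is exactly the point of running them in CI.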
```yaml
before_deploy: cp ../python-selfietorium_1_all.deb python-selfietorium_1_all.deb
```
dpkg-buildpackage writes the generated .deb file to the parent directory, so before deploying I copy it into the current working directory.
```yaml
deploy:
  provider: releases
  api_key: '$GITHUB_API_KEY'
  file: 'python-selfietorium_1_all.deb'
  skip_cleanup: true
  on:
    branch: master
    tags: true
```
It uses a secret API key to gain access to the project releases. The file is the name of the generated file, and skip_cleanup prevents Travis-CI from resetting the working directory and deleting all changes made during the build. The on section controls when files are deployed: with this setting, only tagged changes on the master branch trigger the deployment. GitHub releases are actually tags, so creating a release creates a tag on a branch; for selfietorium we create releases on the master branch. The deployment then pushes the build artifact to GitHub, effectively offering it as the binary file for that release tag, and uploads it to the GitHub release.
In order for Travis-CI to upload your build artifact, it needs to identify itself, and to do that we create a Personal Access Token. Using this token, the GitHub Releases provider can communicate with GitHub as if it were you. We can’t just add the GitHub token to our .travis.yml file – well, I suppose we could, but then we shouldn’t be surprised if other files start appearing in our releases, because the .travis.yml file is publicly available on GitHub. So we need a way of storing the token securely and injecting it into the script during execution. Travis-CI offers the ability to store environment variables within a project; these variables are hidden in the log files if you clear the ‘display value in build log’ checkbox. To use such a variable in your .travis.yml file, you refer to it by name.
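For example, the deploy section shown earlier picks the stored token up by name:

```yaml
deploy:
  provider: releases
  api_key: '$GITHUB_API_KEY'   # resolved from the Travis-CI environment variable
```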
Grabbing a Status badge
Within the Travis-CI project settings screen, clicking the status badge offers the ability to generate suitable markdown for GitHub.
So what we’ve done so far is:
- Configured Travis-CI to build when we push to the repository.
- Eventually this will allow for unit tests to be run – but at the moment there are no unit tests for selfietorium.
- Configured Travis-CI to package the application into a .deb file when a release is created.
- Releases are effectively tags within a git repository.
- Configured Travis-CI to deploy our build artifact back to GitHub using a Personal Access Token.
- Personal Access Token is securely stored in a Travis-CI environment variable.
- We’ve created some spiffy markdown for a status badge that we can incorporate into our repository markdown.
In a forthcoming, as-yet-unwritten post, I’ll document how to set up the packaging folder so that selfietorium is installable and executable on a Linux system. It will probably borrow quite heavily from this page I wrote about content packaging.
Travis WebLint – check your Travis configuration for obvious errors.
- I’ll return to that in a later post, when I talk about Continuous Inspection.
Well, the BBC have recently announced a new initiative to get children to code, so take a second to think: how would you accomplish this?
The BBC have made their own small computer called the micro:bit, which comes with a number of built-in sensors, can be run from a couple of batteries and, more importantly, will be given away to Year 7 children.
So far this all sounds good. So how does one code for this device?
Well, all you have to do is attach the micro:bit to your iPad, Android tablet or your PC, and use the IDE app to write code before publishing the finished code to the micro:bit.
Sorry, say that bit again?
Well, all you have to do is attach the micro:bit to your iPad, Android tablet or your PC, and use the IDE app to write code before publishing the finished code to the micro:bit.
That’s right: in order to teach children how to code, you connect this device up to another computer and publish some code to it.
So what we actually have here is not a computer, but something more akin to an Arduino?
Here’s the problem I have with this. If the whole concept of the micro:bit is that it is something that children can learn to program with, then the concept is flawed. It’s flawed because in order to use the micro:bit you must have another computer to program it on. So little Billy, who doesn’t have a smartphone, and whose single parent mother is working hard to put food on the table, not iPads in their hands, is shit out of luck – sorry, will be somewhat disadvantaged.
Oh, but he could develop at school, right? Well, when I was at school there was one BBC Micro for a class of children. And this all assumes the infrastructure is there, and that it is available for Billy to use after hours – which is really only an hour or two, Monday to Friday.
Yeah, but don’t kids get given iPads at school these days?
Do they? I don’t see much evidence of cash-strapped LEAs doing this. It’s entirely possible that LEAs with bigger budgets might, but that can lead to a two-tiered education for our children.
But let’s assume for a second that your LEA has bottomless pockets and has rolled out iPads to all students. Are we sure we want to teach kids to program, but only within an Apple ecosystem – and, for that matter, with an IDE and platform developed by Microsoft? But more on that later.
How about smartphones? Loads of kids have smartphones, right?
Sure, a lot of kids do have smartphones, and it’s something schools are struggling with – some teachers will tell you that smartphones offer too much of a distraction, while others love the concept of BYOD (bring your own device) in a classroom environment.
It’s true that smartphone uptake amongst Year 7 is probably very high, but I would think that Year 7 smartphone usage is more likely to involve apps like Crossy Road, Angry Birds and Snapchat. I very much doubt that your average Year 7 will happily whip out their phone and start coding for the micro:bit. How many of you here have written more than a text on a mobile?
The problem I have with BYOD in the classroom is that there is no standard platform, which means that some of the kids with zippier, newer phones will have an advantage over kids with a slower phone or an older platform – and that is assuming your platform is supported in the first place.
This programme has the same flaw as 3D TV – you need an accompanying piece of not-necessarily-commonplace technology to use it. The Raspberry Pi costs £25 and requires a monitor, a keyboard and a mouse. The monitor can be a TV, and the mouse and keyboard can be obtained relatively cheaply – say £5 bought online – so you can be computing for £30.
So the cost of entry with the Raspberry Pi is £30. What’s the cost of entry for the micro:bit? What’s the cheapest computer I could get to program it on? Surprise: it’s a Raspberry Pi, so the cost of entry to use the micro:bit is also £30!
So, discounting the micro:bit, I can already be programming in Scratch, Python or Java on a Raspberry Pi. With the micro:bit there will be a web-based IDE, which hasn’t been publicised much, though word is that there will be a drag-and-drop solution that then downloads the code to the micro:bit.
Another problem: this is a free giveaway to Year 7 pupils FOR ONE YEAR ONLY!
Which means that should the programme be deemed a failure, the micro:bit will disappear faster than the crowd at an opening night party for a Broadway play when the first bad review comes in.
Should it be a success, it becomes a purchase for either the school or for parents to take care of – and right now we still don’t know the price. If the micro:bit costs £10, then the initial outlay to get a development platform is £40!
Remember when I mentioned that Microsoft are behind the hardware and software? Here’s another point to consider. The main selling point of the micro:bit is that it is a way of doing the “Internet of Things” in a way that school children can understand. The problem is that there is already a hardware and software platform that does this, called Arduino. It has been used in numerous projects, and both the hardware and the software are open source.
The micro:bit currently isn’t, although this will happen – just not yet.
This means there’s now yet another platform offering IoT functionality, which further muddies the water. I am sure that industry professionals will continue to use existing platforms – which seems to mean mostly Arduino – meaning that unless there are follow-up classes for pupils to learn about these other platforms, they will enter industry unable to make simple IoT projects. Which kind of defeats the object of the micro:bit in the first place, right?
Right now, apart from the board, there are scant details on how this will all work. I don’t want to be a negative Nelly about this, but the Raspberry Pi is an easier sell than a small piece of circuit board.
I had a chat with Mike; this is what he said.
— Mike’s Prediction —
I predict that – unfortunately – the micro:bit will be a massive failure. Children that are interested in coding will already be working with technologies such as the Pi. Those that had little interest in embedded computing will do the minimum required to pass the course, and it will then sit in a drawer. I think that the official programming language will do little, as there will be little to no commercial uptake of the micro:bit – technology companies won’t see the first practitioners reach the job market for a few years, and when they do, you can almost guarantee that the embedded computing platforms of tomorrow won’t be the micro:bit.

I think that to improve adoption there needs to be a more engaged attitude from pupils, and in my opinion most students today care about Angry Birds, Facebook and not much more than that. I also believe that this project will need a wide variety of projects that can be done using the technology. Ideally, these projects should support, and be supported by, other subject areas. For example: how about combining the embedded micro:bit with a drama course to provide automatic sound and lighting cues? Now, this is a silly example – the computer that you use to program the micro:bit is more powerful than the micro:bit itself – but the idea that you can trigger events from a simple interface to play sound effects or run lighting from a small box might be a project that gets the principles across to pupils. However, I think such joined-up thinking, combining multiple disciplines, will be difficult for schools to implement, and I therefore predict that it will become a boring and inaccessible technological failure.
For those of you in the know, for the past couple of years I have been working on an animated short film. It’s a long process to make a short animation, and with lots of assets to keep track of, I use a production chart to keep tabs on everything. Here is a snapshot of the production chart as it stands:
Now, the eagle-eyed amongst you will notice that it’s a spreadsheet. In the past I have tutted and rolled my eyes when people have complained that, when they use a spreadsheet to catalogue their DVD collection, they can’t scroll per pixel – it snaps to the nearest cell. And then I patiently explain that a spreadsheet is not designed to catalogue a collection of DVDs. A spreadsheet is really good at totalling columns of numbers and/or applying formulae to them; a DVD catalogue is best done with a database.
Yes, I know I should use a database to store the production chart. It is a more effective way to store this information: each scene is a record with a series of fields, and we could poll the database for complete scenes and get an accurate percentage of how much of the film is animated, or rendered, or needs work, etc.
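To illustrate that point – this is a sketch, not my actual chart; the schema and scene names are invented – the database version of “how much is animated?” is a couple of lines of SQL:

```python
# Sketch of a production chart as a database: each scene is a record,
# and a query yields the completion percentage. The schema and sample
# data here are invented for illustration.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE scenes (
        name     TEXT PRIMARY KEY,
        animated INTEGER,   -- 0 = still to do, 1 = done
        rendered INTEGER
    )
""")
conn.executemany(
    "INSERT INTO scenes VALUES (?, ?, ?)",
    [("sc01", 1, 1), ("sc02", 1, 0), ("sc03", 0, 0), ("sc04", 0, 0)],
)

total = conn.execute("SELECT COUNT(*) FROM scenes").fetchone()[0]
animated = conn.execute(
    "SELECT COUNT(*) FROM scenes WHERE animated = 1"
).fetchone()[0]
percent_animated = 100.0 * animated / total  # 50.0 for this sample data
```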
The thing is, I am, to my own surprise, a little bit old school. I learnt to break down sound using a mixing desk and large sheets of paper, jogging through soundtracks, listening for the pops and whistles and decoding them into the phonemes that made up the characters’ speech. And this is a digital equivalent of the old-school way of creating a production chart – it’s a digital analogue of an analogue, er, analogue.
Today, kids examine waveforms or use software tools to provide easier breakdowns, and whilst I like those tools and use them a fair bit, sometimes I think that younger animators, fresh into the field, are lacking some of these old-school skills.
Part of my old-school curmudgeonliness is the creation of dope sheets and production charts. There was something exciting about transferring your sound breakdown to a dope sheet ready to animate; it was a prelude to the storm of creation that leads to the initial pencil tests. I loved the way the production charts would fill up with checks and notes, becoming fuller as the deadline approached.
Working on this project has been great fun. The biggest problem has been scheduling the time to make the animation and learning to use the software. Part of that has been learning some of the limitations of the software, and creating new software tools to allow me to work with it the way I want to. I was using synfig stage last night and it struck me that I have talked a lot about it at Oggcamp and other tech shows without really showing it. I started using it and it worked straight away (more or less), so showing it working didn’t feel important, because it actually was working. I suppose I should make a video demonstrating the tool and the problem it solves.
Priority, though, is on the film. Right now, with about 16 scenes left to animate, there’s a definite feeling that it’s starting to come together as a film, and part of me will be glad to get it finished and move on to the next thing. Part of me also misses my old-school beginnings, and I hope that maybe one day in the future I will do a proper old-school 2D short using an actual pencil on real paper.
I’ve recorded an episode of HPR, and it’s out today! In this episode I chat about my Ubuntu Ebook Template Project – I’m still battling with giving a voice to my ebooks but hopefully a wise HPR community member might be able to offer some advice.
Why not record your own HPR episode? It’s super fun and easy to do. Head over to Hacker Public Radio and see how easy it is to contribute.
I have been developing a little application in Python.
“oooh!” I hear you cry, “Get you using free and open source development languages to develop a cross platform application. Was it a liberating, life changing experience?”
Well yes. Yes it was. And no as well.
Here’s the yes – the pros of development. The plan was to develop a small application to run on the Mac Mini we use here in the bunker as a DVD player. Having spent some time reading up on Cocoa and straining my eyes trying to read the display (it’s connected to the TV via a SCART composite cable, so the desktop display isn’t great), I decided to develop the application on my Windows desktop using Python. The rationale being that it is cross-platform: if I can write it to work in a Windows environment, it should work in a Mac environment, right?
I installed a portable Python and Boa Constructor onto my trusty USB-key-based development studio (more on this another time), and in an afternoon I had managed to write my first ever Python wx app!
Booting up the Linux laptop, I installed Python and wx in short order (the Ubuntu Software Centre is the gift that keeps on giving!) and, yup, the application ran fine.
So far, so good: I have developed on one platform and deployed to another. Now for the next step – packaging it into a standalone application that can run on Mac OS. And this is where we come to the cons.
Mac Software Centre
Mac OS has a software centre for getting software. I wanted to be certain that I had the latest version of Python, so I typed it into the search field. After sifting through downloads of Monty Python’s Flying Circus shows and films, I found it – all for the princely sum of £1.99.
Really? £1.99 for software that’s free and easily downloadable from Python’s own website after a cursory Google search? Why would anyone think that Apple were a money-grubbing organisation trying to fleece owners by charging for free content?
Next up was installing wx – a relatively painless experience. There was an installer on their website that allowed the installation of the package, and I was finally able to test the application on Mac OS. And it worked! Kind of. Because of font differences, one of the buttons needed to be enlarged and moved – cue 20 minutes of putting values in, running the module, stopping the module, changing values, etc., just to more-or-less replicate what showed up first time on Windows and Linux.
To convert the Python script into a standalone app, I was going to use py2app, which makes creating .app bundles easy. This is the point where I lost the best part of a day struggling with a number of issues, chief among them the fact that, as a Windows/Linux user, I am used to right-clicking a link in my web browser and saving it. To right-click on a Mac, you hold down Ctrl (yeah, it’s so intuitive to me too). Apparently the Mighty Mouse, which is a four-button mouse, does actually have right-click functionality, but it needs to be set up from the Finder by assigning the secondary-click function to the right button – don’t panic, it’s not as scary as it sounds, but you have to question why such simple design paradigms have to be defined. When I plug a new mouse into my Windows or Linux box, I don’t have to tell the OS what buttons do what: it works out of the box.
To install py2app, I first had to install another Python package called easy_install. This is where I lost a lot of time on the documentation, which, while it showed me what to type to set things up, didn’t say where to type it. I was typing the commands into a Python shell window and trying to run them as a Python module, with no luck, until Mike told me they were terminal commands. Just a line explaining that this was to be run from a terminal would have saved me four hours of frustration trying to get easy_install, er, installed.
Installing py2app (again)
However, once I had installed easy_install, installing py2app was a dream. It’s a lot easier to use terminal commands when you actually know they are terminal commands!
Making the application
I then started setting up the solution to create the app. Py2app basically takes a number of arguments: the first is the location of the main .py file, followed by resources – in this case, the application icon. The problem here was my USB key. On Mac OS it appears with the title “NO NAME”, and while py2app could parse the space in the location of the initial .py file, it apparently didn’t recognise it for the icon file. Cue renaming the drive to replace spaces with underscores.
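For reference, py2app drives those arguments from a setup.py. This is a hypothetical sketch of such a file – main.py and app.icns are placeholder names, not my actual project files:

```python
# Hypothetical py2app setup script: main.py and app.icns are
# placeholder names. Invoke with:  python setup.py py2app
import sys
from setuptools import setup

APP = ["main.py"]                    # location of the main .py file
OPTIONS = {"iconfile": "app.icns"}   # resource: the application icon

# Only run setup() when a command such as py2app is supplied, so that
# importing or inspecting this file does nothing on its own.
if len(sys.argv) > 1:
    setup(app=APP, options={"py2app": OPTIONS}, setup_requires=["py2app"])
```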
This time I ran the py2app applet and – SUCCESS!!!! A program appeared in the Release folder in my user profile. After two evenings of frustration, weeping and much gnashing of teeth, I had built my first Python Mac OS application!
Let’s run it to see how it works.
And it crashed. Something to do with different versions of WX!!!! AGGGGGGGHHHHH!!!!
Ho hum. Back to the drawing board.