titaniumbunker.com

Evil geniuses and world domination are 2 of our goals... we also like Dr Who

Archive for the ‘Technology’ Category

@tesco cashpoint error in the wild


An error on a Tesco cashpoint – Warndon

The file c:\VLOGDIR\ATMFIX.LOG could not be opened. The process cannot access the file because it is being used by another process.

Who needs to log stuff anyway… I wonder what that other process is?

How good is your code


In the last post, about making the magic work, I discussed how to get continuous integration working for the selfietorium project, building it and deploying a .deb file.

This time I’m talking about measuring code quality through SonarQube and Codacy.

At (real) work, we’re planning on using SonarQube to measure code statistics – it’s a tool that will tell you whether your variable names match coding standards, whether your code is duplicated, whether it has unused references, and so on. But I found out that SonarSource host an instance of SonarQube that can be used to analyse open source projects. As selfietorium is an open source project, I signed up.

For those looking for a more private solution, it is possible to run your own SonarQube server – that might be a topic for a future post – but for now I’m going to set up SonarQube.com to analyse selfietorium.

Logging on to SonarQube:

SonarQube uses GitHub authentication, so connecting is easy. SonarQube works by having code pushed to it – in the previous blog post we used Travis CI to do the pushing – and to make that happen you’ll need a security token.

To create a security token:

  • Once you have authenticated with SonarQube and logged on, click your name in the top right of the page and select “My Account”.
  • From here click “Security”.
  • Click Generate to generate a token, and give it a suitable name. Once your token is generated, make a note of it – you are going to need it later.

Back in Travis CI, the sonar-scanner line instructs a plug-in to push the code to SonarQube, using the configuration section and a new file that needs to be added to the project root:

sonar-project.properties

Selfietorium’s configuration file can be found here. SonarQube has documentation about setting up the properties file, and it isn’t hard to do.
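For anyone writing their own, a minimal sonar-project.properties looks something like this – the key, name and version below are illustrative rather than selfietorium’s exact values:

# project identification (illustrative values)
sonar.projectKey=titaniumbunker:selfietorium
sonar.projectName=Selfietorium
sonar.projectVersion=1.0
# where the source lives, relative to this file
sonar.sources=.
# the server URL and token are supplied by the Travis-CI sonarqube addon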

We can now use the token we got from SonarQube to tell Travis CI how to authenticate itself. The token is stored in the environment variables section, using the techniques I touched on in the previous post – in particular the section on “Keeping Secrets”.

Next time you build your project, it will be pushed to SonarQube (along with sonar-project.properties) and analysis performed against the code.

SonarQube is a great tool, but it doesn’t give us what we really want – a nice graphic we can add to our project README – after all, that’s what’s important, right?

Using Codacy

Generating the markup for the all-important badge.

Like SonarQube, Codacy uses GitHub for authentication. To set up a project for analysis, it’s just a case of clicking your project and clicking go – a much simpler setup than SonarQube. Getting that all-important badge is also a breeze: click on the Settings button from the dashboard.
From here you can generate markup for different documentation systems, including HTML, RST, and Markdown. Just copy the Markdown and paste it into the appropriate document in your GitHub repository, and you’ll get a badge rewarding you for making the code better.
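The generated Markdown looks something like this – the long project ID and the paths here are placeholders, and yours will be specific to your project:

[![Codacy Badge](https://api.codacy.com/project/badge/Grade/0123456789abcdef0123456789abcdef)](https://www.codacy.com/app/your-username/selfietorium)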


Correction : selfietorium enclosure v1


Version 1 selfietorium enclosure presented by titaniumworkshop R&D engineer Alan Hingley

It has been brought to my attention that the initial version of this post incorrectly identified the titanium bunker department responsible for the development of the version 1 selfietorium enclosure as titanium shed. This was obviously incorrect, as titanium shed was the original project code for the department now known as titanium workshop. Accordingly, the credit for this work should go to titanium workshop.

Selfietorium version 1 enclosure


Version 1 selfietorium enclosure presented by titaniumshed R&D engineer Alan Hingley

Titaniumshed has now produced an initial version of the selfietorium enclosure.

Making the magic work


You’re a wizard, Harry… if only you could just get continuous integration to work

I’ve been focusing my efforts on the selfietorium lately, and in particular on how to combine all the various support systems with GitHub, where the source is stored. This blog post details making the magic work: getting continuous integration, build, packaging and release uploads working.

Continuous Integration is the foundation on which the other support services hang. There’s no point in performing code analysis on code that doesn’t build or pass its tests. So, let’s get started.

In previous projects – like Snail Tales – we have created Jenkins installs and written build scripts to make all of this work. For the selfietorium project we are using Travis-CI.

Selfietorium is a Raspberry Pi based Python project, and there is great support in Travis-CI for Python – some languages, such as C#, are not 100% supported, so Travis-CI may not be suitable for all uses. Before you start looking at using Travis-CI for your solution, you should probably check that your language is supported by taking a look at the getting started page in the Travis-CI docs.

Techies amongst you might be thinking

Mike – what are you going to build?  Python is an interpreted language – there is no compiler for Python

And that’s true enough.  I aim to use the Travis-CI build system to run my unit tests (when I write some) and package my Python code into a Debian .deb file to allow easy installation onto a Raspberry Pi.

So let’s get cracking

To start with, you’ll need an account on Travis-CI.  Travis-CI uses GitHub for authentication, so that’s not too difficult to set up – just sign in with your GitHub account.

Now that you have an account, what do you do next? There are a couple of things you need to do to make your project build: create your project within Travis-CI, and create a .travis.yml file.

The .travis.yml file contains all of the steps to build and process your project, and it can be somewhat complicated.  What is amazingly simple though is setting up a GitHub repository to build.  Travis-CI presents me with all of the repositories that are capable of being built.  From here I picked the TitaniumBunker/Selfietorium repository, and that was pretty much it.

Picking which repository to build is probably the simplest part of this set up.

Once your repository is set up it needs to be configured – the docs are an absolute must here. There is no IDE to manage your configuration – all that stands between build success and multiple frustrating build failures is you and your ability to write a decent .travis.yml file.

Nothing will build until you next push something to your GitHub repository.  Push something to your repository and Travis-CI will spring into life, and potentially fail with an error, probably looking something like this:

Worker information
hostname: ip-10-12-2-57:94955ffd-d111-46f9-ae1e-934bb94a5b20
version: v2.5.0-8-g19ea9c2 https://github.com/travis-ci/worker/tree/19ea9c20425c78100500c7cc935892b47024922c
instance: ad8e75d:travis:ruby
startup: 653.84368ms
Could not find .travis.yml, using standard configuration.
Build system information
Build language: ruby
Build group: stable
Build dist: precise
Build id: 185930222
Job id: 185930223
travis-build version: 7cac7d393
Build image provisioning date and time
Thu Feb  5 15:09:33 UTC 2015
Operating System Details
Distributor ID:	Ubuntu
Description:	Ubuntu 12.04.5 LTS
Release:	12.04
Codename:	precise
...
...
...

There’s a lot of cruft in there – but the lines that are interesting are:

  • version – The version line hints that the Travis-CI worker code is on GitHub.  It is.
  • Could not find .travis.yml, using standard configuration. – The build fails to find a .travis.yml file and defaults to building a Ruby project.
  • Description: Ubuntu 12.04.5 LTS – the build workers seem to be Ubuntu based.
  • Cookbooks Version a68419e – Travis cookbooks are used with Chef to set up workers

.travis.yml file

The .travis.yml file is effectively a script that executes as the build lifecycle runs. The Customizing the Build page says that a build is made up of two main steps:

  1. install: install any dependencies required
  2. script: run the build script

These two sections are essential for any .travis.yml file. There can be more than just these two, and the Customizing the Build page details a whole bunch of extra steps that can be added to your .travis.yml file.
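A minimal Python .travis.yml using just those two steps might look something like this – the requirements file and test command here are illustrative, not selfietorium’s:

language: python
python:
  - "2.7"
install:
  # install any Python dependencies the tests need
  - pip install -r requirements.txt
script:
  # run the unit tests
  - python -m unittest discover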

The .travis.yml file for selfietorium looks like this:

dist: trusty
sudo: required

addons:
  sonarqube:
    token:
      secure: '$SONARQUBE_API_KEY'


language: python

python:
 - "2.7"
# uncomment and edit the following line if your project needs to run something other than `rake`:

before_install:
  - sudo apt-get -qq update
  - sudo apt-get install -y build-essential devscripts ubuntu-dev-tools debhelper dh-make diffutils patch cdbs
  - sudo apt-get install -y dh-python python-all python-setuptools python3-all python3-setuptools
  - sudo apt-get install -y python-cairo python-lxml python-rsvg python-twitter
  
install: true
 
script:
   - sonar-scanner
   - sudo dpkg-buildpackage -us -uc 

before_deploy:
  cp ../python-selfietorium_1_all.deb python-selfietorium_1_all.deb
deploy:
  provider: releases
  api_key: '$GITHUB_API_KEY'
  file: 'python-selfietorium_1_all.deb'
  skip_cleanup: true
  on:
    branch: master
    tags: true

So let’s break down what this script does.

dist: trusty
sudo: required

addons:
  sonarqube:
    token:
      secure: '$SONARQUBE_API_KEY'


language: python

python:
 - "2.7"

This section is the prerequisites section. It tells Travis-CI that the worker that is going to run this script should be an Ubuntu 14.04 LTS (Trusty Tahr) based machine. Travis-CI will build on either a virtual machine environment (with sudo enabled) or a container – which I believe is based on Docker. The issue with Docker is that while it takes seconds to provision a container-based environment, it currently doesn’t have sudo available, meaning that performing activities using sudo (for example, installing build dependencies) is not possible in a container-based environment. The Travis blog does state that:

If you require sudo, for instance to install Ubuntu packages, a workaround is to use precompiled binaries, uploading them to S3 and downloading them as part of your build, installing them into a non-root directory.
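In practice, that workaround might look something like this in before_install – the URL, archive name and paths here are purely illustrative:

before_install:
  # fetch a pre-built dependency bundle instead of using apt-get
  - mkdir -p "$HOME/deps"
  - curl -sL https://example.com/selfietorium-deps.tar.gz | tar xz -C "$HOME/deps"
  # make the bundled tools visible to the rest of the build
  - export PATH="$HOME/deps/bin:$PATH"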

Now I still have some work to do around dependency resolution – I think it is possible to trim the number of dependencies right down. At the moment the build system installs all of the runtime dependencies, which is potentially overkill for packaging – although they might still be needed for unit testing, so further work is required to look into that. If these dependencies can be removed, then the build could potentially be done in a container, speeding up the whole process. I can almost hear my fellow techies…

But Mike, why don’t you use Python distutils, and use pypi to install your dependencies?

A fair question. Using PyPI would mean that I could potentially install the dependencies without needing sudo access – the issue is that python-rsvg doesn’t seem to be available on PyPI, and only seems to be available as a Linux package.

In this section I’m also telling Travis-CI that I would like to use SonarQube to perform analysis on the solution [1] and that the solution language is Python 2.7.  I think the general opinion of developers out there is:

Use Python 3 – because Python 2 is the old way, and Python 3 is the newest

I’d like to use the new and shiny Python 3, but I fear that there may be libraries that I am using that have no Python 3 implementation – and that fear has led me back into the warm embrace that is Python 2.7.  I plan to perform an audit and determine whether the project can be ported to Python 3.

before_install:
  - sudo apt-get -qq update
  - sudo apt-get install -y build-essential devscripts ubuntu-dev-tools debhelper dh-make diffutils patch cdbs
  - sudo apt-get install -y dh-python python-all python-setuptools python3-all python3-setuptools
  - sudo apt-get install -y python-cairo python-lxml python-rsvg python-twitter

In this section I am installing the various Linux packages required to perform the build.  These are standard commands for installing packages onto an environment.

install: true

And here the install step does nothing: the dependencies were already installed in before_install, and, as I said at the top of this article, there is no build as such for Python programs.

script:
   - sonar-scanner
   - sudo dpkg-buildpackage -us -uc

Right about here is where I would run any unit tests – but I don’t have any yet. This step sends the code to SonarQube – a topic for a future post – and then calls dpkg-buildpackage to create the binary package. At the end of this step we have a .deb file that could potentially be deployed.
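Once tests do exist, the script section would simply grow an extra line – something like this, where the tests directory name is an assumption rather than anything that exists in the repository yet:

script:
   # run the (future) unit tests before analysis and packaging
   - python -m unittest discover -s tests
   - sonar-scanner
   - sudo dpkg-buildpackage -us -uc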

before_deploy:
  cp ../python-selfietorium_1_all.deb python-selfietorium_1_all.deb

dpkg-buildpackage writes the generated .deb into the parent directory, so before deploying I copy it into the current working directory.

deploy:
  provider: releases
  api_key: '$GITHUB_API_KEY'
  file: 'python-selfietorium_1_all.deb'
  skip_cleanup: true
  on:
    branch: master
    tags: true

Finally, we deploy the file.  The provider: releases line tells Travis-CI to use the GitHub Releases provider to push the build artifact to a GitHub release.

It uses a secret API key to gain access to the project releases. The file is the name of the generated file, and skip_cleanup prevents Travis-CI from resetting the working directory and deleting all changes made during the build. The on section controls when files are deployed: with this setting, only tagged commits on the master branch trigger the deployment. GitHub releases are actually tags, so creating a release creates a tag on a branch – for selfietorium we create releases on the master branch. The deployment then pushes the build artifact to GitHub, attaching it as the binary file for that release tag.
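So triggering a deployment is just a matter of tagging a commit on master (or creating a release through the GitHub web interface, which does the same thing). From the command line that might look like this – the version number is only an example:

git tag -a v1.0.0 -m "selfietorium 1.0.0"
git push origin v1.0.0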

Keeping Secrets.

In order for Travis-CI to upload your build artifact, it needs to identify itself, and to do that we create a Personal Access Token. Using this token, the GitHub Releases provider can communicate with GitHub as if it were you. We can’t just add the GitHub token to our .travis.yml file – well, I suppose we could, but then we shouldn’t be surprised if other files start appearing in our releases, because the .travis.yml file is publicly available on GitHub. So we need a way of storing the token securely and injecting it into the script during execution. Travis-CI offers the ability to store environment variables within a project, and these variables are hidden from the build logs if you clear the “Display value in build log” check box. To use such a variable in your .travis.yml file you refer to it like this:

$GITHUB_API_KEY

Travis-CI Environment Variables

Grabbing a Status badge

Within the Travis-CI project settings screen, clicking the status badge offers the ability to generate suitable markdown for GitHub.

Adding a spiffy status badge to your GitHub ReadMe.md markdown could not be easier
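The generated markdown looks something like this – swap in your own repository path if it differs:

[![Build Status](https://travis-ci.org/TitaniumBunker/Selfietorium.svg?branch=master)](https://travis-ci.org/TitaniumBunker/Selfietorium)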

Quick Recap.

So what we’ve done so far is:

  • Configured Travis-CI to build when we push to the repository.
    • Eventually this will allow for unit tests to be run – but at the moment there are no unit tests for selfietorium.
  • Configured Travis-CI to package the application into a .deb file when a release is created.
    • Releases are effectively tags within a git repository.
  • Configured Travis-CI to deploy our build artifact back to GitHub using a Personal Access Token.
    • The Personal Access Token is stored securely in a Travis-CI environment variable.
  • We’ve created some spiffy markdown for a status badge that we can incorporate into our repository markdown.

Debian file built using Travis-CI and deployed to GitHub

Here’s what it looks like when you run the installer under Ubuntu:

Selfietorium installation through Ubuntu Software Centre

In a forthcoming, as yet unwritten post, I’ll document how to set up the packaging folder so that selfietorium is installable and executable on a Linux system. It will probably borrow quite heavily from this page I wrote about content packaging.

Useful tools

Travis WebLint – check your Travis configuration for obvious errors.

Footnotes

  • [1] I’ll return to that in a later post when I’ll talk about Continuous Inspection

These aren’t the chairs you’re looking for @StaplesUK


I know it’s sad times for Staples UK – I spent many a happy hour in Staples, refreshing my manila folders for my family research – but I can’t help feeling it’s a little early for it all to start to fall apart.

My office chair – actually bought from Staples only a few years ago – is starting to look its age, and I thought about replacing it. So I clicked on the “See all Deals” button under “Big Chair Event” and was presented with a list of manager and executive chairs.


Now I’m not really a manager type – I like to get my hands dirty (in as much as I don’t like to get my hands dirty – that’s why I work with code) – so I was thinking about a mesh chair. So I clicked on Mesh Seating:


No mesh seating here…

Also missing are Draughtsman Chairs. Interestingly, I can find a mesh seating section – http://www.staples.co.uk/mesh-seating/cbk/670.html

So what’s happening?

Well – comparing the draughtsman, mesh seating and ergonomic chairs links against the working links, the culprit seems to be cm_sp.

For example, here is the failing Mesh Seating link:
http://www.staples.co.uk/mesh-seating/cbk/670.html&cm_sp=W16_11_017_02UK-_-Na-_-Na?web_track_id=135829767&position_id=2&promo_code=989989999&lcb=10

And a slightly modified (and now working) mesh seating link:

http://www.staples.co.uk/mesh-seating/cbk/670.html?cm_sp=W16_11_017_02UK-_-u_ad_4_href&web_track_id=135829704&position_id=2&promo_code=989989999&lcb=10

The Na-_-Na in the failing link looks suspiciously like Not Applicable, or potentially “NaN” truncated to fit.


Searching on fighting knives throws an error


fighting knives – search error when searching

Thanks to Stuart Baldwin for pointing this one out: searching for anything on fightingknives.info breaks the site, returning the message:

A potentially dangerous Request.Path value was detected from the client (&).

Looking at the favicon, it appears to be a DotNetNuke site – wow… that’s old – so old that I think this was originally running on the .NET 2 framework.

Anyway – the reason for this is the search URL that the site navigates to when searching:

http://www.fightingknives.info/fighitngknivesinfo/search-results&Search=test

From the stack trace it seems that this site is running under .NET framework v4, and there were changes made to the v4 framework that extended request validation from only .aspx requests, to all requests.

To ‘fix’ this the site owner can add:

<httpRuntime requestValidationMode="2.0" />

To their web.config file, to prevent this from happening – or they could alter their application pool to use an older .NET framework (it should be fine in version 2, and may be fine in versions 3 and 3.5). I say ‘fix’ because really they should perhaps be looking to update to a newer version, or rewriting their search facility so it doesn’t pass potentially dangerous characters into its own requests.
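For context, that element lives under system.web; a minimal sketch of the relevant part of web.config looks like this – everything else in the real file is omitted:

<configuration>
  <system.web>
    <!-- revert request validation to the 2.0 behaviour for all requests -->
    <httpRuntime requestValidationMode="2.0" />
  </system.web>
</configuration>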

Thanks Stuart

On holibobs…


The good lady wife and I are currently holidaying on the island of Madeira, and we’re having a great time. While out for an evening stroll we spotted these wonderful balancing stones – which I photographed this morning.

Mysterious stone columns

And it got me thinking about application architecture. Take this pile of stones.

Three layers of stone architecture

At first glance it looks pretty cool, right? And it certainly is a great piece of engineering. But it’s pretty hard to replace the top layer: put on a layer that has a different weight distribution and the whole stack becomes unstable. And the lower down the stack you attempt to replace a layer, the greater the difficulty involved, as that layer and every layer above it is affected by the change.

From a software point of view what does it mean then?

Well, each layer is built depending on the layer(s) below it. In software terms it would be like the business layer opening and holding a SQL connection and transaction, and then making multiple data-layer calls using that connection and transaction: the business layer has knowledge of, and a dependency on, the data layer. A better approach would be to handle IDbConnection and IDbTransaction objects – but what about a web service layer?

I’m not an architecture expert, so this is something I’ll have to think about, but I think it might make an interesting article for the internal newsletter at work.

Three Network’s knowledge server is down…


Knowledge server down…

Other companies might call them web servers… not Three.

It’s about time…


I’ve subscribed to a new podcast recently. I go through periods of engagement and disengagement with genealogy, and at the moment I’m pretty engaged – probably because my Ancestry subscription is about to renew.

Anyway, I received an email from Ancestry last week telling me that they had developed a podcast with Tony Robinson, and that the first few episodes are available to download. So I fired up my trusty podcatcher software – BeyondPod – and hunted around looking to subscribe.

The page for the podcast is here.

I thought I’d look at the instructions to subscribe to the podcast through the handy YouTube video that Ancestry has thoughtfully put on the page.

But as an Android user it seems that I had to download a podcast tool called Stitcher. Really all I wanted was the RSS address – so I tweeted Ancestry, as I couldn’t find the address anywhere.

I then browsed the site using a laptop, and lo and behold: an RSS icon. As Ancestry UK says, it looks like the RSS icon is missing from the mobile interface.


Viewing the page on desktop – RSS icon


RSS icon missing when viewing the page in mobile

I’ve downloaded the episodes, and I’ve listened to episode 3 so far on my way into work.

I did spot that there were issues with Tony’s script though – at roughly 25 seconds in:

“…It would be unusual to read anything about the drive for voting rights for women in Britain without seeing mention of the Pankhurst name. That’s not just down to sisters Christabel, Sylvia and Adela though. Their mother Emmeline was the founder of the Women’s Social and Political Union and she led the British suffragette movement in the early 1900s. Their mother Emmeline was the founder of the Women’s Social and Political Union and she led the British suffragette movement in the early 1900s.”

Looks like there was an editing issue with episode 3.  The audio quality for episode 3 certainly seems lower than the other episodes.


Nice touch though: Mark Hamill is in Episode 4!!!
