SLT mobile site – Yellow Screen of Death
Just been shown this error message for Sandwell Leisure Trust's mobile website on Rach's phone.
I was able to recreate the issue using the mobile address in a desktop browser.
She even worked out how to recreate the issue.
- Browse to http://m.slt-leisure.co.uk/
- In the menu (top right), click Activities
- Search for Gyms – notice that as you type there is a UI bug that leaves the "L" banner on the screen
- Click on Gyms (this will also work with other activity types)
- From the "All Centres" dropdown, click on any leisure centre.
- Error shown.
“L” still shown but has no content.
What is interesting is that the page appears to be performing a postback – but there seems to be no real need to do so: the information for all the relevant sites is already present in the page. Even if additional information were needed, there's no reason it couldn't be fetched through an AJAX call.
The micro:bit goes live
Well, it seems that with very little fanfare, the micro:bit is now being distributed to teachers and students throughout the country.
As I said in my previous post here about the micro:bit, I predict minimal uptake or impact in school lessons. The teachers will not have had any time to incorporate them into lesson plans, and that probably won't happen now until September at the earliest. So these Year 7 pupils will actually be in Year 8 by the time they get to do anything in the classroom with a micro:bit.
The launch has been dogged by delays, according to the BBC, because one million units had to be made at once and there were design issues (the watch battery could be a choking hazard).
Thing is, other companies do this sort of thing with far more complex boards: Microsoft, Sony and Nintendo all manage to manufacture boards without these delays. The micro:bit is a smaller, simpler board, so it should be easier to mass manufacture, right?
So the watch battery is its power source? Instead of a watch battery, why not a solar panel, or a bigger battery like a 9V? If this were just a small portable device that can be programmed, that would be perfect, but it's not, and the reason is obvious: the BBC are betting that wearable tech is the next big thing, with pupils making their own micro:bit-based wearables. The problem with wearable tech at the moment is that the data you get is completely banal. The Apple Watch basically acts as a Bluetooth screen for an iPhone, and Fitbits are essentially pedometers that, instead of just counting steps, store data long term (say over a day) and then download that data to a computer. You can't see your results directly on a Fitbit. Additionally, these devices aren't waterproof.
But one of the major problems I have is that the micro:bit is free for this year only, which leads to one of three outcomes.
- They prove a success in the classroom; schools will buy a bunch of them and keep them in the classroom, contrary to the founding ethos of the micro:bit
- They prove a success in the classroom, and the cost of obtaining a micro:bit is left with the parent, creating a two-tier education system where disadvantaged children lose out because their mum can't afford £10 for a board, especially if there's no way to work with it out of school
- They prove a failure; the hobbyist market continues to buy them on occasion, but the Year 8 cohort will have wasted a year learning how to make an LED flash on and off on a hardware platform that has no further relevance in an education system geared towards attainment
Now, it's possible that some children will be inspired enough to start making other things, but I don't think the numbers justify the sheer amount of money that has been sunk into the micro:bit. Don't forget, we still don't know how open the development platform is.
Before everyone falls over themselves to jump on this platform, we should also consider the micro:bit's main competition. It has competition? Yes: the CodeBug, which is a little pricey but has been open source for a while, and of course the Raspberry Pi Zero, which is roughly half the price of the micro:bit but currently sold out.
The original computer boom in the '80s was based on cheap computing hardware. However, let's get something into perspective: back in 1982, a 48K ZX Spectrum would have cost £175, which, taking inflation into account, would actually be somewhere in the region of £600, while a Commodore 64 cost £400, which equates to £1,372! These were hardly cheap computers compared to a Raspberry Pi Zero or micro:bit as a percentage of a monthly salary.
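The arithmetic above boils down to a single inflation multiplier; here's a quick sketch. Note the ~3.43× factor is simply inferred from the figures quoted (400 × 3.43 ≈ 1,372), not an official RPI value:

```python
# Rough 1982 -> present-day price adjustment. The multiplier is an
# assumption inferred from the figures in the text, not an official
# inflation index value.
INFLATION_MULTIPLIER = 3.43

def adjusted_price(price_1982):
    """Return a 1982 price scaled to a rough modern equivalent, in pounds."""
    return round(price_1982 * INFLATION_MULTIPLIER)

print(adjusted_price(175))   # 48K ZX Spectrum -> ~600
print(adjusted_price(400))   # Commodore 64   -> ~1372
```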
These are undoubtedly exciting times for computing, with an unprecedented number of truly low-cost computers readily available, and huge potential for using them to do more than just play games. However, I still think the micro:bit seems over-engineered and poorly thought out, with a software platform that seems to be locked behind a Microsoft or Apple ecosystem and the vague promise that some day all this will be open source. The success of the micro:bit is dependent on how enthusiastically it is taken up by teachers. If they fail to respond to it, the micro:bit could end up as nothing more than an interesting curiosity.
In the previous entries, I had built a small section of LA, placed the Hollywoodland sign, and put some buildings and cars in there. Now it's time to think about the actual intro.
Talking to the director, he wanted the majority of the cast credited over the rain in the sky before sweeping through the streets of LA, with some of the cast and crew credited in shop windows. But that would require a huge section of LA to be built, so I decided to split the credits between cast and crew, with the crew being shown in the street somehow.
I decided to make a proper animatic, using the overture to help me time out the intro. The cast names were added as text overlays, and I added wipes in and out to give it a more retro look, and because cutting straight to the next name seemed very jarring. Once I had the main cast in there, I was able to check how long the camera would need to hold before I moved it to the theatre: approximately three minutes or so. This meant the camera is essentially static for nearly three minutes before moving to the Chinese Theatre, showing the crew in the street. There are 12 crew members to place in the street, but where? That's when I had the idea of using the star of Building LA part 1: the El Capitan Theatre.
The El Capitan Theatre is just down the street from the real Chinese Theatre and has a foyer with six posters in it. Looking at the foyer, you can see that one side is mirrored from the other.
Left Side posters
Right Side posters – note the text is backwards
What we want to do is replace the posters with six different posters – some with two crew members on each, and two with one each.
OK, so in 3ds Max I created a set of polygons on top of each poster. I then set out to find suitable posters that could be altered to add crew members, and spent ages removing type from the posters before adding the crew member names and credits:
I planar-mapped my altered poster images onto these newly created polygons.
I then made a camera move that swooped from the Hollywoodland sign down to the El Capitan Theatre, looking at each poster before swinging around and moving towards the Chinese Theatre. I animated some cars so they could be seen.
While I was working in GIMP/Photoshop, I decided to remove the shadow from the Chinese Theatre's front wall and swap out the poster for the film it was showing (300) for the one in the script, 'The Royal Rascal'.
It looked OK, but it really lacked pizzazz. Looking at the film, this is the establishing shot of the Chinese Theatre:
I decided I would replicate the same text: I used a font called K22 Spotty Face to make a texture that was mapped to a polygon in roughly the same place.
The intro was a massive animation in 3ds Max, which meant I had to tweak the particle system to show the rain. With all the objects in the scene, it became hard to actually update the display – it would take time for the screen to redraw, so previewing the results in real time was out of the question. Instead, I made a series of small preview renders at low quality to see how the animation was coming along.
Once timed, it was ready to render. Render time was four hours; after that I rendered it out to an .avi file, which I then dropped into my animatic and rendered out as a final piece of animation.
Phew! That's the hardest shot in the show out of the way – next up, a starry night sky with some rolling clouds.
PhonySession – now on nuget.org
In a previous post I wrote about needing to fake up an HTTP context for testing a controller, and I hinted that perhaps I would create a NuGet package to make this easier. Well, I am happy to announce that I have done just that.
NuGet is an open source package manager for Visual Studio and .NET. Developers can take code that they use and package it for distribution on the NuGet Gallery, giving other developers an easy way to reference and include functionality from libraries. Want to add a unit testing framework to your project? Reference it from NuGet. Visual Studio projects support automatically restoring dependencies from NuGet when loading, and NuGet packages can declare a defined set of dependencies of their own. I developed PhonySession partly because I wanted a framework I could potentially reuse, and also as an exercise in packaging my .NET library for NuGet.
PhonySession is a NuGet package that can be added to a test project, allowing a developer to 'fake up' an HTTP context so controllers can be tested without needing to stand up servers. Setting up the context is easy. PhonySession uses other NuGet packages, and the dependency management system will automatically add these dependencies to any project that PhonySession is added to.
PhonySession dependency graph
As always code is the key – so here’s an example of setting up a HTTP context, and adding a file from an embedded resource.
var prodcontroller = new ProductController();
var fakeHTTPSession = new TitaniumBunker.PhonySession.FonySession();
fakeHTTPSession.AddFileUpload(new PhonyUploadFile("Screenshot.jpg", GetResourceAsStream("TestAPI.img100.jpg"), "JPG"));
prodcontroller.ControllerContext = fakeHTTPSession.BuildControllerContext(prodcontroller);
prodcontroller.Url = new UrlHelper(fakeHTTPSession.BuildRequestContext());
This code fragment :
- Creates an instance of the ProductController – the controller we want to test
- Creates an instance of the FonySession object
- Adds a file to that session object
- Sets the controller's context to the output from BuildControllerContext
- Sets the controller's Url property to a new UrlHelper object.
So the eagle-eyed amongst you might notice that when I create a PhonyUploadFile, I specify "JPG" as its MIME type, when the correct MIME type should be "image/jpeg". My tests don't check the MIME type. There's no reason I couldn't check it – I just don't think there's much benefit, as the file upload functionality is not something provided by my application but by the framework.
It's not perfect – there are a couple of names that I should change (FonySession?) – but from a functionality point of view it all works. It may not support 100% of all test requirements (I wrote this to test a controller for a particular issue), and it might be that this framework will have to grow to include other functionality.
PhonySession is hosted on GitHub here and is licensed under an MIT Licence.
Icon Credits : “Disguise” logo by Helen Tseng is licensed under CC BY 3.0 US – downloaded from https://thenounproject.com/term/disguise/24240/
Having previously created the landscape of LA and placed the model of the Chinese Theatre for a flythrough for the intro of the Pied Pipers' production of Singin' in the Rain, the next thing I did was make some rain.
Makin’ it Rain
To make the rain effect, I created a simple Particle Flow system and altered its settings to look like falling rain. I then built a simple raindrop and made the particle system reference this one raindrop. I turned off random rotation so the raindrops head straight down. Particle systems are expensive in terms of redraw and update speed, so I will hide it until I need to render it.
A single raindrop
I then placed a camera in the scene, set up some controls for it and, after a bit of trial and error, made the beginnings of what could be an interesting camera move. Here's roughly what I had:
I then went on a download spree over on 3D Warehouse, looking at the collections of models in LA and Hollywood and checking against the map to find buildings next to the Chinese Theatre. Then I got some more generic buildings and some cars.
Changin’ the Format
Here's the next problem: 3D Warehouse has files in SketchUp format, and if you remember from last time, I don't have SketchUp. But 3D Warehouse also has the option to download in Google Earth's model format – .kmz.
I renamed each of the .kmz files to .zip before extracting them to new folders. Each folder has a Collada file and a folder of supporting files. Luckily, 3ds Max will import .dae files.
Except when it doesn't – these files simply crashed 3ds Max. So I tried a different approach: I fired up a copy of Blender and imported the .dae file. It loaded, so I exported out to .fbx.
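Since a .kmz is just a ZIP archive, the rename step can be skipped entirely if you script the extraction. A quick Python sketch (the file and folder names here are illustrative):

```python
# A .kmz file from 3D Warehouse is just a ZIP archive, so Python's
# zipfile module can unpack it directly, no renaming to .zip required.
import os
import zipfile

def extract_kmz(kmz_path, out_dir):
    """Unpack a .kmz archive into out_dir and return the names of any
    Collada (.dae) files it contained."""
    os.makedirs(out_dir, exist_ok=True)
    with zipfile.ZipFile(kmz_path) as archive:
        archive.extractall(out_dir)
        return [name for name in archive.namelist() if name.endswith(".dae")]
```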
El Capitan Theater in Blender
After a little trial and error, I managed to get a model into 3ds Max! Behold: the El Capitan Theatre!
The same El Capitan Theater model in 3DS Max
Hmmm. Looks a little strange. Let's take a look at the 3D model in the browser again:
The El Capitan model in 3D Warehouse
Yeah, it looks like as part of the .dae import, Blender welds/deletes faces and/or verts – it seems the Collada importer is not very well supported in Blender. So, back to the drawing board. This time I tried importing the .dae into Maya and straight back out to .fbx, and then back into 3ds Max.
El Capitan Theatre, converted via Maya
Ok, that’s looking good. Let’s go ahead and put some buildings in there!
Buildin’ the town
I dropped in some generic-looking buildings on Hollywood Boulevard, built some very simple buildings (literally cubes with texture maps on them), and made a starry night sky dome.
Then I found a Model T Ford on 3D Warehouse. I downloaded it, got rid of the open bonnet, and merged the objects to make four new objects: the car body, the back wheels, and two front wheels.
I then modelled two low-poly cones and fitted them to the headlamps. These cones were given a gradient in their diffuse and opacity channels to make them fade out, in effect faking the light cone you see from headlamps in fog.
Model T Ford with headlamps
I then parented all of this to a dummy, made copies of the dummy, and animated them driving down Hollywood Boulevard.
Paintin’ the town
The opening is in black and white, so I decided I would have to tweak all the textures. I made a Photoshop action to convert an image to grayscale and ran it as a batch on a copy of the textures (if I ever want to swap them back, it should be easy).
The only building I am going to worry about is Grauman's Chinese Theatre itself. Its current textures are promoting the film 300, so I need to change them to promote the film in the script.
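I used a Photoshop action for the batch, but the same conversion could just as easily be scripted. Here's a sketch in Python using the standard ITU-R BT.601 luma weights, which is roughly the weighting a grayscale mode applies (the function name is my own):

```python
# Convert RGB pixel values to grayscale using ITU-R BT.601 luma weights,
# roughly the weighting an image editor's grayscale conversion applies.
def to_grayscale(pixels):
    """pixels: iterable of (r, g, b) tuples in 0-255; returns gray values."""
    return [round(0.299 * r + 0.587 * g + 0.114 * b) for r, g, b in pixels]

print(to_grayscale([(255, 255, 255), (0, 0, 0), (255, 0, 0)]))
```

Running that per pixel over each texture file (via any imaging library) would reproduce the batch, and keeping the originals in a separate copy makes swapping back trivial, just as with the Photoshop action.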
Lightin’ the lights
Lighting this set was hard, mainly because I suck at lighting. The first thing I did was make a large blue spotlight toward the top of the skydome. This represents the moonlight.
I created some large omni lights at the edges of the planes and set them to white with -1 intensity, which sucked the colour out of the scene, darkening the edges of the map. When the camera's moving, you hopefully won't notice the square edges of the landscape.
I then stuck a series of lights on each car (in hindsight, I should have lit the cars first and then animated them): a couple of low-intensity omni lights in front of the car and a free spot to represent the headlights.
I found a streetlamp model on 3D Warehouse and created a couple of lights for it – an omni light for the glow and a spot pointing downwards. These were parented to a dummy, which I was then able to duplicate 20 times down Hollywood Boulevard.
A streetlamp, lit, rigged and ready to duplicate
A lot of the buildings in my section of Hollywood Boulevard have lit signs. I will make those textures self-illuminated.
Next up: balancing textures and animating the camera move.
I have been asked to create a series of video projections for The Pied Pipers Musical Theatre Group’s upcoming production of Singin’ in the Rain. Since I won the NODA national poster competition I have been pretty busy. At the moment, everyone wants me to design backgrounds that are to be projected for 3 different shows.
Singin' in the Rain is first – after getting some information from BAWDS, I was able to work out a frame size.
First challenge: the director wanted a fly-through of 1920s LA to Grauman's Chinese Theatre.
Well it looks like SketchUp Make could solve this problem. I could use it to generate terrain and place buildings accurately – or so I thought.
Thing is, right now I don't use Windows at home. I have an ancient Mac mini, so I tried installing SketchUp Make on that. The latest version installed, only to tell me on starting that the OS was the wrong version and so it wouldn't run – thanks, Apple.
Linux doesn't have a version, so I tried to install SketchUp under WINE. This installed and ran (sort of), but there was no 3D view. So in desperation I contacted Mike, who had a Windows laptop and was able to extract chunks of landscape, the Hollywood sign and the Chinese Theatre to a number of .3ds files.
First up – let’s make the land.
Opening up a copy of 3ds Max one lunchtime, I was able to import the landscape. As part of that, Mike had very accurately placed the Hollywood sign for me. I also had 30 or so chunks of land.
Terrain Data extracted from SketchUp
Opening Photoshop I loaded the textures into a large file and using the 3D view as a guide, arranged the textures together:
Then I went through each layer, cropping out the footer display to create a large texture that could be mapped onto all the planes at the same time.
Texture when cleaned up
Back in Max, I combined all the landscape planes together and then made another plane above the land with roughly the same density as the merged plane object. I then applied a Conform space warp to the object, and suddenly my new plane was fitting snugly over the Hollywood hills.
Combining the landscape planes
I now had a single surface with no coplanar polygons. I made a snapshot of this mesh and deleted the faces and vertices that hadn't been affected by the conform operation.
Single surface generated from the terrain data
Slap the texture on and VOILA!
Texturing the landscape
It's not 100%, but it's a lot quicker than manually editing and welding thousands of verts and performing countless STL checks.
a quick render:
Rendering the Hollywood hills
And I was ready to swap out the HOLLYWOOD sign for the earlier HOLLYWOODLAND sign. I fudged it a bit, but it still looks pretty good:
Next up, I will add the Chinese Theatre and then populate the rest of LA with generic buildings.
Thomas thought that testing the controller would be a good thing.
I'm working on a home project at the moment – it's an MVC/Entity Framework based project – and I have been stumped for the past three weeks on something: how can I test it?
Yes, finally, after nearly three years of development and work, the Snail Tales project is finished. I had actually finished it late last year but decided to get Christmas and New Year out of the way before releasing it to Snail Tales.
Here’s the finished film:
I will be collating all the character and background files and creating a public repository for them
On paper it seems an awfully long time to make a piece of animation. But as well as the games I made as part of my job, I moved house, got engaged, had to learn how to use Synfig, and got S-Cargo and the continuous integration system working.
I recorded my presentation at OggCamp late last year – I will upload that shortly. In the meantime, here’s the presentation I did the year before, detailing how Synfig Stage and continuous integration will work:
Windows 10 IoT on a Raspberry Pi 2
I've just installed Windows 10 IoT edition on a Raspberry Pi 2 – and I have my suspicions that this isn't necessarily the same operating system that I am running on my laptop.
Let’s have a bit of context
The challenge to IT – and indeed to us developers – is that users are no longer experiencing our applications or operating systems on the boring beige box like they used to. People are just as likely to use their phone or a tablet as a laptop or a desktop to run that shiny new app. This is something we need to consider when designing the user experience for our products – be they operating systems or applications.
So Microsoft has been working towards unifying their operating systems – which is why you can now get Windows 10 running on mobile phones such as the Lumia 550 or Lumia 950, and even on the Surface product range. I believe Microsoft are talking here about the main kernel for their OS rather than the whole OS: a window manager for desktop PCs would be very different from a window manager for mobile phones or tablets. Except that I don't believe they have gotten around to their IoT offerings yet.
Microsoft's process for interacting with Raspberry Pis and turning them into IoT devices is simple enough. Instructions and write-ups for the Raspberry Pi can be found here, but Windows 10 IoT also supports the MinnowBoard MAX and the Qualcomm DragonBoard 410c. With DragonBoards costing roughly £60 and MinnowBoards going for roughly £100, the obvious choice is a Raspberry Pi costing £30. I happened to have a spare Raspberry Pi hanging around, so I thought I'd give it a go. I set up the Pi using the IoT Dashboard tool.
After my machine had booted, I clicked around on the web based dashboard, looking at my wondrous new machine that was connected to the TV in the other room.
Yes – so that's definitely my Raspberry Pi 2 running Windows 10.
Clicking on some of the options, I found a debug page which listed two errors – which I thought was strange, considering that the device was doing nothing.
Out of the box – 2 errors. Hardly inspires confidence.
Being the inquisitive sort, I clicked on the first error, which, confusingly, is at the bottom of the list:
1/21/2016, 10:50:46 PM | Critical | unknown | a0795c9de6cdb1f43e165a29b7f6d42caeb2b
Clicking on the Name took me to a detailed screen showing more information about the error :
Confused by an obscure error? This page will help clear it right up.
The friendly name for that error : WindowsPhone8ExecManService
I'm not a gambling man, but I reckon that all Microsoft have done here is compile their Windows Phone 8 OS for the ARM chip on the Raspberry Pi and replace the front-end string resources with "Windows 10". For those interested, the chip in the Raspberry Pi 2 is a quad-core ARM Cortex-A7 – the same processor running in the Microsoft Lumia 550, which, while based on the Snapdragon 210 SoC, has ARM Cortex-A7 cores.
I was able to find something for the WindowsPhone8ExecManService error on Stack Overflow:
The value EM_WATCHDOG_TIMEOUT likely indicates that you have blocked the UI thread with a long running piece of code or a wait of some description.
Mark Radbourne [MSFT]
Digital sign in error at Millennium Point
Adding new hardware to a digital sign?