Samiam’s Scribble Pad

May 19, 2018

My Left Arm and the Royal Wedding

Filed under: Uncategorized — admin @ 5:34 pm

Everything old is new again. My left arm has been in the wars since 1980. It has taught me a lot and I still need it. It’s hard to know where to start this story or where it ends. Basically, in 1980 I was an average eight year old kid attending primary school in the eastern suburbs of Adelaide. My parents had split up a few years earlier and my father took up a job in North Africa, teaching the nomadic farmers around Ksar Chellala in Algeria about modern farming practices of raising livestock, such as inoculation against disease and the like. As a stock inspector with the South Australian department of agriculture this didn’t seem out of the ordinary. Foreign aid in agriculture in the eighties was a lucrative venture for those willing to travel. My two siblings and I stayed in touch with our father abroad via audio cassettes so we could hear each other’s voices, albeit with six weeks’ latency. Now we get annoyed by a few milliseconds on Skype.

So where does the left arm fit in with this story? The three Hodge kids were packed off in nice fresh tracksuits and travelled unaccompanied from Adelaide airport to Heathrow. We were well cared for by Qantas flight staff and were greeted by our father at the other end. But it was quite the journey for children who were used to flying from Adelaide to Kingscote Airport on Kangaroo Island, a twenty minute flight that never leaves the state. Once in London, we met our second cousins and enjoyed sights of royalty and pretty gardens. Bath, Oxford and Brighton followed. A trip over the Channel in a hovercraft led to a train trip from the coast to Paris, where we found the Eiffel Tower and the Louvre, before heading to the south of France to Ville de France, where Dad had learned French before heading to Algeria. We also did a trip around the south of Ireland in a caravan drawn by a patient Clydesdale, which is where my left arm gets to be part of the story: bringing the horse in from the stable where it was tied up for the night.
The three Hodge kids were riding the horse, which has a very wide back in comparison to an eight year old’s legs. Dismounting from an eight foot high horse resulted in me fracturing my ulna and radius. Dad and I were rushed to Dublin in an ambulance, leaving my elder brother and sister with the host family; apparently they had a wonderful time playing with an Apple Lisa computer, so all jokes about the Irish being backwards are lost on me. Anyway, I can’t remember much about the hospital apart from an intramuscular injection in my rump and meeting a kid in the bed next to me who insisted on saying his name was Sydney, which was just a way of teasing me because I was Australian. How was I to know that the inhabitants of Our Lady’s Hospital for Sick Children were just trying to be social? To this day I have a terrible phobia of needles. Anyway, we left the hospital in a cast, and after a checkup in London we headed off to the arid plains of Algeria, to a compound of expats in Ksar Chellala. We met nomadic elders in Bedouin tents who had lived this way for centuries, herding their flocks and travelling with an extended family and the amazing Arabian horses. We travelled to Roman ruins in Tunisia and saw the forts of the French occupation and the Muslim culture. It was very enriching for a kid from East Adelaide Primary School whose family on both sides were humble wool farmers.

That being said, the plaster cast wore loose, and when the cast was removed in London the doctor decided that muscle would grow over my second elbow and all would be well. The orthopaedic surgeon at the Women’s and Children’s Hospital had a different opinion. I had it reset under anaesthetic and had a metal pin put down the length of my ulna. It stayed there for a good twelve months before it was removed in a second surgical procedure. So that was two summers in plaster.
Then, the winter after, I fractured my elbow playing football. I still managed to ride my bicycle home after the match, favouring the other arm for steering. Always the left arm. I think it was 1981 that I had the surgery, and I was reminded last night that this was the year of the Charles and Diana royal wedding. So why the storytelling now? On Thursday I had another procedure on my left arm. A few more bike falls in my mid forties left scar tissue, so my ulnar nerve is no longer getting the signal through to my pinky and ring finger. So it is time to fix that up and fish out about five bone nodules on the front of the joint. So tonight Charles and Di’s youngest is marrying his heartthrob from the States, and nothing has changed: I still have a bung left arm. But my fracture taught me to say “J’ai tombé d’un cheval” when the Arabs stared at my cast and asked me what happened. I am still scared of needles, and the nurse had to hold my hand when I had the drip put in on Thursday. Some things never change, but knowing the world is bigger than a few suburbs of Adelaide when you are eight years old gives you a perspective whose wisdom doesn’t show in a few scars on your left arm. Now my biggest fear is that I won’t be able to type properly anymore. But this came out OK with just one right thumb and an iPhone.

March 24, 2018

Let’s have a thought about Net Neutrality and you…..

Filed under: Uncategorized — Tags: — admin @ 7:48 am

OK, so the FCC has decided, and net neutrality is dead.

But for the most part it has had no effect on your life: Facebook still works, Netflix still works, so what is the problem?

The problem is that the internet is supposed to be free, not just for some but for everybody.

This became a first-hand problem for me over the past week. I have been using the internet to make my living since about 1995.

The thing is, you have a connection to the internet, via a dialup modem, university network, ADSL, whatever, and then from there you connect to the “rest of the internet”.

Back a long time ago, the infrastructure hosting the other end of the internet sometimes mattered: if you were connecting to the protein database and it was down for maintenance, you had to do something else until it was back up. But you didn’t need to think too hard about how you connected to it; you just put in the internet address and, zoom, you were there. Obviously this was possible because lots of network engineers had paved the way with their souls, but apart from that it was pretty much free to use as you pleased.

Zoom forward from 1995 to 2018, where we are no longer in a net-neutral world. I am trying to get some source code. My internet connection from my house to the Croydon exchange is fast enough: it syncs at 19 Mbps down and 1 Mbps up. But where does it go from there? Obviously it goes to my ISP, who will remain nameless, and then it goes one hop at a time until it reaches the repository. When it does so, it goes there over “public roads” and “toll roads”. The issue is that if my traffic were destined for a big content provider, there might be some peering arrangement set up with my internet service provider so that traffic will always be nice and quick. Hence the average internet user will say ISP X is great, my Netflix never stutters, join ISP X. So this change of network structure, from allowing the traffic to flow freely to being in partnership with the provider of traffic, means the little guys are at a disadvantage to the big guys, and it becomes a user-pays system. I get it, nothing in the world is free.

Why am I having a sulk about this, capitalism isn’t new? Well, it is 7:12 am and I want to do some study of Mask R-CNN in PyTorch, which is all out there and freely available. The great Library of Alexandria hasn’t burned yet, knowledge is free, this is great, but I cannot get to it because net neutrality is gone. I am waiting for a download of PyTorch 0.3 with CUDA 8.0 for Ubuntu, and the server it is hosted on isn’t one that gets popular votes from my ISP as a cool and groovy place, so the ISP won’t pave the road with gold to give me fast connectivity. So instead I am back at dialup speed trying to expand my knowledge. This is how losing net neutrality harms the community. I can’t learn because of toll roads on the internet.

I am not a happy camper. When I got in contact with my ISP about it, they said they cannot guarantee anything outside of their network.

Anyway, bugger it, I will just go for a walk, let it download and hope for the best.

But when it is a 4.5 GB download from somewhere that will not resume, I am out of luck. I will have to go back to the underground sneakernet of the late eighties, where the people who have the data can be contacted one at a time.

The internet is dead.


June 24, 2017

The robots are coming to take your jobs… welcome to last century.

There has been a lot of hype in the press about the advent of artificial intelligence being able to automate away jobs.
The thing is, if I were in the job of being a bullock driver delivering sacks of wheat, or a schooner sailor doing the same, I really ought to think about my career options.
In my own careers I have seen and even been responsible for automation reducing the labour force.
The first example was in my previous career in Molecular Biology. I got my first break in the field helping out a wonderful woman, Michelle Walker. She worked part-time in the JCW lab at the Uni of Adelaide. She was pregnant with her second child and as a result she couldn’t work with the radioisotopes that you use to sequence DNA, namely phosphorus-32 attached to adenosine triphosphate. Nothing that nasty, a beta emitter with a half-life of about 14 days. But while you have a child on board it isn’t worth taking the risk.
Anyway, Michelle and John Cronan trained me up on how to sequence DNA. You need to understand what you are doing, but in the end it’s not much more complicated than cleaning glass and mixing up reagents, like baking a chocolate cake. It does have a certain amount of craft to getting it right. Bubbles between two layers of glass separated by 0.3 mm, filled with a liquid polyacrylamide solution with a reagent that will harden in a few minutes, make it tricky. Needless to say, after your twentieth run you get good at pouring gels, timing the polymerase reaction and stopping it with the four letters of the genetic code, and putting them in the correct order in the lanes of the gel. On top of being careful with the hot (radioactive) reagents.
I did all this on a volunteer basis so I could get lab experience, because we were in the middle of the recession that we had to have and I had yet to graduate.
Anyway, sequencing DNA was a laborious process, taking about 6-12 hours of fairly specialised labour to get about six hundred base pairs of DNA sequence. Given that we were sequencing the genomic DNA of yeast for pyruvate carboxylase, that meant the dead parts of the DNA sequence that don’t make the enzyme as well as the important bits that do. It could add up to hundreds of hours of work.
Let’s fast forward three short years. I graduated with Honours from the same JCW lab and got a gig with Prof Peter Hoj out at the Waite Campus. I had cloned some modified genes into a bacterial plasmid and it was time for sequencing. So I was all ready to go: clean the glass, mix up my reagents and get busy doing some DNA sequencing to confirm that I had got what I thought I had made with the mutagenesis of a barley gene.
This is where the automation kicks in. We don’t do our own sequencing any more; we send it to a central service which uses a machine, and you get back the sequence on a floppy disk: a perfect 1200 base pairs, done by a machine.
So to summarise, Sam the DNA sequencer gets replaced by a machine.
OK, onto another career. I worked on a sequence in Harry Potter six, the Half-Blood Prince. I was working as a lighter on the room transformation. The sequence has about forty animated props, the plate, the actor, a matchmove for the actor, and the lighting as a 360° lat-long image. Nothing that complex by today’s standards, but this was back on a 32-bit machine where you can only access 4 GB of RAM at a time. Lighting this shot used to take quite a process. I would calculate the ray-traced passes in advance, store them on disk as a sequence of images, and then look up those images as a cache of the calculations, to overcome the memory limitation of the computer. This process was done by me by hand each time the shot was run, along with splitting it up into passes: foreground, background, etc., and a holdout for the actor. So each time the animation updated, it would take me about four hours of jiggery-pokery to run the rendering calculation on a bank of computers called the render farm. A lot of skill was required to memorise all the steps involved.
Fast forward seven years, and now we have automated the lighting process. Building a new version of the animation can take two minutes to build and submit to the farm, meaning you can work on about ten times as many shots as a lighter as you could seven years ago.
So to summarise automation reduced the amount of labour required in the computer graphics industry.
The bit that I left out is that I am responsible for making sure we get the most out of the automation of computer graphics production. That is my job: I make the robots that are coming to take away your jobs.
Anyway, if I were a truck driver at the moment, I would be considering the shots in the movie Logan of autonomous trucks on the highway a sign of the future. Don’t fret. It happened to the schooner sailor in the wheat belt of South Australia some time ago. I am sure they found something else productive to do.

April 30, 2016

The refound lost artform of Slack

Filed under: Uncategorized — admin @ 8:14 am

I was just thinking back over my career as a digital artist in the Multimedia and Film Visual effects industry.

Back in 1999 we were short on modelling resources on a project, so we hired a bloke who will go by the code name Chappy.

Chappy was a nice enough bloke, but had basically been long term unemployed and was overjoyed at the possibility of having fulltime employment.

He really overdid it, coming to work in an evening suit covered in cologne on his first day. This was a little bit of a warning sign that he didn’t really understand the norms of the digital workplace; for those not in the know, a t-shirt, hoody, jeans and sneakers are pretty much the uniform.

Anyway we overlooked his attire and put him to work in the tools that he claimed that he had the skills to operate.

Things were amiss: what he claimed to be able to do and what he was delivering were a little out of whack. Anyway, we thought that with training Chappy could attain the skills to get the job done. We gave him the benefit of the doubt.

Then another thing came up that rang some alarm bells. Remember, this is 1999, so there is no streaming YouTube on your desktop. Chappy had formed a TV soap addiction while long-term unemployed. Something I am happy to admit to: taping episodes of the Bold and the Beautiful on VHS so we could keep up with the happenings of the fashion industry in daytime soap land. Anyway, this didn’t occur to Chappy. He thought he could watch it while he was at work, so he brought in his mini TV, set it up on his desk and watched while he was at work. Not during an allocated TV-watching slot made available as company policy; he just helped himself to some company time to feed his media habit.

So the alarm bells were ringing. Chappy was not good at his job, and he was using company resources, the time we were paying him to do modelling, for his own benefit, feeding his media habit. Not unlike a cashier taking some resources out of the till for their own benefit. We decided that this wasn’t what the company needed, so we had to let Chappy back into the dole queue to feed his media habit at the taxpayer’s expense rather than the company’s.

So let’s fast forward 17 years to 2016, and I am still pretty much doing the same thing: making media for companies to sell to consumers. Rather than multimedia for the fitness industry, it is visual effects for the Hollywood film industry.

Now we have lots of people with media habits. The media isn’t a daytime soap opera; it is thousands of things: social media, podcasts, TV episodes, digital newspapers, opinionated bloggers, etc. So while we are at work there are little blocks of time, fragmented so small that you can take a micro-break and take in one of these many forms of media. When I was working in Molecular Biology, if you were incubating something for 20 minutes and your lab book was up to date, why not take in the view and have a cigarette? So I have nothing against taking little breaks to punctuate the day.

But things have got so out of hand.

People will spend up to six hours of the day consuming media content while “working”. I have listened to music while working; if it is an album I have known for many years, it is just comforting noise and doesn’t occupy my thoughts. But when your eyes are watching footage and you are listening to dialogue, I cannot fathom how this couldn’t disrupt your concentration. That said, people have listened to talkback radio in the workplace for decades, so maybe I am barking up the wrong tree.

Anyway, nowadays we are going to a workplace with an Internet blackout for security reasons, and people see it as some sort of injustice that they cannot take their little micro-breaks or extended binges. Yet back in 1999 we had to let someone go for pretty much the same reason.

Why have the standards changed so much?

How much do our media habits cost the workplace?


December 3, 2015

MtoA OpenEXR Meta Data via Python Scripting Arnold and Maya EXR

Filed under: Uncategorized — admin @ 1:04 pm



Basically this post is a call for help, and once I have it solved it will become a reference for anybody else trying to do the same thing in the future.

In the Maya Arnold Render Globals there are some fields to Add MetaData with a “Name”, “Type” and “Value” field available.

Which is cool if you are clicking, but if you want to push these buttons with code, what are the attributes called, and what are the methods to set them?

So basically, the answer is usually documented in the source:


# Excerpt from MtoA's customShapeAttributes.py; the original module assumes
# something like: import maya.cmds as cmds, import pymel.core as pm,
# mtoa.ui.ae.templates as templates, and a module-level list templatesNames = []
class EXRDriverTranslatorUI(templates.AttributeTemplate):
    def changeAttrName(self, nodeName, attrNameText, index):
        # Get the attribute name, type and value
        attrName = nodeName+'['+str(index)+']'
        metadata = cmds.getAttr(attrName)
        result = metadata.split(' ', 2 )
        result += [""] * (3-len(result))
        # Get the new name
        name = cmds.textField(attrNameText, query=True, text=True)
        # Update the name in all the templates
        templatesNames[:] = [tup for tup in templatesNames if cmds.columnLayout(tup, exists=True)]
        for templateName in templatesNames:
            cmds.textField(templateName+"|mtoa_exrMetadataRow_"+str(index)+"|MtoA_exrMAttributeName", edit=True, text=name.replace(" ", ""))
        # Update the metadata value
        metadata = result[0]+" "+name.replace(" ", "")+" "+result[2]
        cmds.setAttr(attrName, metadata, type="string")
    def changeAttrType(self, nodeName, menu, index):
        # Get the attribute name, type and value
        attrName = nodeName+'['+str(index)+']'
        metadata = cmds.getAttr(attrName)
        result = metadata.split(' ', 2 )
        result += [""] * (3-len(result))
        # Get the new type
        typeNumber = cmds.optionMenu(menu, query=True, select=True)
        type = cmds.optionMenu(menu, query=True, value=True)
        # Update the type in all the templates
        templatesNames[:] = [tup for tup in templatesNames if cmds.columnLayout(tup, exists=True)]
        for templateName in templatesNames:
            cmds.optionMenu(templateName+"|mtoa_exrMetadataRow_"+str(index)+"|MtoA_exrMAttributeType", edit=True, select=typeNumber)
        # Update the metadata value
        metadata = type+" "+result[1]+" "+result[2]
        cmds.setAttr(attrName, metadata, type="string")
    def changeAttrValue(self, nodeName, attrValueText, index):
        # Get the attribute name, type and value
        attrName = nodeName+'['+str(index)+']'
        metadata = cmds.getAttr(attrName)
        result = metadata.split(' ', 2 )
        result += [""] * (3-len(result))
        # Get the new value
        value = cmds.textField(attrValueText, query=True, text=True)
        # Update the value in all the templates
        templatesNames[:] = [tup for tup in templatesNames if cmds.columnLayout(tup, exists=True)]
        for templateName in templatesNames:
            cmds.textField(templateName+"|mtoa_exrMetadataRow_"+str(index)+"|MtoA_exrMAttributeValue", edit=True, text=value)
        # Update the metadata value
        metadata = result[0]+" "+result[1]+" "+value
        cmds.setAttr(attrName, metadata, type="string")
    def removeAttribute(self, nodeName, index):
        # Body missing from this listing; presumably it removes the
        # multi-instance element and refreshes the UI
        pass
    def addAttribute(self, nodeName):
        next = 0
        if cmds.getAttr(nodeName, multiIndices=True):
            next = cmds.getAttr(nodeName, multiIndices=True)[-1] + 1
        cmds.setAttr(nodeName+'['+str(next)+']', "INT", type="string")
    def updateLine(self, nodeName, metadata, index):
        # Attribute controls will be created with the current metadata content
        result = metadata.split(' ', 2 )
        result += [""] * (3-len(result))
        # Attribute Name
        attrNameText = cmds.textField("MtoA_exrMAttributeName", text=result[1])
        cmds.textField(attrNameText, edit=True, changeCommand=pm.Callback(self.changeAttrName, nodeName, attrNameText, index))
        # Attribute Type
        menu = cmds.optionMenu("MtoA_exrMAttributeType")
        cmds.menuItem( label='INT', data=0)
        cmds.menuItem( label='FLOAT', data=1)
        cmds.menuItem( label='POINT2', data=2)
        cmds.menuItem( label='MATRIX', data=3)
        cmds.menuItem( label='STRING', data=4)
        if result[0] == 'INT':
            cmds.optionMenu(menu, edit=True, select=1)
        elif result[0] == 'FLOAT':
            cmds.optionMenu(menu, edit=True, select=2)
        elif result[0] == 'POINT2':
            cmds.optionMenu(menu, edit=True, select=3)
        elif result[0] == 'MATRIX':
            cmds.optionMenu(menu, edit=True, select=4)
        elif result[0] == 'STRING':
            cmds.optionMenu(menu, edit=True, select=5)
        cmds.optionMenu(menu, edit=True, changeCommand=pm.Callback(self.changeAttrType, nodeName, menu, index))
        # Attribute Value
        attrValueText = cmds.textField("MtoA_exrMAttributeValue", text=result[2])
        cmds.textField(attrValueText, edit=True, changeCommand=pm.Callback(self.changeAttrValue, nodeName, attrValueText, index))
        # Remove button
        cmds.symbolButton(image="SP_TrashIcon.png", command=pm.Callback(self.removeAttribute, nodeName, index))
    def updatedMetadata(self, nodeName):
        templatesNames[:] = [tup for tup in templatesNames if cmds.columnLayout(tup, exists=True)]
        for templateName in templatesNames:
            #Remove all attributes controls and rebuild them again with the metadata updated content
            for child in cmds.columnLayout(templateName, query=True, childArray=True) or []:
                cmds.deleteUI(child)  # assumed: this line was dropped from the listing
            for index in cmds.getAttr(nodeName, multiIndices=True) or []:
                attrName = nodeName+'['+str(index)+']'
                metadata = cmds.getAttr(attrName)
                if metadata:
                    cmds.rowLayout('mtoa_exrMetadataRow_'+str(index),nc=4, cw4=(120,80,120,20), cl4=('center', 'center', 'center', 'right'))
                    self.updateLine(nodeName, metadata, index)
    def metadataNew(self, nodeName):
        cmds.rowLayout(nc=2, cw2=(200,140), cl2=('center', 'center'))
        cmds.button( label='Add New Attribute', command=pm.Callback(self.addAttribute, 'defaultArnoldDriver.custom_attributes'))
        cmds.setParent( '..' )
        layout = cmds.columnLayout(rowSpacing=5, columnWidth=340)
        # This template could be created more than once in different panels
        templatesNames.append(layout)  # assumed: the listing dropped this line
        cmds.setParent( '..' )
    def metadataReplace(self, nodeName):
        # Body missing from this listing
        pass
    def setup(self):
        self.addControl('exrCompression', label='Compression')
        self.addControl('halfPrecision', label='Half Precision')
        self.addControl('preserveLayerName', label='Preserve Layer Name')
        self.addControl('tiled', label='Tiled')
        self.addControl('autocrop', label='Autocrop')
        self.addControl('append', label='Append')
        self.beginLayout("Metadata (name, type, value)", collapse=True)
        self.addCustom('custom_attributes', self.metadataNew, self.metadataReplace)

So after that it isn’t rocket surgery.

There is a compound attribute with strings in it, called “custom_attributes”, as seen in the screenshot below.


So if you want to bang another value into that custom attribute, you can:

import maya.cmds as cmds

# Print the existing metadata attributes
for i in cmds.getAttr("defaultArnoldDriver.customAttributes", mi=True):
    print(cmds.getAttr("defaultArnoldDriver.customAttributes[%d]" % i))

def addAttribute(nodeName, value):
    """Append a metadata entry as a single "TYPE name value" string."""
    next = 0
    if cmds.getAttr(nodeName, multiIndices=True):
        next = cmds.getAttr(nodeName, multiIndices=True)[-1] + 1
    cmds.setAttr(nodeName+'['+str(next)+']',
                 "%s %s %s" % (value["type"], value["name"], value["value"]),
                 type="string")

# Call the function above
example = {"type": "FLOAT", "name": "example", "value": 2.0}
nodeName = "defaultArnoldDriver.customAttributes"
addAttribute(nodeName, example)

# Print the attributes again after adding
for i in cmds.getAttr(nodeName, mi=True):
    print(cmds.getAttr(nodeName + "[%d]" % i))

# Attempt to update the GUI (module path assumed; it was stripped from the original)
from mtoa.ui.ae import customShapeAttributes
c = customShapeAttributes.EXRDriverTranslatorUI('aiAOVDriver')

But that doesn’t update the GUI, though it may make an EXR with metadata.

So I think I need to do it with a little more care.

But in reality the value is there in the data structure; the GUI will be updated if you just add and then delete an element, because that will call the updatedMetadata method in the class above.

So I guess I could just call that by myself.

If only I knew how to get at the method of a callback on an AE template installed by a Python class.

Basically that is the start of the reverse engineering.

Will update once it is properly solved.

Also, this doesn’t enable me to set these values with an expression per frame, but that shouldn’t be too tricky to do.
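As a starting point, the per-frame part is mostly string formatting, since MtoA stores each entry as a single “TYPE name value” string. A sketch (the attribute index and the callback hook below are my assumptions, not something I have tested):

```python
def frame_metadata_entry(frame):
    # Each customAttributes element is one string: "<TYPE> <name> <value>"
    return "INT frame %d" % frame

# Inside Maya this would be wired up to a pre-render or time-changed
# callback, something like (index 0 assumed to be a free slot):
#
#   import maya.cmds as cmds
#   cmds.setAttr("defaultArnoldDriver.customAttributes[0]",
#                frame_metadata_entry(int(cmds.currentTime(query=True))),
#                type="string")
```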

September 12, 2015

Hypocrisy and the Fully Connected Graph of the Internet and the First World Problem

Filed under: Uncategorized — admin @ 11:30 am

While discussing this article


Yesterday I proposed that there is a symptom of overworking that causes the pace of life to be too fast, which can make you ill. While pulling weeds by hand and watering the garden at the community plot, I found the panacea for this. Quite simply, it is to go slowly while you are not working and allow your thoughts to decompress. Albeit a first world problem.


But what annoyed me about the discussion of this on social media was that it was immediately dismissed as a first world problem. Which leads back to the tokenism of growing a handful of vegetables in an organic plot, where the saving grace of pulling weeds out by hand is the solution to the problem of working too hard. Obviously the success of technology means that food production is automated, which should give us more leisure time to pursue other things of interest, rather than subsisting on a third world hand-to-mouth existence of food production and consumption by the simplest means.


But how ironic it is to call out a problem on the Internet as a “first world problem”. In the third world, is there a fair trade semiconductor network for your self-righteous thoughts to travel the internet on?


Then I thought more about this issue: without being a hypocrite, how can you use the term “first world problem” on the Internet?


Basically, you would need to make sure that all of your data packets were travelling on computers, routers, cables and satellites that were made in factories with fair trade agreements, installed by workers who were union members. The materials would also need to be mined from lands where respects had been paid to the traditional owners. You would need to make sure that there was no history of tyrants governing the people who mined, refined and manufactured the components of the Internet. You had better hope the IT technician who solved the routing issue wasn’t getting paid any more than the person working in the mine that produced the silica for the CPU of the router being used. Otherwise it isn’t really fair trade.


Oh that is right, the Internet is a product of the first world, so the Internet itself IS a first world problem.


So until we get an Internet of trained axolotls that carry messages on their ankles of their own free will, please don’t use “first world problem” as a put-down without considering that you are being massively ironic.

August 23, 2015

Replacing Capacitors in a Cisco 877

Filed under: Uncategorized — Tags: — admin @ 12:06 pm

Just when you think you have reached the level of maximum geekiness, you step it up a notch.

I work making visual effects for Hollywood movies, that is pretty geeky.

I am a mature aged student studying Computer Science, reading pure math on a Sunday, that is pretty geeky.

I have participated in a community wireless networking group, Air-Stream; in fact I designed their logo. That is pretty geeky.

As a result I happened upon having a DSL router, that is made by Cisco that brings the internet into my home.

I bought it second hand off ebay in November 2014.

Installing that and being able to navigate Cisco’s operating system IOS is a pretty high level of geeky.

But then said modem starts playing up, and switching it off and on again gets too annoying.

You COULD throw it away and get a new one.

OR you could google the hell out of the issue and find out about which capacitors are worn out.

see this thread here:

So this afternoon I will head to the local Jaycar and see if I can get my hands on some 6800µF 105°C 6.3V 15mm aluminium electrolytic capacitors for under $5 each.

Then on Wednesday night, I am going to turn up to the local hackspace after work and see if I can have a crack at soldering in the new parts.

Then I will be back to the previous glory of this

Why just have the internet when you can solder your own internet together?

Today’s life is all too prefabricated.

I got a recipe for coconut and pumpkin soup.

The recipe asked for a can of pumpkin.

I only had a pumpkin grown in the neighbour’s back yard given to us as a gift.

Needless to say, I was still able to make the recipe without the pumpkin being put into a metal can and shipped halfway around the world.

Make your own pumpkin soup, make your own internet better.

May 23, 2015

Steering behaviour for wander as 1D noise of an angular value

Filed under: Uncategorized — admin @ 12:29 pm

Found a cool tute on a wander behaviour for my cat simulator (the Tuts+ “gamedev-1624” steering-behaviours article).

This is a classic ordinary differential equation setup, solving for 2D position and velocity.

I was thinking about this.

And it looks like a lot of steps to produce a fairly continuous angular rotation with a random nature to it.

Continuous in both its first and second derivatives.

Then I thought: that is exactly what Simplex noise does. It produces gradients that are continuous in the first and second derivatives.

Also, this system only solves for the position and velocity of the avatar; from those you need to derive the transform from the position and the direction of travel.

What it doesn’t give you is the angular velocity, which is required if you need to work out how much of a turn-left or turn-right animation to blend in.

So I thought an alternative would be work in polar coordinates for velocity so you have an angular position and angular velocity and either a constant or varying forward position.
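A minimal sketch of that alternative, assuming a cheap cosine-interpolated value noise as a stand-in for 1D simplex noise (all the names here are mine, not from the tute): the heading comes straight from 1D noise of time, and the angular velocity you need for blending turn animations falls out as a finite difference.

```python
import math
import random

def smooth_noise_1d(t, seed=0):
    """Cosine-interpolated value noise in [-1, 1]: a cheap stand-in for
    1D simplex noise, continuous and smoothly varying over t."""
    def lattice(i):
        # Deterministic random value per integer lattice point.
        return random.Random((i, seed)).uniform(-1.0, 1.0)
    i = math.floor(t)
    f = t - i
    w = (1.0 - math.cos(math.pi * f)) / 2.0  # smooth blend weight in [0, 1]
    return lattice(i) * (1.0 - w) + lattice(i + 1) * w

def wander_step(t, dt, max_turn=math.radians(90)):
    """Drive the heading directly from 1D noise of time; the angular
    velocity comes out as a finite difference, for free."""
    heading = max_turn * smooth_noise_1d(t)
    ang_vel = (max_turn * smooth_noise_1d(t + dt) - heading) / dt
    return heading, ang_vel

# Integrate a short wandering path at constant forward speed.
x = y = 0.0
t, dt, speed = 0.0, 0.1, 1.0
for _ in range(100):
    heading, ang_vel = wander_step(t, dt)
    x += speed * math.cos(heading) * dt
    y += speed * math.sin(heading) * dt
    t += dt
```

Scaling the noise input (sampling t faster or slower) tunes how twitchy the wander is, and `max_turn` bounds how far the heading can swing.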


Anyway, now that I think about it, it is a bad idea, but it was fun while it lasted.

February 23, 2014

Inverse decay of light and an alternative to traditional image based lighting and a move to incident light fields.

Filed under: Uncategorized — Tags: — admin @ 4:06 pm

Let me take you back, way back, to year 9 of high school in 1985, where I was introduced to light and photography and the inverse-square law of light decay.


In black and white photography, you expose a piece of photosensitive paper to light for a period of time using an enlarger, then develop that exposure with a chemical reaction in a tray, and the darkness develops in front of your eyes.

The more light the paper gets, the darker the colour will be. That is why you use a negative in the enlarger, so you mask off the black areas.

Here comes the inverse-square law. If you want to make a print on an A4 piece of paper, you might have worked out that you need to expose for 10 seconds to get the image that you want.

But then you want something bigger. For an A3 print, you need to wind the enlarger back so that the image is projected onto a greater area. The lamp still has the same brightness; the negative still masks off the same amount of light.

The paper still has the same chemical response. So you expose the A3 sheet for the same 10 seconds, and the image comes out very pale.


This is the inverse-square law of decay. Because the light is further away from the photosensitive paper, not as much light reaches it per unit area, so to get the same chemical reaction, for the same density of blacks, you need to expose for longer.

The rule is as follows: if you double the distance between you and a light source, you end up with one quarter the amount of light per unit area.

So it follows that to get the same exposure on the A3 sheet, which has twice the area of the A4 sheet, you need to expose for roughly twice as long, more like 20 seconds; if you doubled the enlarger distance outright (four times the area), you would need 40 seconds.
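That scaling is really a one-line formula. A quick sketch, where the distances are hypothetical enlarger heights rather than anything measured in 1985:

```python
def exposure_time(base_time_s, base_distance, new_distance):
    """Inverse-square law: light per unit area falls with the square of
    the distance, so the exposure time must grow by the same factor."""
    return base_time_s * (new_distance / base_distance) ** 2

# 10 s at the original enlarger height; doubling the height quarters the
# light per unit area, so the exposure quadruples:
print(exposure_time(10.0, 1.0, 2.0))       # -> 40.0
# Raising it by sqrt(2), i.e. twice the area (A4 -> A3), doubles it:
print(exposure_time(10.0, 1.0, 2 ** 0.5))  # -> ~20.0
```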

Just to prove I am not making this up, look at this Wikipedia page

So the inverse-square law is real; I have seen it in action when developing prints in 1985.

The reality is that luminance is measured in candela per square metre; see

So on a film set you can get an approximation of this by shooting a high dynamic range image, assembled from a number of low dynamic range images taken at a number of different shutter speeds.

But two things are usually forgotten with this process:

  1. Calibration
  2. The effect of distance and the inverse-square law

The calibration could be easily overcome by using a light emitter of known energy in an off state and an on state.

So to do this, you have an LED of a known size at a known distance from the camera.

You measure the luminance at 1 cm from the light source with a radiometer and get the light source's luminance in candela per square metre.

Then you create an HDR of this same light source at a fixed distance, say 1 m.

If this is a standard LED then you don't need the radiometer every time.

If you had access to the radiometer, you could just measure the energy of your light sources in candela per square metre on set.

From this you can then derive what a pixel value on the film back of the camera taking the HDRI is equivalent to in candela per square metre.


So we have an HDRI, and we know the energy levels of the light arriving at the film back of the camera taking the HDRI.

Now to go further.

If you want to know the energy levels of the light at the surfaces it is being emitted from, you need to reverse the inverse-square decay.

So if you have two light sources in your HDRI with equivalent pixel values, then the luminance of those two light sources is equivalent at the film back of the camera.

But what if those light sources were 1 m away from the camera and 2 m away from the camera, both occupying the same pixel area in a cubic-cross HDRI?

It follows that the one 2 m away would be four times the intensity of the light that is 1 m away.
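Following that reasoning, undoing the falloff is a single multiply. A sketch, where the calibration factor (cd/m² per pixel unit, from the LED step above) and all the numbers are made up for illustration:

```python
def source_luminance(pixel_value, distance_m, cal_factor):
    """Estimate luminance at the emitting surface: convert the HDRI pixel
    to cd/m^2 at the film back via the (hypothetical) calibration factor,
    then undo the inverse-square falloff over the known distance."""
    at_camera = pixel_value * cal_factor  # cd/m^2 measured at the film back
    return at_camera * distance_m ** 2    # scaled back up to the source

# Two sources with equal pixel values, 1 m and 2 m from the camera:
near = source_luminance(0.5, 1.0, 100.0)  # 50.0
far = source_luminance(0.5, 2.0, 100.0)   # 200.0, four times the near one
```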

Someone else has covered projecting the HDRI spherical projection onto geometry here:

This is valid for using the HDRI as a measure of the light at the surface of the object, as an albedo.

But if you want to use this as a light source for illuminating a synthetic object with correct attenuation, you need to take into account the inverse-square falloff of the light from its surface to the film back where it is measured, and the luminance at the light source.

Furthermore, you can put some importance sampling into your light sources.

Here is a cool paper

Anyway, this page explains the concept a whole lot better than I can.

But this came to me at 11 pm on a Saturday night, when I was trying to go to sleep, so I thought I would scribble it down on a piece of paper so the insomnia didn't get the better of me.

Now it is 3:32 on a Sunday afternoon, the lawn has been mowed and my blog entry is complete for now.

May 25, 2013

Thread about tiled normal and colour maps in the Maya viewport

Filed under: Uncategorized — Tags: — admin @ 12:19 pm

Thread on the Area Forums

The script that saved my bacon:

  • Multi Channel Setup MEL
  • Mel script itself

    Which used an Add/Multiply node instead of a multiLayer texture.

Works with Mental Ray and the inbuilt renderer, but with 3delight and normal maps, not so much.

    Now our playblasts can look sweet with tiled UVs and Viewport 2.0
