Samiam’s Scribble Pad

April 30, 2016

The refound lost artform of Slack

Filed under: Uncategorized — admin @ 8:14 am

I was just thinking back over my career as a digital artist in the multimedia and film visual effects industries.

Back in 1999 we were short on modelling resources on a project, so we hired a bloke who will go by the code name Chappy.

Chappy was a nice enough bloke, but he had basically been long term unemployed and was overjoyed at the possibility of full-time employment.

He really overdid it, coming to work on his first day in an evening suit, doused in cologne. This was a bit of a warning sign that he didn't really understand the norms of the digital workplace; for those not in the know, a t-shirt, hoodie, jeans and sneakers are pretty much the uniform.

Anyway, we overlooked his attire and put him to work with the tools he claimed he had the skills to operate.

Things were amiss: what he claimed to be able to do and what he was delivering were a little out of whack. Still, we thought that with training Chappy could attain the skills to get the job done, so we gave him the benefit of the doubt.

Then another thing came up that rang some alarm bells. Remember, this is 1999, so there is no streaming YouTube on your desktop. Chappy had formed a TV soap addiction while long term unemployed. Something I am happy to admit to: taping episodes of The Bold and the Beautiful on VHS so we could keep up with the happenings of the fashion industry in daytime soap land. But that option didn't occur to Chappy. He thought he could watch it while he was at work, so he brought in his mini TV, set it up on his desk, and watched it on the job. Not during an allocated TV watching slot made available as company policy; he just helped himself to some company time to feed his media habit.

So the alarm bells were ringing: Chappy was not good at his job, and he was using company resources, the time we were paying him to do modelling, for his own benefit, feeding his media habit. Not unlike a cashier taking some money out of the till. We decided this wasn't what the company needed, so we had to let Chappy go, back to the dole queue to feed his media habit at the taxpayer's expense rather than the company's.

So let's fast forward 17 years. Now it is 2016, and I am still pretty much doing the same thing: making media for companies to sell to consumers. Rather than multimedia for the fitness industry, it is visual effects for the Hollywood film industry.

Now we have lots of people with media habits. The media isn't a daytime soap opera; it is thousands of things: social media, podcasts, TV episodes, digital newspapers, opinionated bloggers, etc. So while we are at work there are little blocks of time, fragmented so small that you can take a micro break and take in one of these many forms of media. When I was working in molecular biology, if you were incubating something for 20 minutes and your lab book was up to date, why not take in the view and have a cigarette? So I have nothing against taking little breaks to punctuate the day.

But things have got so out of hand.

People will spend up to six hours of the day consuming media content while "working". I have listened to music while working; if it is an album that I have known for many years, it is just comforting noise and doesn't occupy my thoughts. But when your eyes are watching footage and your ears are taking in dialogue, I cannot fathom how this couldn't disrupt your concentration. That said, people have listened to talkback radio in the workplace for decades, so maybe I am barking up the wrong tree.

Anyway, nowadays we are going to a workplace with an Internet blackout for security reasons, and people see it as some sort of injustice that they cannot take their little microbreaks or extended binges. Yet back in 1999 we had to let someone go for pretty much the same reason.

Why have the standards changed so much?

How much do our media habits cost the workplace?

Sam

December 3, 2015

MtoA OpenEXR Meta Data via Python Scripting Arnold and Maya EXR

Filed under: Uncategorized — admin @ 1:04 pm


 

Basically this post is a call for help, and once I have it solved it will become a reference for anybody else trying to do the same thing in the future.

In the Maya Arnold Render Globals there are some fields to Add MetaData with a “Name”, “Type” and “Value” field available.

Which is cool if you are clicking, but if you want to push these buttons with code, what are the attributes called, and what are the methods to set them?

So basically, the answer is usually documented in the source:

 

# Excerpt from MtoA's AE template source (mtoa/ui/ae/customShapeAttributes.py).
# It relies on maya.cmds as cmds, pymel.core as pm, mtoa.ui.ae.templates as
# templates, and a module-level templatesNames list, all defined in that file.
class EXRDriverTranslatorUI(templates.AttributeTemplate):
    def changeAttrName(self, nodeName, attrNameText, index):
        # Get the attribute name, type and value
        attrName = nodeName+'['+str(index)+']'
        metadata = cmds.getAttr(attrName)
        result = metadata.split(' ', 2 )
        result += [""] * (3-len(result))
 
        # Get the new name
        name = cmds.textField(attrNameText, query=True, text=True)
 
        # Update the name in all the templates
        templatesNames[:] = [tup for tup in templatesNames if cmds.columnLayout(tup, exists=True)]
        for templateName in templatesNames:
            cmds.textField(templateName+"|mtoa_exrMetadataRow_"+str(index)+"|MtoA_exrMAttributeName", edit=True, text=name.replace(" ", ""))
 
        # Update the metadata value
        metadata = result[0]+" "+name.replace(" ", "")+" "+result[2]
        cmds.setAttr(attrName, metadata, type="string")
 
    def changeAttrType(self, nodeName, menu, index):
        # Get the attribute name, type and value
        attrName = nodeName+'['+str(index)+']'
        metadata = cmds.getAttr(attrName)
        result = metadata.split(' ', 2 )
        result += [""] * (3-len(result))
 
        # Get the new type
        typeNumber = cmds.optionMenu(menu, query=True, select=True)
        type = cmds.optionMenu(menu, query=True, value=True)
 
        # Update the type in all the templates
        templatesNames[:] = [tup for tup in templatesNames if cmds.columnLayout(tup, exists=True)]
        for templateName in templatesNames:
            cmds.optionMenu(templateName+"|mtoa_exrMetadataRow_"+str(index)+"|MtoA_exrMAttributeType", edit=True, select=typeNumber)
 
        # Update the metadata value
        metadata = type+" "+result[1]+" "+result[2]
        cmds.setAttr(attrName, metadata, type="string")
 
    def changeAttrValue(self, nodeName, attrValueText, index):
        # Get the attribute name, type and value
        attrName = nodeName+'['+str(index)+']'
        metadata = cmds.getAttr(attrName)
        result = metadata.split(' ', 2 )
        result += [""] * (3-len(result))
 
        # Get the new value
        value = cmds.textField(attrValueText, query=True, text=True)
 
        # Update the value in all the templates
        templatesNames[:] = [tup for tup in templatesNames if cmds.columnLayout(tup, exists=True)]
        for templateName in templatesNames:
            cmds.textField(templateName+"|mtoa_exrMetadataRow_"+str(index)+"|MtoA_exrMAttributeValue", edit=True, text=value)
 
        # Update the metadata value
        metadata = result[0]+" "+result[1]+" "+value
        cmds.setAttr(attrName, metadata, type="string")
 
    def removeAttribute(self, nodeName, index):
        cmds.removeMultiInstance(nodeName+'['+str(index)+']')
        self.updatedMetadata(nodeName)
 
    def addAttribute(self, nodeName):
        next = 0
        if cmds.getAttr(nodeName, multiIndices=True):
            next = cmds.getAttr(nodeName, multiIndices=True)[-1] + 1
        cmds.setAttr(nodeName+'['+str(next)+']', "INT", type="string")
        self.updatedMetadata(nodeName)
 
    def updateLine(self, nodeName, metadata, index):
        # Attribute controls will be created with the current metadata content
        result = metadata.split(' ', 2 )
        result += [""] * (3-len(result))
 
        # Attribute Name
        attrNameText = cmds.textField("MtoA_exrMAttributeName", text=result[1])
        cmds.textField(attrNameText, edit=True, changeCommand=pm.Callback(self.changeAttrName, nodeName, attrNameText, index))
 
        # Attribute Type
        menu = cmds.optionMenu("MtoA_exrMAttributeType")
        cmds.menuItem( label='INT', data=0)
        cmds.menuItem( label='FLOAT', data=1)
        cmds.menuItem( label='POINT2', data=2)
        cmds.menuItem( label='MATRIX', data=3)
        cmds.menuItem( label='STRING', data=4)
        if result[0] == 'INT':
            cmds.optionMenu(menu, edit=True, select=1)
        elif result[0] == 'FLOAT':
            cmds.optionMenu(menu, edit=True, select=2)
        elif result[0] == 'POINT2':
            cmds.optionMenu(menu, edit=True, select=3)
        elif result[0] == 'MATRIX':
            cmds.optionMenu(menu, edit=True, select=4)
        elif result[0] == 'STRING':
            cmds.optionMenu(menu, edit=True, select=5)
        cmds.optionMenu(menu, edit=True, changeCommand=pm.Callback(self.changeAttrType, nodeName, menu, index))
 
        # Attribute Value
        attrValueText = cmds.textField("MtoA_exrMAttributeValue", text=result[2])
        cmds.textField(attrValueText, edit=True, changeCommand=pm.Callback(self.changeAttrValue, nodeName, attrValueText, index))
 
        # Remove button
        cmds.symbolButton(image="SP_TrashIcon.png", command=pm.Callback(self.removeAttribute, nodeName, index))
 
    def updatedMetadata(self, nodeName):
        templatesNames[:] = [tup for tup in templatesNames if cmds.columnLayout(tup, exists=True)]
        for templateName in templatesNames:
            cmds.setParent(templateName)
            #Remove all attributes controls and rebuild them again with the metadata updated content
            for child in cmds.columnLayout(templateName, query=True, childArray=True) or []:
                cmds.deleteUI(child)
            for index in cmds.getAttr(nodeName, multiIndices=True) or []:
                attrName = nodeName+'['+str(index)+']'
                metadata = cmds.getAttr(attrName)
                if metadata:
                    cmds.rowLayout('mtoa_exrMetadataRow_'+str(index),nc=4, cw4=(120,80,120,20), cl4=('center', 'center', 'center', 'right'))
                    self.updateLine(nodeName, metadata, index)
                    cmds.setParent('..')
 
    def metadataNew(self, nodeName):
        cmds.rowLayout(nc=2, cw2=(200,140), cl2=('center', 'center'))
        cmds.button( label='Add New Attribute', command=pm.Callback(self.addAttribute, 'defaultArnoldDriver.custom_attributes'))
        cmds.setParent( '..' )
        layout = cmds.columnLayout(rowSpacing=5, columnWidth=340)
        # This template could be created more than once in different panels
        templatesNames.append(layout)
        self.updatedMetadata('defaultArnoldDriver.custom_attributes')
        cmds.setParent( '..' )
 
    def metadataReplace(self, nodeName):
        pass
 
    def setup(self):
        self.addControl('exrCompression', label='Compression')
        self.addControl('halfPrecision', label='Half Precision')
        self.addControl('preserveLayerName', label='Preserve Layer Name')
        self.addControl('tiled', label='Tiled')
        self.addControl('autocrop', label='Autocrop')
        self.addControl('append', label='Append')
        self.beginLayout("Metadata (name, type, value)", collapse=True)
        self.addCustom('custom_attributes', self.metadataNew, self.metadataReplace)
        self.endLayout()

So after that it isn't rocket surgery.

There is a compound attribute with strings in it called "custom attributes", as seen in the screenshot below.

[Screenshot: snap-clockworkorange-20151203-120441]
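Each entry in that compound attribute is just a space-separated string of the form "TYPE name value" (as the split(' ', 2) calls in the template source suggest). A minimal pure-Python sketch for packing and unpacking them; the helper names are my own:

```python
def pack_metadata(attr_type, name, value):
    # Build the string stored per entry, e.g. "FLOAT exposure 2.0".
    # Spaces are stripped from the name, as the AE template does.
    return "%s %s %s" % (attr_type, str(name).replace(" ", ""), value)

def unpack_metadata(metadata):
    # Split back into (type, name, value); the value may itself contain
    # spaces, and missing fields are padded with empty strings.
    parts = metadata.split(" ", 2)
    return tuple(parts + [""] * (3 - len(parts)))
```

For example, pack_metadata("FLOAT", "example", 2.0) gives "FLOAT example 2.0", which is exactly the shape of string that gets passed to setAttr.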

So if you want to bang another value into that custom attribute, you can:

import maya.cmds as cmds

# Print the existing attributes (guard against None when the multi is empty)
for i in cmds.getAttr("defaultArnoldDriver.customAttributes", multiIndices=True) or []:
    print cmds.getAttr("defaultArnoldDriver.customAttributes[%d]" % i)


def addAttribute(nodeName, value):
    """Append a 'TYPE name value' entry to the multi attribute."""
    next = 0
    if cmds.getAttr(nodeName, multiIndices=True):
        next = cmds.getAttr(nodeName, multiIndices=True)[-1] + 1
    cmds.setAttr(nodeName + '[' + str(next) + ']', "%s %s %s" % (value["type"], value["name"], value["value"]), type="string")

# Call the function above
example = {"type": "FLOAT", "name": "example", "value": 2.0}
nodeName = "defaultArnoldDriver.customAttributes"
addAttribute(nodeName, example)

# Print the attributes again after adding
for i in cmds.getAttr(nodeName, multiIndices=True) or []:
    print cmds.getAttr("%s[%d]" % (nodeName, i))

# This is supposed to update the GUI
from mtoa.ui.ae import customShapeAttributes

c = customShapeAttributes.EXRDriverTranslatorUI('aiAOVDriver')
c.updatedMetadata(nodeName)

But that doesn't update the GUI; it may still make an EXR with metadata, though.

So I think I need to do it with a little more care.

But in reality the value is there in the data structure, and the GUI will be updated if you just add and then delete an element, because that calls the updatedMetadata method in the class above.

So I guess I could just call that myself, if I knew how to get to the method of a callback on an AE template installed by a Python class.

Basically that is the start of the reverse engineering.

Will update once it is properly solved.

Also, this doesn't let me set these values with an expression per frame, but that shouldn't be too tricky to do.

September 12, 2015

Hypocrisy and the Fully Connected Graph of the Internet and the First World Problem

Filed under: Uncategorized — admin @ 11:30 am

While discussing this article

http://www.lindsredding.com/2012/03/11/a-overdue-lesson-in-perspective/

 

Yesterday I proposed that there is a symptom of overworking that causes the pace of life to be too fast, which can make you ill. While pulling weeds by hand and watering the garden at the community plot, I found the panacea for this: quite simply, go slowly while you are not working and allow your thoughts to decompress. Albeit a first world problem.

 

But what annoyed me about the discussion of this on social media was that it was immediately dismissed as a first world problem. Which leads back to the tokenism of growing a handful of vegetables in an organic plot, where the saving grace of pulling weeds out by hand is the solution to the problem of working too hard. Obviously the success of technology means that food production is automated, which should give us more leisure time to pursue other things of interest, rather than subsisting on a third world hand-to-mouth existence of food production and consumption by the simplest means.

 

But how ironic is it to call out a problem on the Internet as a “first world problem”. In the third world is there a fair trade semiconductor network for your self-righteous thoughts to travel on the internet?

 

Then I thought more about this issue: without being a hypocrite, how can you use the term "first world problem" on the Internet?

 

Basically, you would need to make sure that all of your data packets were travelling on computers/routers/cables/satellites that were made in factories with fair trade agreements, installed by workers who were union members. The materials would need to be mined from lands where respects had been paid to the traditional owners. You would need to make sure that there was no history of tyrants governing the people who mined, refined and manufactured the components of the Internet. You had better hope the IT technician who solved the routing issue wasn't getting paid any more than the person working in the mine that produced the silica for the router's CPU. Otherwise it isn't really fair trade.

 

Oh that is right, the Internet is a product of the first world, so the Internet itself IS a first world problem.

 

So until we get an Internet of trained axolotls that carry messages on their ankles of their own free will, please don't use "first world problem" as a put-down without considering that you are being massively ironic.

August 23, 2015

Replacing Capacitors in a Cisco 877

Filed under: Uncategorized — Tags: — admin @ 12:06 pm

Just when you think you have reached the level of maximum geekiness, you step it up a notch.

I work making visual effects for Hollywood movies, that is pretty geeky.

I am a mature aged student studying Computer Science, reading pure math on a Sunday, that is pretty geeky.

I have participated in a Community Wireless Networking group Air-Stream in fact I designed their logo, that is pretty geeky.

As a result I happened upon a DSL router, made by Cisco, that brings the internet into my home.

I bought it second hand off ebay in November 2014.

Installing that and being able to navigate Cisco’s operating system IOS is a pretty high level of geeky.

http://forums.whirlpool.net.au/archive/2333379

But when said modem starts playing up, and switching it off and on again gets too annoying…

http://forums.whirlpool.net.au/archive/2425478

You COULD throw it away and get a new one.

OR you could google the hell out of the issue and find out which capacitors are worn out.

see this thread here:

http://forums.whirlpool.net.au/archive/1737576

 


 

So this afternoon I will head to the local Jaycar and see if I can get my hands on some 6800 µF 105 °C 6.3 V 15 mm aluminium electrolytic capacitors for under $5 each.

Then on Wednesday night, I am going to turn up to the local hackspace after work and see if I can have a crack at soldering in the new parts.

Then I will be back to the previous glory of this

[Graph: ADSL2 SNR after a week, November 2014]

 

[Graph: ADSL2, November 2014]

 

Why just have the internet when you can solder your own internet together?

Today’s life is all too prefabricated.

I got a recipe for coconut and pumpkin soup.

The recipe asked for a can of pumpkin.

I only had a pumpkin grown in the neighbour’s back yard given to us as a gift.

Needless to say I was still able to make the recipe without getting the pumpkin put into a metal can and shipped half way around the world.

Make your own pumpkin soup, make your own internet better.

May 23, 2015

Steering behaviour for wander as 1d noise of angular value.

Filed under: Uncategorized — admin @ 12:29 pm

Found a cool tutorial on a wander behaviour for my cat simulator:

 

http://gamedevelopment.tutsplus.com/tutorials/understanding-steering-behaviors-wander–gamedev-1624

This is a classic Ordinary Differential Equation for solving 2d position and velocity.

I was thinking about this.

And it looks like a lot of steps to produce a fairly continuous angular rotation with a random nature to it: continuous in both its first and second derivative.

Then I thought: that is exactly what simplex noise does; it produces gradients that are continuous in the first and second derivative.

Also, this system only solves the position and velocity of the avatar; from these you need to derive the transform from the position and the direction of travel.

What it doesn't give you is the angular velocity, which is required if you need to work out how much of a turn-left or turn-right animation to blend in.

So I thought an alternative would be to work in polar coordinates for velocity, so you have an angular position and angular velocity, and either a constant or varying forward speed.
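As a sketch of that idea (pure Python, no game engine; I am using 1D value noise with Perlin's quintic fade as a stand-in for simplex noise, and every name here is my own): drive the heading angle directly from the noise, and integrate the position from it.

```python
import math
import random

def _fade(t):
    # Perlin's quintic fade: first and second derivatives are zero at the
    # lattice points, so the interpolated curve is C2-continuous.
    return t * t * t * (t * (t * 6 - 15) + 10)

def make_noise1d(seed=0):
    # 1D value noise over a repeating table of random values in [-1, 1].
    rng = random.Random(seed)
    table = [rng.uniform(-1.0, 1.0) for _ in range(256)]
    def noise(x):
        i = int(math.floor(x))
        t = x - i
        a, b = table[i % 256], table[(i + 1) % 256]
        return a + _fade(t) * (b - a)
    return noise

def wander_path(steps, dt=0.1, speed=1.0, turn_scale=2.0, seed=1):
    # Heading is a smooth random angle driven by the noise; position is
    # integrated from it, so angular velocity is directly available too.
    noise = make_noise1d(seed)
    x = y = 0.0
    path = []
    for k in range(steps):
        heading = turn_scale * noise(k * dt)  # radians
        x += speed * dt * math.cos(heading)
        y += speed * dt * math.sin(heading)
        path.append((x, y))
    return path
```

Each step moves exactly speed*dt, and the heading varies smoothly, which is the property you need when blending turn-left/turn-right animations.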

 

Anyway, now that I think about it, it is a bad idea, but it was fun while it lasted.

February 23, 2014

Inverse decay of light and an alternative to traditional image based lighting and a move to incident light fields.

Filed under: Uncategorized — Tags: — admin @ 4:06 pm

Let me take you back, way back, to year 9 of high school in 1985, where I was introduced to light, photography, and the inverse square rule of light decay.


In black and white photography, you expose a piece of photosensitive paper to light for a period of time using an enlarger, then develop that exposure with a chemical reaction in a tray, and the darkness develops in front of your eyes.

The more light the paper gets, the darker the colour will be. That is why you use a negative in the enlarger: to mask off the black areas.

Here comes the inverse square law. If you want to make an A4 print, you might have worked out that you need to expose for 10 seconds to get the image that you want.

But then you want something bigger: an A3 print. You need to wind back the enlarger so that the image is projected onto a greater area. The lamp still has the same brightness, and the negative still masks off the same amount of light.

The paper still has the same chemical response, so you expose the A3 sheet for the same 10 seconds. The image comes out very pale.

Why?

The inverse square rule of decay. Because the light is further away from the photosensitive paper, not as much light reaches it per unit area, so to get the same chemical reaction, and the same density of blacks, you need to expose for longer.

The rule is as follows: if you double the distance between you and a light source, you end up with one quarter the amount of light per unit area.

So it follows that to get the same exposure you might need to expose the A3 sheet for roughly twice as long, about 20 seconds (A3 doubles the area, so the projection distance only grows by a factor of √2); a print at double the linear dimensions would need more like 40 seconds compared to the A4 sheet's 10.
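That exposure arithmetic is a one-liner to check (the function name is my own):

```python
def exposure_for_distance(base_exposure, base_distance, new_distance):
    # Illuminance falls off as 1/d^2, so delivering the same light per unit
    # area of paper requires the exposure time to grow as d^2.
    return base_exposure * (new_distance / base_distance) ** 2
```

Doubling the enlarger distance turns a 10 second exposure into 40 seconds.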

Just to prove I am not making this up, look at this Wikipedia page: http://en.wikipedia.org/wiki/Inverse-square_law

So the inverse square law is real; I have seen it in action when developing prints in 1985.

The reality is that luminance is measured in candela per square metre; see http://en.wikipedia.org/wiki/Candela_per_square_metre

So on a film set you can get an approximation of this by shooting a high dynamic range image, assembled from a number of low dynamic range images taken at different shutter speeds.

see:

http://www.researchgate.net/publication/220506295_High_Dynamic_Range_Imaging_and_Low_Dynamic_Range_Expansion_for_Generating_HDR_Content/file/d912f508ae7f8b733c.pdf

 

But two things are usually forgotten in this process:

 

  1. Calibration
  2. The effect of distance and the inverse square law

The calibration could be easily overcome by using a light emitter of known energy in an off state and an on state.

So to do this you have an LED of a known size at a known distance from the camera.

You measure the luminance at 1 cm from the light source with a radiometer (http://en.wikipedia.org/wiki/Radiometer) and get the light source's luminance in candela per square metre.

Then you create a HDR of this same light source at a fixed distance from the light source, say 1m.

If this is a standard LED then you don't need the radiometer every time.

If you had access to the radiometer you could just measure the energy of your light sources in candela per square metre on set.

From this you can then derive what a pixel value on the film back of the camera taking the HDRI is equivalent to in candela per square metre.

Great!

So we have an HDRI and we know the energy levels of the light arriving at the film back of the camera taking the HDRI.

Now to go further.

If you want to know the energy levels of the light at the surface they are being emitted from, you need to reverse the inverse square decay algorithm.

So if you have two light sources in your HDRI with equivalent pixel values, then the luminance of those two light sources is equivalent at the film back of the camera.

But what if those light sources were 1 m and 2 m away from the camera, both occupying the same pixel area in a cubic cross HDRI?

It follows that the one 2 m away would be four times the intensity of the light that is 1 m away.
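Undoing the falloff is the same relation run in reverse (a sketch; the function name is my own):

```python
def luminance_at_source(pixel_luminance, distance, reference_distance=1.0):
    # Equal pixel values measured at different distances imply the farther
    # source is (d / d_ref)^2 times as intense at its own surface.
    return pixel_luminance * (distance / reference_distance) ** 2
```

So for two sources with equal pixel values at 1 m and 2 m, the second comes out four times as intense.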

Someone else has covered projecting the HDRI spherical projection onto geometry here:

https://www.fxguide.com/fxguidetv/fxguidetv-165-scott-metzger-on-mari-and-hdr/

This is valid for using the projection as a measure of the light at the surface of the object, as an albedo.

http://en.wikipedia.org/wiki/Albedo

But if you want to use this as a light source for illuminating synthetic objects, with correct attenuation, you need to take into account the inverse square falloff of the light between its surface and the film back where it is measured, to recover the luminance at the light source.

Furthermore, you can put some importance sampling into your light sources.

Here is a cool paper

http://renderwonk.com/publications/s2010-shading-course/snow/sigg2010_physhadcourse_ILM.pdf

Anyway, this page http://webstaff.itn.liu.se/~jonun/web/IBL.php explains the concept a whole lot better than I can.

But this came to me at 11pm on a Saturday night when I was trying to go to sleep, so I thought I would scribble it down on a piece of paper so the insomnia didn't get the better of me.

Now it is 3:32 on a Sunday afternoon, the lawn has been mowed and my blog entry is complete for now.

May 25, 2013

Thread about tiled normal and colour maps in the Maya viewport

Filed under: Uncategorized — Tags: — admin @ 12:19 pm

Thread on the Area Forums

The script that saved my bacon:

  • Multi Channel Setup MEL
  • Mel script itself

    Which used an Add/Multiply node instead of a multiLayer texture.

Works with Mental Ray and the inbuilt renderer, but with 3delight and normal maps, not so much.

    Now our playblasts can look sweet with tiled UVs and Viewport 2.0

December 22, 2012

Sketchy model for printing

Filed under: Uncategorized — admin @ 3:46 pm

simple model



Here is the STL

An STL for printing

Based on a few conversations online, I have determined there is a 3D printer at the Grote St library at the Adelaide City Council for free use.

So I knocked up a quick model in Wings3d.

Maia made one too, which she said is a jewellery holder:

Maia's Computer model of holding jewellery

I will upload her OBJ too, and a screenshot of her model, but it is too large for upload without compression.

Sam

November 12, 2011

trying to calculate localVisibility with Spherical Harmonics

Filed under: Uncategorized — Tags: — admin @ 5:59 pm
Spherical Harmonics Coefficients of Local Visibility


In hindsight, baking a lookup table of vectors and their spherical harmonics might lead to artifacts, but with 1024 samples it doesn't look too bad.

Feel free to download the source to sing along; it's all based on the Sony paper from 2003. See the PDF from SCEA.

Python executable that writes a Spherical Harmonics RenderMan header: sh1.py

Please find the resulting header file attached: SH.h

And a simple shader to calculate transmission based on the table of stratified samples above: localVisibility.sl

It's a bit clunky because I couldn't work out how to do two-dimensional arrays in RSL; I'm glad to fix it up if that is possible.

The preprocessor thing isn't that sweet either :/

As below:

#include "SH.h"
 
surface localVisibility(
        uniform float maxDistance = 10; 
        uniform string outputFolder = "";
)
{
        SHVECTOR
        SHSPH0
        SHSPH1
        SHCOEFF0
        SHCOEFF1
        SHCOEFF2
        SHCOEFF3
        SHCOEFF4
        SHCOEFF5
        SHCOEFF6
        SHCOEFF7
        SHCOEFF8
        vector Nworld = vector(transform("world",N));
        point Pworld = transform("world",P);
        uniform float numSamples = 1024;
        uniform float numCoeffs = 9;
        varying float results[9]={0,0,0,0,0,0,0,0,0};
        uniform float i,j;
        varying float faceforward = 0;
        varying float occl = 0;
        for(i=0;i<numSamples;i=i+1){
                float Hs = samplesVector[i].Nworld;
                point destinationWorld = Pworld + samplesVector[i]*maxDistance;
                point destinationCurrent = transform("world","current",destinationWorld);
                if (Hs > 0){
                        faceforward += 1;
                        float isHit = comp(transmission(P,destinationCurrent),0);
                        if (isHit > 0)
                        {
                                occl += 1;
                                for(j=0;j<numCoeffs;j=j+1){
                                        varying float value = isHit;
                                        if (j == 0)
                                        {
                                                value *= samplesCoeffs0[i];
                                        }
                                        if (j == 1)
                                        {
                                                value *= samplesCoeffs1[i];
                                        }
                                        if (j == 2)
                                        {
                                                value  *= samplesCoeffs2[i];
                                        }
                                        if (j == 3)
                                        {
                                                value *= samplesCoeffs3[i];
                                        }
                                        if (j == 4)
                                        {
                                                value *= samplesCoeffs4[i];
                                        }
                                        if (j == 5)
                                        {
                                                value *= samplesCoeffs5[i];
                                        }
                                        if (j == 6)
                                        {
                                                value *= samplesCoeffs6[i];
                                        }
                                        if (j == 7)
                                        {
                                                value *= samplesCoeffs7[i];
                                        }
                                        if (j == 8)
                                        {
                                                value *= samplesCoeffs8[i];
                                        }
                                        results[j] += value;
 
                                        }       
                                }
                        } 
                }
        for (j=0;j<numCoeffs;j=j+1){    
                results[j] /= faceforward;
                }
        occl /= faceforward;
        faceforward /= numSamples;
        Ci = color(results[0],results[1],results[2]);
        Oi = 1;
        Ci *= Oi;
 
}

I don't think it is working yet, but it compiles and renders.
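For reference, the nine band 0-2 basis functions that the baked table samples can be evaluated directly. This is a plain-Python sketch using the common real-SH constants (as in the 2003 SCEA notes; sign conventions vary between sources, so treat the signs as an assumption):

```python
def sh_basis9(x, y, z):
    # First nine real spherical harmonic basis functions (bands 0, 1, 2)
    # evaluated for a unit direction vector (x, y, z).
    return [
        0.282095,                          # Y(0,  0)
        0.488603 * y,                      # Y(1, -1)
        0.488603 * z,                      # Y(1,  0)
        0.488603 * x,                      # Y(1,  1)
        1.092548 * x * y,                  # Y(2, -2)
        1.092548 * y * z,                  # Y(2, -1)
        0.315392 * (3.0 * z * z - 1.0),    # Y(2,  0)
        1.092548 * x * z,                  # Y(2,  1)
        0.546274 * (x * x - y * y),        # Y(2,  2)
    ]
```

Presumably the samplesCoeffsN arrays in SH.h are these values precomputed for each sample vector.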

Sam

November 6, 2010

Open Computer Graphics Storage Formats

Filed under: Uncategorized — admin @ 12:42 pm

We all know that OpenEXR has pretty much standardised the storage format for image planes.

But now there are a few new contenders.

With the future of the industry being more widely outsourced, this standardisation between packages seems pretty important.

Some things happened a long time ago:

  • Documents: PDF
  • Edits: EDL
  • Half-baked scene descriptions: FBX
  • Render Intermediates: RIB

But I really hope that people can start to be a little more uniform with the formats they are using.
