Not very useful for most, but an interesting optical effect for the rest;
visual white noise can be used to study the nature of optical sensors.
The analysis math is beyond me but I am assured that it can give you the field of view and
the 1st and 2nd order temporal response.
The generator, that is, the optical display, is easier to understand.
I have seen it done using expensive high speed LED displays and
so was interested to see what could be achieved with a common CRT using python and OpenGL.
White noise is merely a random sequence with no (or few) repeating patterns.
In audio or visual applications it provides a stimulus that covers a spread of
frequencies and amplitudes.
For the visual white noise generation we just need to display a sequence of frames at a set frequency.
Each frame will consist of a grid of virtual LEDs which are set to a random level or colour.
Since the screen is not curved (as is the case with the aforementioned LED displays) the outer grid
squares must be larger than the inner grid squares so that each covers the same angle when viewed
from a set distance. For 6-degree separations and a 90-degree field of view it will look something like this:
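The sizing rule above can be sketched in a few lines. This is not the code from white.py, just an illustration of the principle: on a flat screen viewed from distance d, a grid square spanning equal angular steps theta has its edges at x = d * tan(n * theta), so the outer squares come out wider than the inner ones. The function name is my own.

```python
import math

def equal_angle_edges(distance, step_deg, n_cells):
    """Edge positions (same units as `distance`) for `n_cells` grid cells
    on a flat screen, each subtending `step_deg` degrees when viewed from
    `distance` away, measured out from the screen centre."""
    half = n_cells / 2.0
    return [distance * math.tan(math.radians((i - half) * step_deg))
            for i in range(n_cells + 1)]

# A 90-degree field of view split into 6-degree cells (15 cells):
edges = equal_angle_edges(distance=1.0, step_deg=6.0, n_cells=15)
widths = [b - a for a, b in zip(edges, edges[1:])]
# the outermost cells are noticeably wider than the central ones
```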
I have programmed a small visual white noise generator in python script.
Python, using the pygame and PyOpenGL modules, provides an easy-to-use interface to the OpenGL
video display driver system available on most PCs and most platforms
(Linux and Windows XP have been tried; there is nothing to see on an old Windows 98SE machine, and you might need to use Windows emulation on a Mac).
Download white.zip and decompress it to set everything up quickly, with a test directory called 12x12.
Or just have a look in white.py and maybe default.cfg
Googling any of the above should find what you need.
Be careful you get modules that match the version and platform of the python you are using.
The psychopy (www.psychopy.org) people have a nice zip
package for Windows and their requirements are greater than mine.
For Linux I can recommend the Envy video driver hunter (for ATI and NVidia cards).
Some applications seem to crash openGL, so a certain degree of patience and rebooting may be
required.
To begin with just try a simple setup (one that requires no geometry to be set).
To see 4 frames of 256 x 256 random noise, just run python white.py
If you placed default.cfg in the same directory you will get the same result as shown
in the animated gif at the top.
If you don't give a -c config_filename after the command, the program tries to use
default.cfg. If it can't find this, it just uses default values.
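The lookup order just described can be sketched as below. This is an illustration, not the actual code from white.py; only the -c option and the default.cfg filename come from the text, and the helper name is my own.

```python
import os

def pick_config(argv):
    """Return the config file to use, following the order described above:
    an explicit -c argument first, then default.cfg if it exists in the
    current directory, otherwise None (fall back to built-in defaults)."""
    if "-c" in argv:
        return argv[argv.index("-c") + 1]
    if os.path.exists("default.cfg"):
        return "default.cfg"
    return None
```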
You can die from boredom or just type ESC to end the program.
You'll note after running the program that 9 new files have appeared in your directory.
One new file is an experiment log (called test.log) - a hangover from similar programs I have written - it keeps
track of what you last did. The name is set using the -l filename option, either as an argument after
the command or within the config (cfg) file.
There are now four data files (.bin) which contain the values used in the display - one for each frame.
You could have provided your own.
Each pixel uses one to three bytes.
The number of bytes depends on the way the colour is defined.
One byte sets a graduation between two colours (black and white by default).
Two bytes set the relative proportion of each of two colours.
Three bytes set the actual colour as an RGB (red, green, blue) value.
The number of bytes and the two colours can be set using the -u option.
You can replace these data files with your own, just use the same name and format.
Next time you run the program, it will be your data being used.
There are now also four image files (.tga).
These are exact images of each frame and can be used to dramatically reduce load times on later runs.
The -t option can be used to prevent loading and saving of files.
Test modes are:
There are a number of key controls which might be useful:
Note: If you zoom in a little with this 4 frame trial, you get to a point where the coincidences
in the repeating patterns start to match up and you will see interesting swirling patterns and
what almost seem like veins of flowing blood.
To use as angle corrected virtual pixels (as shown in the second image on this page)
you will need to set up a few more options,
preferably in a config file and preferably in a separate directory.
As an example of how to do this, create a directory called 12x12 and
in this directory create a config file called 12x12.cfg
(you will already have it if you used the white.zip source).
This would look like:

-r 640 480
-o 60 12 6 6 12 12
-e 128 512

All these configuration file options are the same as the command line options and are
listed (in brief) by running python white.py -h,
but a brief description here might also be helpful:
The -r option is just the window resolution.
In a real experiment it would be the same as the screen resolution.
The -o option sets the viewing conditions: the 60Hz frame rate
(you get this off your monitor), the monitor width (any standard unit),
the viewing distance, the degrees per virtual pixel that you require,
and the number of pixels wide and high.
The -e option sets the number of frames (i.e. textures) and the size of each texture.
OpenGL maps textures into the required space, so the texture size really sets the fineness
of the resulting pixel borders.
The size must be a power of 2 greater than 64 and usually no more than 2048.
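A quick way to check a candidate texture size against that constraint (a helper of my own, not part of white.py):

```python
def valid_texture_size(n, lo=64, hi=2048):
    """True if n is a power of two, greater than `lo`, and no more
    than `hi` - the range the text above suggests for textures."""
    return lo < n <= hi and (n & (n - 1)) == 0
```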
To now run this angle-corrected configuration, return to the parent directory and run
python white.py 12x12. The 12x12 argument is just a directory path (relative or absolute)
to the directory where you want everything to happen - all the log, data and image files will be created in that directory.
The code has a lot of junk in it that probably isn't even needed (logging, config files etc).
The bit where all the grunt happens is in load_textures() and draw().
load_textures() creates a series of pygame display surfaces (our frames) and loads them
into the video card's memory as textures.
Yes, the number of frames is limited by the video card memory
- but modern cards can have an astonishing amount of this.
draw() just draws a box on the screen and fills it with a different texture each time.
It also draws a little square in the bottom left corner that alternates from black to white with
each frame. When run with the monitor at 60Hz, the little box will appear grey if you look at it
directly, but viewed side-on it will flicker at 30Hz.
If your monitor is doing 75Hz or greater, I doubt you will see the flicker at all.
If you see any flicker even when looking at it directly, something is wrong.
Check your video card's display driver setup and check that OpenGL is syncing to vertical blank
on this monitor (for some reason that I don't understand, it matters in a two-monitor setup).
When you close your program, the terminal window (if you have one open) will give an estimate
of the frame rate.
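The actual method white.py uses to produce that estimate isn't shown here, but one simple way is to collect a timestamp every frame and divide the frame count by the elapsed time:

```python
def estimate_frame_rate(timestamps):
    """Estimate frames per second from a list of per-frame timestamps
    (in seconds).  Returns 0.0 if there are too few samples."""
    if len(timestamps) < 2:
        return 0.0
    elapsed = timestamps[-1] - timestamps[0]
    return (len(timestamps) - 1) / elapsed if elapsed > 0 else 0.0

# A clean 60Hz run produces one frame every 1/60 s:
ts = [i / 60.0 for i in range(61)]   # 61 stamps spanning 60 intervals
```

If the estimate comes out well below your monitor's refresh rate, frames are being dropped and the noise sequence is no longer being displayed at the intended frequency.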
Coders may notice in the OpenGL initialisation code that I have tried to turn off all anti-aliasing features.
This was intended both to improve speed and to avoid blurring between neighboring virtual pixels.
However, at least on my machine, when I zoom in on the simple display, there is still a smooth
graduation between OpenGL pixels (possibly the default GL_LINEAR texture filtering is still in effect;
setting the texture magnification filter to GL_NEAREST may remove it).
A note on scaling of images in the draw function: I have scaled everything so that 1.0 represents the height of the viewing window.
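Under that convention, converting a coordinate into pixels just means multiplying by the window height. The helper below is my own sketch, not code from white.py, and it assumes square pixels (so the full window spans width/height units horizontally):

```python
def to_pixels(x_units, y_units, resolution=(640, 480)):
    """Map coordinates where 1.0 == the window height (the convention
    used in draw()) to pixel coordinates, assuming square pixels."""
    w, h = resolution
    return (x_units * h, y_units * h)

# half the window width-units across, full height up, at 640x480:
```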
A hint for use in Linux: some window managers allow a borderless window.
This is useful for two-monitor applications, where I have found it difficult to use the
pygame fullscreen mode.
People more familiar with X window servers may be able to convince pygame to use only one monitor,
but I cannot, so I opted to use TwinView mode in these situations and just drag the pygame window to the
monitor or monitors of interest.
Possible Extended Application
With a hacked 3 channel LED micro-mirror projector
(I believe there are currently 2 models available in the $2000 range)
it would be possible to extend the colour range of the sensor analysis into infrared, ultraviolet and
possibly polarised light.
The hack would require replacement of some or all of the high power LEDs with either LEDs of
a different spectrum or with variable monochromatic light sources
(you don't want to know how expensive these are).