
isetbio_v0.1's People

Contributors

davidbrainard avatar npcottaris avatar wandell avatar


isetbio_v0.1's Issues

Dependencies

A) Nicolas and I have been thinking about isetbio dependencies. We will set things up so that it is easy to copy routines that isetbio requires, such as those from PTB, into isetbio's 'external' directory. We are (intentionally) setting this up so that any change made to a PTB routine will be clobbered when we copy over -- that is, modifying the PTB routines supplied with isetbio is in effect not allowed. I think this is a good way to go, because when we work on isetbio we should treat PTB routines as read only. We can, of course, modify them if we want, but the way to do that is to let me know what needs to be changed, perhaps by emailing me a modified copy of the routine for review and possible incorporation into PTB. I think other ways of doing this will lead to madness. But now is the time to suggest something else if you disagree.

B) We have written some utilities that make it easy to take PTB and other local toolboxes off our path before running isetbio validations. We think this will 'smoke out' whatever oddball dependencies have crept in. We have this set up so we can also limit which Matlab toolboxes are on the path when we test -- we have them all on our machines by default because they come with the site license. So the question is: which Matlab toolboxes do we want to allow isetbio to depend on? Presumably signal, which it is hard to do much of anything without. Anything else? Once we know the answer to that, we can test.
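The path-stripping idea in (B) could look something like this sketch; the PTB install location and the name of the validation driver are assumptions, not the actual utilities we wrote:

```matlab
% Sketch: temporarily remove a toolbox from the path to smoke out
% hidden dependencies. The PTB location and the validation driver
% name are hypothetical.
ptbDir = '/Users/Shared/Matlab/Psychtoolbox';   % assumed install location
savedPath = path;                               % remember the current path
rmpath(genpath(ptbDir));                        % drop PTB and its subdirectories
try
    validateFastAll();                          % run validations without PTB
catch err
    fprintf('Validation failed without PTB: %s\n', err.message);
end
path(savedPath);                                % restore the original path
```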

C) Nicolas came up with a slick solution to the name space problem, in which you can point at a local routine temporarily when its name conflicts with something else on the path. I believe he has this illustrated in one of the validation scripts, where the xyz2lab routine in isetbio collides with a recently added Matlab function of the same name. We put the engine that handles this into PTB, as we are likely to want to call it from non isetbio code in our lab, and then copied it over into isetbio's PTB collection.
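I have not looked closely at the engine Nicolas wrote, but the general mechanism might be sketched like this (all names here are illustrative):

```matlab
% Sketch: call a local routine whose name is shadowed elsewhere on the
% path, by temporarily promoting its directory to the front of the
% path. The function and directory names are illustrative.
function result = callLocalVersion(localDir, funcName, varargin)
    oldPath = addpath(localDir);                 % addpath returns the prior path
    restorePath = onCleanup(@() path(oldPath));  % restore even on error
    fh = str2func(funcName);
    result = fh(varargin{:});                    % resolves to the local version
end
```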

Fast (hashed) validations not passing for Brian

The full validations work, but not the fast (hashed) ones. This may be fixed by reducing the precision to which we round before hashing, to 11 or perhaps 9 digits. I hope so. Otherwise we'll have to figure out what is going on. In that case, the next question might be whether the hash algorithm is stable across versions of Matlab.
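The rounding idea can be illustrated with a minimal sketch; the digit count is the 11-versus-9 choice discussed above, and the hashing step itself is omitted:

```matlab
% Round to a fixed number of decimal digits before hashing, so that
% numerical jitter below that precision cannot change the hash.
roundForHash = @(x, nDigits) round(x * 10^nDigits) / 10^nDigits;

a = 0.1 + 0.2;     % 0.30000000000000004 in double precision
b = 0.3;
isequal(a, b)                                        % false
isequal(roundForHash(a, 11), roundForHash(b, 11))    % true
```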

How can we set an arbitrary frequency support for the OTF?

The following code

        % Assumes a scene struct already exists in the workspace.
        % The pupil radius must be defined, e.g. a 1.5 mm pupil:
        pupilRadiusInMeters = 0.0015;
        oi = oiCreate('human', pupilRadiusInMeters);

        % Compute the optical image
        oi = oiCompute(scene, oi);

        % Get the optics
        optics = oiGet(oi, 'optics');

        % Set new OTF supports
        optics = opticsSet(optics, 'otffx', -200:1:199);
        optics = opticsSet(optics, 'otffy', -200:1:199);

        % Update the oi struct with the new optics struct
        oi = oiSet(oi, 'optics', optics);

        % Recompute the oi
(*)     oi = oiCompute(scene, oi);

        % Get the new OTF
        optics = oiGet(oi, 'optics');
        OTF = opticsGet(optics, 'otf data');

crashes at the line marked (*). This can be traced back to line 79 of customOTF(), which tries to interpolate the default OTF onto the desired support. The crash happens because the sizes of fx and fy (80x120), which are related to the optical image size, do not match the size of the default OTF (60x60).

If line (*) above is omitted the returned OTF is the default 60x60x31.

Related to this, shouldn't we compute a new OTF instead of interpolating the 60x60 OTF?

Thanks,
Nicolas

Please check t_colorSpectrum (quick thing for Brian)

I updated t_colorSpectrum to use display objects and displayGet. I think I figured out the isetbio-approved way of doing gamma correction for the displayed image, but since I was pretty much guessing, it would be worth your having a quick look at that section. Particularly check the comments. Note the comment about a unit convention difference for spectra between isetbio and PTB. At least I think I guessed correctly the convention used in isetbio.

The PTB convention is worse, I think, but is so deeply woven into all PTB calculations that it shouldn't be changed. It is mainly something to be aware of when importing spectral data from PTB-land, and can be handled at import time.

RTB3 uses the same convention as isetbio, because that is what the renderers use. We have import functions there that bring data in from PTB and handle the conversion, and this seems to work fine.
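For concreteness, here is a sketch of the import-time conversion, assuming (as I understand the conventions) that PTB specifies power per wavelength band while isetbio specifies power per nm; the data here are placeholders:

```matlab
% Hypothetical import of a PTB spectrum into isetbio units.
wls = (380:4:780)';                  % PTB wavelength sampling, 4 nm bands
deltaLambda = wls(2) - wls(1);       % band width in nm
spdPTB = ones(numel(wls), 1);        % power per band (placeholder data)
spdISETBIO = spdPTB / deltaLambda;   % power per nm
```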

Validation TODO list

Because it might be a little while before we come back to this, I am writing down my current TODO list for the validation code.

A) Move the UnitTest object to its own UnitTestToolbox in a separate repository. The motivation for this is that both Brian and I would like to use the UnitTest code in non-isetbio projects, and we may not always want isetbio on our Matlab path for such projects. Move the UnitTest documentation to that repository's wiki and point to it from isetbio.

B) Think a little bit about how best to manage the configuration of the UnitTest code for different projects. This will concern primarily where the validation data and scripts are stored. It may be that we should provide a UnitTestConfigTemplate.m file, and that this would be copied and modified for each project to allow appropriate settings. Currently the UnitTest object uses Matlab preferences for this sort of thing, and thus at any moment there is only one set of active preferences on a machine. Currently the config file approach is the best I can think of, and a local config file could be invoked by scripts that run a set of validations. When run in standalone mode, I don't think validation scripts rely in any way on preferences, so that they will run OK without the config being right. [If they do rely on local preferences in stand-alone mode, we may want to remove this dependence.] There might be better ideas about how to do this. We will continue to want to publish validation scripts with individual projects (e.g., isetbio scripts should get published on the isetbio page).

C) Validation data, both FAST and FULL, should be written at the same time, so that they remain consistent. Currently it requires a separate run of a validation script to generate each type of data.

D) FAST and FULL validation data should go into separate folders. And I think only the FULL data should keep the history; the FAST should just overwrite. We want to move the isetbio FULL validation directory outside of the isetbio repository - it is too big to keep there. Preferences should allow the empty string for the validation dir to mean 'don't write this sort of validation data'. The best way might be to have a separate repository 'isetbio-fullvalidationdata' and put it there. Then we can keep it up to date and those who want it can get it as needed. The FULL data is very useful for tracking down what is causing a validation failure, but not needed just to check that things are running correctly.

E) Change the names of 'validation/validationscripts' -> 'validation/scripts' and of 'validation/validationdata' -> 'validation/data'. The latter should just have the most recent FAST validation data (see D above).

F) Think through the help text on the high level validation scripts and what scripts we'd like to have as presets.

G) In individual validation scripts, add help text that shows how to pass arguments to control behavior and convert to the assert format that was recently added to UnitTest. For the former, the place to start is with the skeleton script. The help text can then be propagated along and a string replace used to change the name in each individual script when it is created. I think we want text that can be selected and executed for things like 'generate validation data for this script', 'publish this script'. But we need to keep in mind that for those options, the current preferences for UnitTest will be used.

H) Once we make these changes, we should rename the current isetbio repository to 'isetbio-ver0.1' and make a new clean repository 'isetbio' that has only the current dev branch as the master branch and no history. Then make a new dev branch to work on. This is because there is no (easy) way to get all the huge stuff out of the current repository. If we keep the ver0.1 version around, it will give us any history we need.
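The per-project configuration idea in (B) might be sketched like this; the preference group and field names are hypothetical:

```matlab
% --- UnitTestConfigTemplate.m: copy and edit per project ---
setpref('UnitTest', 'validationDataDir',    '/path/to/project/validationdata');
setpref('UnitTest', 'validationScriptsDir', '/path/to/project/validationscripts');

% A validation runner would then read the active settings back:
dataDir = getpref('UnitTest', 'validationDataDir');
```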

Help functions and documentation conventions

We realized it would be nice to have some form of help for methods (e.g., what are the methods for a scene object) that is a bit easier to use than calling help and reading through the long help text.

Haomiao figured out a potential way to get autocomplete to work on methods, by modifying some .xml file that Matlab uses.

It would be good to figure out if this can work, and whether we can autogenerate the right stuff so that changes in the help text make it through in some systematic way to the autocomplete.

Project branching model

Given today's discussion, I thought I'd share the project branching model that has worked for us in the past. This is probably similar to how things are working now, but I've found it helpful to write these details on the project wiki once the group comes to a consensus.

We generally have two main branches with infinite lifetime: master and develop.

- The master branch reflects the production-ready state (stable) and produces releases.
- The develop branch reflects the latest development state (unstable) and produces nightly builds (may not be applicable for this project).

No one should push directly to the develop or master branch. Instead, each contributor should create a "topic" branch from the develop branch and implement a feature there. When the feature is complete, the contributor creates a pull request to merge the topic branch into the develop branch. At this point the new code can be reviewed by the group before the pull request is accepted.

This is a simplified version of the branching model detailed here: http://nvie.com/posts/a-successful-git-branching-model/ .

Dependencies

I think we've decided that our goal state is for isetbio to require just the statistics and image processing toolboxes. Plus it can use the parallel processing support when that is available but should not crash without it.

We (Penn) will soon test whether the current validations run with just signal/image on the path. We may need to add stats and/or optimization, but we will see. We will pull in all the PTB stuff we use under external.
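The "use parallel support when available, but don't crash without it" requirement might be handled with a guard like this sketch (the worker function is hypothetical):

```matlab
% Check for the parallel toolbox before relying on it. The feature
% string for license() is 'Distrib_Computing_Toolbox'.
haveParallel = license('test', 'Distrib_Computing_Toolbox') && ...
               exist('parpool', 'file') > 0;
if haveParallel
    parfor ii = 1:10, results(ii) = doWork(ii); end   % doWork is hypothetical
else
    for ii = 1:10, results(ii) = doWork(ii); end      % serial fallback
end
```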

UnitTest issue

Running validateFastAll produced this error:

Error using UnitTest.initializePrefs
Method 'initializePrefs' is not defined for class 'UnitTest' or is removed from MATLAB's search path.

Error in validateFastAll (line 10)
UnitTest.initializePrefs();

I couldn't find initializePrefs in the dev branch inside of @UnitTest.

Validation scripts plan

This is a summary of my current thinking about validation scripts, based on what I started while at Stanford but won't have a chance to get back to for a week or so. Nicolas and I will then work on this.

a) I created a "validation" directory in isetbio (dev branch). This will have well-named subdirectories in it that will give a hint about what is being validated in each.

b) We will focus on the irradiance/isomerizations validation script(s) that compare PTB and isetbio computations as a start. I'd like to get a few of these into standard format and working with Nicolas' validation objects in a clean and clear way.
i) I think these should be written with a flag, passed as an argument, that either causes them to make plots or doesn't. Plots are great for debugging but slow if we're going to run the validations often.
ii) We will work on developing a convention about what should happen when a validation fails -- could be exit on error or could be carry on but not exit, so that all validations can be run through and those that failed flagged at the end. It is possible that how this works should be conditional on some flag.

c) There will be a top level validation script that will run all of our standard validations, as they develop.

d) We will try to set up some auto-publish feature, so that there appears on the web a list of validation programs and what they test. The idea would be to have this pretty much autogenerated from a standard form of comments in the validation scripts, so that the documentation only needs to be written once.

e) Once we get the basic structure into place, we can sort through the existing validation scripts and make them consistent with the structure we decide we like.

f) I think the validation scripts, although not quite tutorials, will be very useful as a way of showing how things get done.
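The failure-handling convention in (b-ii) could be sketched as a driver that either exits on the first error or collects failures and reports at the end, depending on a flag (all names here are illustrative):

```matlab
function failures = runValidations(scriptNames, exitOnError)
    % Run each validation script; either stop at the first failure or
    % carry on and report the failed ones at the end.
    failures = {};
    for ii = 1:numel(scriptNames)
        try
            feval(scriptNames{ii});
        catch err
            if exitOnError
                rethrow(err);                      % stop at first failure
            end
            failures{end+1} = scriptNames{ii};     %#ok<AGROW> collect and continue
        end
    end
end
```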

Psychtoolbox integration

We decided to make an external subdir for PTB stuff and put the needed PTB functions into there. This is set up, along with a +ptb directory.

a) We need a script that will autocopy from the PTB master to the isetbio master, so that the isetbio versions stay in register with the PTB master. It's possible this should detect any cases where the isetbio version has been edited, so that someone can think about folding the change into the PTB master. I think it's OK for PTB master versions just to flow quietly into the isetbio versions.

b) The package won't work for the whole thing, because if you call a PTB function as ptb.CheckWls and it in turn calls some other PTB function without the ptb. prefix, the call fails. Sigh. Also, the package directory can't have subdirs.

c) But the package directory can be useful for ptb wrapper functions that we call from isetbio. And maybe there is some better way to use packages in Matlab than I could figure out.

d) We need to modify our BrainardLab startup file so that there are clean options for taking PTB and other lab toolboxes off the path, so that we can test for unintentional dependencies in isetbio. May also want to arrange to have a path with only a few standard toolboxes that we are willing to rely on, rather than all of them as we standardly run in the Brainard Lab.

Nicolas and I will work on all this.
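The autocopy in (a), with the edit-detection wrinkle, might look roughly like this sketch; the directory variables are placeholders, and note that a content comparison flags any difference, whether from a local edit or an upstream change:

```matlab
% Copy one routine from the PTB master into isetbio's external area,
% warning first if the destination differs from its current contents.
ptbFile  = fullfile(ptbMasterDir, 'CheckWls.m');                  % placeholder dirs
isetFile = fullfile(isetbioDir, 'external', 'ptb', 'CheckWls.m');
if exist(isetFile, 'file') && ~isequal(fileread(isetFile), fileread(ptbFile))
    warning('%s differs from the PTB master; review before clobbering.', isetFile);
end
copyfile(ptbFile, isetFile);   % PTB master flows into the isetbio copy
```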

ieLUTInvert etc.

In going through the tutorials I have been trying to eliminate direct loads of files that were in the cMatch subdirectory of scripts/color, and replace these by calls to either ieReadSpectra or displayCreate.

One of the reads was of a monitor inverse gamma table (in monitorGam, variable monitorInvGam, a 1000 by 3 inverse gamma table for a typical monitor).

I thought I could get this by

d = displayCreate;
monitorGam = displayGet(d,'gamma');
monitorInvGam = ieLUTInvert(monitorGam,8)

where the 8 makes a reasonable size gamma table.

But, the returned inverse table has only one column. In addition, plotting it looks like a straight line, suggesting it isn't an inverse lookup table.

The example code given in the help text for ieLUTInvert doesn't make sense to me. First, it accesses a display field directly which seems to violate our preferred style. Second, it contains a raise of the gamma table to the exponent 0.6, which I don't get.

I suspect I'm not understanding something basic about the way isetbio handles all this sort of thing.

The two relevant tutorials are t_colorMatching (which actually seems to be working OK, though I no longer understand why) and xNeedChecking/display/t_displayRendering. Once we have this sorted out, we should check the way ieLUTInvert is used in the other tutorials I might have modified (I can do this), because I'm no longer sure I deduced correctly how this stuff works.
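For reference, here is how I would expect an inverse gamma table to be built per channel with interp1. This is a sketch of the generic idea, not necessarily what ieLUTInvert actually does, and it assumes each channel's gamma curve is monotonic:

```matlab
d = displayCreate;
gammaTable = displayGet(d, 'gamma');             % nLevels x 3
nOut = 2^8;                                      % 8-bit inverse table
linearVals = linspace(0, 1, nOut)';
settings = linspace(0, 1, size(gammaTable, 1))'; % normalized DAC settings
invGamma = zeros(nOut, 3);
for ch = 1:3
    % For each desired linear output, find the setting that produces it.
    invGamma(:, ch) = interp1(gammaTable(:, ch), settings, ...
                              linearVals, 'linear', 'extrap');
end
```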

s_initISET does not clean up

This may be a 2014b issue only. Not sure.

Example follows.

>> clear all

>> close all

>> v_sceneFromRGB

>> whos

>> whos global
  Name           Size                Bytes  Class     Attributes
  vcSESSION      1x1             139052945  struct    global    

>> s_initISET
(windows still open)

>> whos
  Name           Size               Bytes  Class     Attributes

  d              0x1                  320  struct              
  vcSESSION      1x1             69546115  struct    global   

>> vcSESSION.SCENE{1}

ans =

     []

>> vcSESSION.SCENE{2}

ans = 

type: 'scene'
data: [1x1 struct]
name: 'eagle - LCD-Apple'
spectrum: [1x1 struct]
illuminant: [1x1 struct]
distance: 0.5000
wAngular: 2.2436
filename: '/Users/Shared/Matlab/Toolboxes/isetbio/isettools/data/images/rgb/eagle.jpg'

This does not always happen. For example, it does not happen with
v_sceneReIllumination.m.

sceneGet minor

A) I couldn't find the 'photons' get in the help for sensorGet, although using it did work.

B) I had trouble with using ROI based gets for photons, and in general am a bit unsure of ROI gets and when they take rect input and when they take a list of locations. This may be slightly more general than just sceneGet.
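As a point of reference for the rect-versus-locations question, I believe the conversion helpers look like this, though the signatures should be double-checked:

```matlab
% Sketch: convert a rect to a list of locations, then do a
% location-based get. ieRect2Locs and vcGetROIData are the helpers I
% believe isetbio uses for this; treat the signatures as assumptions.
rect = [10 10 4 4];                      % [col row width height]
roiLocs = ieRect2Locs(rect);             % N x 2 list of (row, col) locations
roiPhotons = vcGetROIData(scene, roiLocs, 'photons');   % per-location spectra
```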

New MATLAB class for validating ISETBIO computations

Introduction

The issue of how to validate ISETBIO computations has been raised a few times. ISETBIO includes several 'v_*' scripts which can be used to check whether a component is broken. These scripts, although extremely useful, cannot effectively detect a scenario where a piece of ISETBIO code still runs (i.e., does not crash) but produces different results than earlier versions.

A new Matlab class for handling code validation

Toward this end, I developed a new MATLAB class, named @UnitTest, for validating and managing results obtained by any validation script. Validation scripts managed by a @UnitTest object can be used not only to detect if code crashes, but also to probe the outcomes of the computations against a ground truth data set, which itself is also managed by the @UnitTest object. The ISETBIO team would have to agree on when a data set should be added to the ground-truth data set. The @UnitTest class is committed to the Penn branch and is located in isetbio/isettools/scripts/validate/@UnitTest.

Details on the operation of the @UnitTest class

  1. A @UnitTest object executes identified code segments (which I call probes) in a validation script, and stores the results obtained. If the code in a probe crashes, the raised exception is captured and the probe is labeled accordingly, without interrupting the flow of the validation script.
  2. A @UnitTest object assembles the outcomes of all probes in the validation script and appends the result to a validation data file whose name is derived from the validation script name. In other words, each validation script is associated with a unique validation data set.
  3. The user can select that the data obtained by a particular validation run will serve as ground truth. Ground truth data are appended to a different file, whose name is also derived from the validation script name, so each validation script is associated with a unique ground truth data set.
  4. The user can also select to either append the obtained data to the existing data set history, or overwrite the existing history and start a new one. This can be done for both the validation and the ground truth data sets.
  5. @UnitTest objects can contrast any data set from the recorded validation history against any data set in the recorded ground truth history. The user is presented with the entire history of validation runs and ground truth runs and can select which validation run is to be contrasted against which ground truth run.
  6. The data comparison process performed by @UnitTest objects is agnostic to the nature/structure of the data. @UnitTest objects recursively traverse the data (usually structs) that are to be contrasted and compare corresponding fields. If validation and ground truth data are not identical, either in the field names or the field values, a message is generated about the mismatch. If the mismatched field values are numerical and number more than 10, they are plotted against each other for visual comparison. This can be useful in cases where the mismatch is in the 'photons' field, which is a 3D matrix.
  7. @UnitTest objects store various information about each run, including date, machine architecture, matlab version, git branch (not yet implemented), and even a snapshot of the validation script at the time that it was run. If the selected validation and ground truth data do not match, the @UnitTest object displays the corresponding validation scripts side by side, with lines in which the scripts differ labeled as such.
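The recursive comparison in point 6 could be sketched as follows, for scalar structs; the function and field names are illustrative, not the actual @UnitTest code:

```matlab
function mismatches = compareStructs(a, b, prefix, mismatches)
    % Recursively traverse two scalar structs and record where their
    % field names or field values differ.
    if nargin < 4, mismatches = {}; end
    fieldsA = fieldnames(a);
    for ii = 1:numel(fieldsA)
        f = fieldsA{ii};
        if ~isfield(b, f)
            mismatches{end+1} = sprintf('%s.%s missing', prefix, f);   %#ok<AGROW>
        elseif isstruct(a.(f)) && isstruct(b.(f))
            mismatches = compareStructs(a.(f), b.(f), [prefix '.' f], mismatches);
        elseif ~isequal(a.(f), b.(f))
            mismatches{end+1} = sprintf('%s.%s differs', prefix, f);   %#ok<AGROW>
        end
    end
end
```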

Example of integrating a @UnitTest object with a validation script

The following code shows how a @UnitTest object can be integrated in a validation script.

%% Initialize ISET
s_initISET

%% Initialize a @UnitTest object to handle the results
unitTestOBJ = UnitTest(mfilename('fullpath'));

%% Add a unit test probe

% First step: define a string that includes the code segment that we wish to validate. 
% All commands must be in quotes, and the last statement must assign 
% the data of interest (i.e., what will be contrasted against the ground truth 
% data set) to a variable named 'result'.
multiLineCodeSegment = [ ...
' parms.angles = linspace(0, pi/2, 5);' ...
' parms.freqs  =  [1, 2, 4, 8, 16];' ...
' parms.blockSize = 64;' ...
' parms.contrast  = .8;' ...
' s = sceneCreate(''frequency orientation'', parms);' ...
' result = s;' ...      % store the generated scene struct for validation against ground truth
];

% Second step: tell the @UnitTest object to add a probe with the above code 
% segment and give it a unique name. This name will be used to identify this probe 
% and to compare the associated data to the ground-truth data.
unitTestOBJ.addProbe(...
'name',  'frequency/orientation',...    % a unique name to identify this probe
'commandString',  multiLineCodeSegment ...   % code segment to be executed
);

%% Add more probes
...

%% Save this data set. 
% Here, you will be given options whether you wish to save these data as validation,
% as ground  truth, or both, and whether you wish to append to, or overwrite the 
% recorded history.

unitTestOBJ.saveData();

%% Compare validation to ground truth data

unitTestOBJ.verbosity = 0;
result = unitTestOBJ.validateAgainstGroundTruth();

Extended example

For an extended example, please see isetbio/isettools/scripts/validate/v_sceneUnitTest.m under the Penn branch.

Here are some things to try:

  • Run the script as is. All probes should validate OK against the current ground truth data set, except for the 'Moire Orient' probe, which raises an exception (Undefined function 'MOTarget' for input arguments of type 'struct')
  • Try changing the v_sceneUnitTest.m script. For example, in the probe named 'Slanted Bar', change the 'edgeSlope' parameter from 1.3 to 1.5. Re-run the script using the default options. You will get a message that the command strings for the 'Slanted Bar' probe are different, and that the 'photons' fields have different values. A plot with the validation data set, the ground truth data set, and the difference should also be generated. If you choose to look at the scripts, you will see that the changed line is marked with a DIFF.

Several scripts in isettools/scripts/optics (mostly related to ray tracing) not running

  1. s_opticsDepthScene.m -- First, piano3d.mat was under the scenes directory, not the scene3D directory. After this was fixed, it bombs at line 73, traced to ioDepthCombine (line 37).
  2. s_opticsDLPsf.m -- Line 139: Index exceeds matrix dimensions.
  3. s_opticsMicrolens.m -- Only runs if you have a microlens license.
  4. s_opticsRTGridLines.m -- Line 72: Undefined function 'rtGeometry'.
  5. s_opticsRTPSF.m -- Traced back to opticsRayTrace: Ray Trace routines not found.
  6. s_opticsRTPSFView.m -- Line 17: Undefined function 'rtSynthetic'.
  7. s_opticsRTSynthetic.m -- Line 24: Undefined function 'rtSynthetic'.
  8. t_oiRTCompute -- Traced back to opticsRayTrace: Ray Trace routines not found.
  9. t_opticsBarrelDistortion.m -- Traced back to opticsRayTrace: Ray Trace routines not found.

checkerboard.m

We closed the checkerboard.m issue by removing isetbio's version of this function. That is fine with me. Note, however, that the version in the image processing toolbox did something different from isetbio's version (since the validation failed to produce the same answer).

I just want to confirm before we leave this that we are happy with this change in behavior, and that exactly what checkerboard returns isn't crucial, as long as it is basically a checkerboard.

Validation script standalone behavior broken

When I was doing validation work last week, validation scripts behaved well when run standalone. That is, if I just (for example) ran v_Colorimetry, it would behave nicely, printing out all its messages and allowing me to see what it was doing.

Today, on my home machine at least, that behavior has changed. It runs but is silent. It does return a big cell array (which is ugly and not what it was doing before).

Does anyone know why this behavior has changed?

I think it is very important that it be possible to run validation scripts one at a time and have them be useful entities, and for this to happen by default without having to remember some set of arguments.

gather function for gpuarray produces error in oiPad on system without parallel toolbox

Not sure what checking has to happen. But v_ISETBIO now fails for me.

We need a way to establish the presence of special toolboxes and to protect against these failures.

The error arises from the introduction of the 'gather' function for gpu computing. The try/catch can get into a state where padval is not set. (Error message below.)


Undefined function or variable "padval".

Error in oiPad (line 51)
[r,c] = size(padarray(photons(:,:,1),padSize,padval,direction));

Error in opticsOTF>oiApplyOTF (line 79)
oi = oiPad(oi, padSize, sDist);

Error in opticsOTF (line 46)
oi = oiApplyOTF(oi, scene, 'mm');

Error in opticsSICompute (line 64)
oi = opticsOTF(oi,scene);

Error in oiCompute (line 78)
oi = opticsSICompute(scene,oi);

Error in v_oi (line 19)
oi = oiCompute(oi,scene);

Error in v_ISETBIO (line 27)
v_oi
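One defensive pattern that would avoid the unset-padval state is to give padval a default before the try/catch. The details of oiPad's internals here are assumptions based on the error trace above, not the actual code:

```matlab
% Sketch of the pattern, not the actual oiPad code: ensure padval has
% a value before the try/catch so a failure in the gpu branch cannot
% leave it undefined.
padval = 0;                                  % safe default
try
    if isa(photons, 'gpuArray')
        photons = gather(photons);           % requires the parallel toolbox
    end
catch
    % gather unavailable (no parallel toolbox); carry on with the default
end
[r, c] = size(padarray(photons(:,:,1), padSize, padval, direction));
```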

oiGet enhancement

We should have an oiGet(oi, 'irradiance roi', roi) feature in oiGet.

The functionality is present in plotOI, within the subroutine plotOIIrradiance.

If the enhancement is missing from sceneGet, which it seems to be, we should add it there too, as sceneGet(scene, 'radiance roi', roi).

The ROI possibility should be there for luminance (scenes) and illuminance (oi) as well.

function matfile in validation is relatively recent

The matfile() function doesn't run in 2011a.

If you don't mind replacing, then let's do it for backward compatibility. Not needed, though, if the replacement is ugly.

v_Colorimetry runs in 2011a, but not the validateFastAll form that goes through matfile.
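A backward-compatible replacement might branch on availability (matfile was introduced in R2011b); the file and variable names here are placeholders:

```matlab
% Use matfile for lazy, partial-load access where available;
% fall back to a plain load of the named variable otherwise.
if exist('matfile', 'file')
    m = matfile('validationData.mat');
    groundTruth = m.groundTruth;
else
    s = load('validationData.mat', 'groundTruth');   % works in older releases
    groundTruth = s.groundTruth;
end
```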

Shouldn't ISETBIO be open?

Did we decide to keep the repository closed for a while? Or do we want to make it publicly viewable already? I sent Franco Pestilli to the site, and it was closed to him, which is how I realized it is not open.

I don't remember what we decided. If it should be open, and we just forgot, I will do it. Otherwise, just close this issue and we can open it up when we are all ready.

sceneSet(scene, 'illuminant', illuminant) does not seem to have an effect on the scene radiance

Use code below to re-create and visualize this issue.

    % Generate Macbeth scene with fluorescent illuminant
    fluorescentScene    = sceneCreate('macbethfluorescent');
    illuminantPhotons   = sceneGet(fluorescentScene, 'illuminantPhotons');
    peakRadiance        = sceneGet(fluorescentScene, 'peakRadiance');
    photonRadianceMap   = sceneGet(fluorescentScene,'photons');
    wavelengthSampling  = sceneGet(fluorescentScene, 'wave');

    figure(11); clf;
    subplot(3,3,1); hold on;
    plot(wavelengthSampling, illuminantPhotons, 'r-');
    plot(wavelengthSampling, peakRadiance, 'k-');
    legend('illuminant photons', 'peak radiance');
    set(gca, 'FontSize', 12);
    title('fluorescent macbeth');

    wavelengthSubSamplingInterval = 2;
    subplot(3,3, [2 3]);
    plotRadianceMap(photonRadianceMap, wavelengthSampling, wavelengthSubSamplingInterval, 'Radiance')
    set(gca, 'FontSize', 12);

    % Generate Macbeth scene with D65 illuminant
    d65Scene            = sceneCreate('macbethd65');
    illuminantPhotons2  = sceneGet(d65Scene, 'illuminantPhotons');
    peakRadiance2       = sceneGet(d65Scene, 'peakRadiance');
    photonRadianceMap2  = sceneGet(d65Scene,'photons');
    wavelengthSampling2 = sceneGet(d65Scene, 'wave');


    subplot(3,3,4); hold on;
    plot(wavelengthSampling2, illuminantPhotons2, 'r-');
    plot(wavelengthSampling2, peakRadiance2, 'k-');
    legend('illuminant photons', 'peak radiance');
    set(gca, 'FontSize', 12);
    title('D65 macbeth');

    subplot(3,3, [5 6]);
    plotRadianceMap(photonRadianceMap2, wavelengthSampling2, wavelengthSubSamplingInterval, 'Radiance')
    set(gca, 'FontSize', 12);

    % Change illuminant in macbeth d65 scene
    fluorescentIllum    = sceneGet(fluorescentScene, 'illuminant');
    d65Scene            = sceneSet(d65Scene, 'illuminant', fluorescentIllum);
    illuminantPhotons3  = sceneGet(d65Scene, 'illuminantPhotons');
    peakRadiance3       = sceneGet(d65Scene, 'peakRadiance');
    photonRadianceMap3 = sceneGet(d65Scene,'photons');
    wavelengthSampling3 = sceneGet(d65Scene, 'wave');

    subplot(3,3,7); hold on;
    plot(wavelengthSampling3, illuminantPhotons3, 'r-');
    plot(wavelengthSampling3, peakRadiance3, 'k-');
    legend('illuminant photons', 'peak radiance');
    title(sprintf('D65 macbeth re-illuminated \nwith fluorescent illuminant'));
    set(gca, 'FontSize', 12);

    subplot(3,3, [8 9]);
    plotRadianceMap(photonRadianceMap3, wavelengthSampling3, wavelengthSubSamplingInterval, 'Radiance')
    set(gca, 'FontSize', 12);



function plotRadianceMap(radianceMap, wavelengthSampling, wavelengthSubSamplingInterval, titleText)
    [X,Y,Z] = meshgrid(1:size(radianceMap,2), wavelengthSampling, 1:size(radianceMap,1));
    radianceMap = permute(radianceMap, [3 2 1]);
    minRadiance = min(radianceMap(:));
    maxRadiance = max(radianceMap(:));
    radianceMap = radianceMap/maxRadiance;
    h = slice(X,Y,Z, radianceMap, Inf, wavelengthSampling(1):wavelengthSubSamplingInterval:wavelengthSampling(end), Inf, 'nearest');

    for n = 1:numel(h)
        a = get(h(n), 'cdata');
        set(h(n), 'alphadata', 0.1*ones(size(a)), 'facealpha', 'flat');
    end

    shading flat

    axis 'image'
    set(gca, 'ZDir', 'reverse', 'Color', [1 1 0.6]);
    set(gca, 'FontName', 'Helvetica', 'FontSize', 14, 'FontWeight', 'bold');
    xlabel('x'); ylabel('wavelength'); zlabel('y');
    title(titleText);
    colormap(hot(256));
    colorbar('vert', 'Ticks', [min(radianceMap(:)) max(radianceMap(:))], 'TickLabels', [0 1.0]*(maxRadiance-minRadiance) + minRadiance);

    box off;
    grid off;
end

Updated approach for ISETBIO validations

The @UnitTest class has been modified extensively to support validation functions and to automatically push validation results to github (https://github.com/isetbio/isetbio/wiki/ValidationResults).

An executive function (validateAll.m) specifies which functions are to be validated in a run. After instantiating a @UnitTest object, validation functions are added as follows:

unitTestOBJ.addProbe(...
  'name',                'validate quantity X', ... % label to identify the probe
  'functionSectionName', '1. Cool validations ',... % validation section name
  'functionName',        'validateQuantityX', ...   % name of the validation script
  'functionParams',       struct(), ...             % struct with input arguments for validation script
  'onErrorReaction',      'CatchExcemption', ...    % runtime error reaction.
  'publishReport',        true, ...                 % if set to true, push validation results to gitHub
  'showTheCode',          true, ...                 % If set to true, include the validation code
  'generatePlots',        true ...                  % if set to true, allow plot generation
);

Once all validation probes are run, results produced by the successful probes are pushed to github. A summary report for all validation probes (successful or not) is also transmitted.

We have successfully tested this validation paradigm from two different machines at UPenn (both running OSX). Please try running validateAll() on your site (perhaps after changing some parameters in the included validation functions) to see what (if any) issues may arise. If you want, you can add your own validation function. There is a skeleton function (validateSkeleton.m) that contains the minimum amount of necessary code. You can copy this file, adapt it to your own needs, and then add it to the validation run using the 'unitTestOBJ.addProbe(...)' method. It is preferred that validation functions are placed under a subdirectory of ISETBIO/validation/validationScripts/ in order to organize the various scripts. Currently there are three such subdirectories.

The @UnitTest class will soon be able to save select data sets from successful validation probes so that we have a history of validation run data sets, perhaps performed on different machines, different dates etc., as well as one or more ground truth data sets. These data files will undoubtedly get very large over time, so they cannot be saved on github. One idea is to have them stored on our SVN server at UPenn. The @UnitTest object would check out the latest versions of these files, update them with the new results, and push them back to the SVN server. Access to these files would be read/write for ISETBIO team members and read-only for everyone else. If you have alternative ideas, please submit them for discussion.

Thanks,
Nicolas

assert, assertIsZero

I added two methods to UnitTest, assert and assertIsZero. These simplify the code in validation scripts for the most usual sorts of checks we are likely to make. The latter takes a tolerance.

I updated v_skeleton to illustrate how these are used, and kept the longer methods there but commented out. The longer methods might be useful in some cases.

We can chip away at converting the usage in extant validation scripts.

Nicolas, can you have a quick check that the way I coded these methods doesn't violate some basic principle of how methods are supposed to look in Matlab objects, and then close this issue?
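A hypothetical sketch of the intended usage (the exact method signatures are my assumption and should be checked against v_skeleton, which has the canonical examples):

```matlab
% Hypothetical usage; see v_skeleton for the authoritative pattern.
tolerance = 1e-10;
UnitTest.assert(isequal(size(photons), size(photonsCheck)), ...
    'photon maps have matching sizes');
UnitTest.assertIsZero(max(abs(photons(:) - photonsCheck(:))), ...
    'photon maps agree to tolerance', tolerance);
```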

ieXYZ2LAB, ieLAB2XYZ, v_Colorimetry

I changed the function names and tracked down places in isetbio where they were used, changing those as well. Now, I have to do this for ISET, sigh.

Anyway, v_Colorimetry no longer uses the override function, so it now works on my computer. That closes off another issue. I wonder if we should ever use the override ... or just stay out of Matlab's way?

Tutorials/scripts clean up

A) I moved all of the tutorials (in directory tutorials) into a subfolder xNeedChecking, and same with all scripts (in directory scripts). This was already set up in validationscripts, although I changed the name of the subfolder to match.

The idea is that we should, eventually, go through each tutorial/script/validation in the xNeedChecking folder, make sure they run, clean them up, and then move them into the appropriate subdirectory directly under tutorials/scripts/validationscripts.

This way, we can be reasonably confident that when a user executes a tutorial/script/validation it will actually run. [Based on a very small n of about 5 tutorials I cleaned up last night and this morning, I'd say the current probability that one will run and be consistent with current isetbio usage is about 0.6, but in general the needed fixes are very small.]

B) I find that the cleanup and checking generally goes fast and can be done as recreational activity while watching TV. I am going to try to keep going, but others may join in the fun if they like. I don't think perfection should be the goal, but that at the very least they should all run without throwing an error.

C) Tutorials are conceptually distinct from validations in that tutorials are meant to illustrate how to do things, either in isetbio or in terms of teaching underlying concepts. Tutorials should not save or compare to ground truth data or contain any UnitTest stuff. But they should be well commented and set up to publish nicely. I am not currently checking this latter fact as I work through them.

D) I do think we want a mechanism to autopublish all of our tutorials, using a scheme parallel to the one we use for validation scripts. Nicolas, can you set this up? There should just be a publish-all-tutorials script that descends the tutorials subdirectories (except for xNeedChecking) and publishes them by category. We can use the same Info.txt scheme, as well as the same first-comment-line-as-summary scheme, as for validations.

E) I think that scripts may be a category we want to get rid of, with the idea that each script should either be well commented and become a tutorial, or be brought into the UnitTest mode and become a validation. Some scripts may simply not belong in isetbio because they were really part of some substantive iset project that is outside the scope of the core isetbio repository itself. If candidates for the latter are encountered, perhaps they should be moved temporarily to a separate subdir of scripts ('xMoveOut') for such consideration.

Git Pre-commit Hooks Setup Instructions

Introduction

Git pre-commit hooks can run the unit tests automatically on every git commit. When 'git commit' is issued in a command window, all of the unit test output is printed. When the commit is issued from the GitHub GUI application, the tests still run, but no output is visible.

Setup Step-by-Step

This section explains how to set up the pre-commit hooks for ISETBIO. It should take about 10 minutes.

Initialize Git Repository

At this stage, we assume that you already have git (version 1.8.2 or later) running on your machine; if not, you can download it from the git website.
In a command window (terminal), change the current working directory to the ISETBIO root folder and initialize (or re-initialize) git there. For example:

# Init git folder
$ cd ~/isetbio
$ git init

Create Pre-commit Hooks

In a command window (terminal), change the current working directory to the hooks subdirectory of ISETBIO's .git folder (e.g. ~/isetbio/.git/hooks), where you will see a number of example hooks. Create a new file called pre-commit (with no extension) in this folder and copy the following lines into it.

# pre-commit.sh
echo Start pre-commit testing...
git stash -q --keep-index
./.git/hooks/run_tests.sh
RESULT=$?
git stash pop -q
[ $RESULT -ne 0 ] && exit 1
exit 0

Set Path for Shell Commands

In isetbio/.git/hooks, create a file called run_tests.sh (this is what pre-commit calls). Copy and paste the following lines into the file:

# Setup alias
matlab="/Applications/MATLAB_R2014a.app/bin/matlab"

# Run unit test in matlab
cmd="cd ../../utility/unit\ test/; unitTest;"
"$matlab" -nodesktop -nosplash -nodisplay -r "$cmd"
if [ "$?" == "0" ];
then
    echo Unit test passed!
    exit 0
else
    echo Unit Test Failed
    exit 1
fi

Change the matlab variable on the first line to the path of your local MATLAB executable. Also remember to make the file executable, which you can do with:

$ chmod u+x run_tests.sh

Adding more unit test modules

To add more testing functions, create a Matlab function with no output arguments, then add a call to it inside the try-catch structure in unitTest.m.
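A sketch of what the try-catch structure in unitTest.m presumably looks like (myNewTest is a placeholder for your own test function, not an existing routine):

```matlab
% Hypothetical sketch of unitTest.m; myNewTest stands in for a test
% function with no output arguments.
try
    myNewTest();    % add new test function calls here
    fprintf('All unit tests passed\n');
    exit(0);        % zero status lets run_tests.sh allow the commit
catch err
    fprintf('Unit test failed: %s\n', err.message);
    exit(1);        % nonzero status makes run_tests.sh abort the commit
end
```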

Notes

  • Only tested on Mac: These instructions have only been tested on Mac. They should also work on Linux machines; Windows machines may need some additional hacks.

  • Bypass the pre-commit hook: To skip the automatic unit test in some situations, you can use

    $ git commit --no-verify

  • GUI vs Command Window: The pre-commit hook is triggered both by the GitHub GUI and by terminal commands. The difference is that in the terminal you can see the test progress and any error messages.

isetbio script organization

There is isettools/scripts/ and it has many things including a subdirectory called validate. This makes sense to me.

Then there is scripts, which contains things with names like calibration, oneOverF, s_cones2RGC, s_scene2Cones, and toronto.

I am not grokking the logic of the division.

Same with isettools/utility versus utility.

I am feeling a huge desire to clean all this up, but thought it would be good to agree on the principles first, if possible. In any case, I will probably charge away on the (new) "Penn" branch tomorrow.

validate ...

Validation is not running for me, and I am puzzled how it runs for you.

When I run validateFastAll, it arrives at this line in validate.m

while (scriptIndex < numel(obj.vScriptsList)) && (~abortValidationSession)

The scriptIndex is 0, which is probably correct. But obj.vScriptsList is empty, so the body of the while() loop never executes.

I think the reason is because the validate.m file has as its arguments

validate(obj, vScriptsToRunList)

and obj.vScriptsList is not set. It appears (just guessing) that you mean to be checking scriptIndex against vScriptsList (which is a cell array and has lots of good stuff in it). But checking against obj.vScriptsList doesn't work out.

So, I have said much more than I should at this point. I know things are working better for you than the complete failure mode on my computer; I diagnosed it up to this point, but probably there is something goofy about my setup (dev branch? Something).

I did this much debugging and thought I should document and kick it over to you guys.

The bug in v_sceneFromRGB

To try to track down why v_sceneFromRGB was giving different answers on the Penn and Stanford machines, I thought it would be clever to add to that script code that computes the value of the offending field by hand, and ask whether this gave the same answer as the value in scene structure (which is itself the same as what is returned by the call to sceneGet(scene,'wangular')).

That led me to conclude that sceneFromFile has a bug. Line 134 of sceneFromFile sets the horizontal field of view (aka wangular) by hand computing the horizontal size of the image in degrees, using the horizontal size as size(I,2).

Here I is the passed image argument, and you might think this would work. But in this case, I is the string giving the image filename, not the image data itself. The image string is 72 characters long, which is then taken as the number of horizontal pixels in the image.

I believe that the Stanford machines get a different answer for wangular because their full path is different from ours, and thus a different FOV is set.

I changed the offending code bit to size(photons,2), since photons contains the image data.
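The failure mode is easy to reproduce at the MATLAB prompt: when I holds a filename rather than image data, size(I,2) returns the length of the character array, not a pixel count. (The filename and array sizes below are made up for illustration.)

```matlab
% Illustration of the bug: size() applied to a filename string.
I = 'Macbeth_ColorChecker.jpg';   % a filename string, not image data
size(I, 2)                        % returns the number of characters, not pixels
photons = rand(128, 192, 31);     % stand-in for actual photon data
size(photons, 2)                  % returns 192, the true horizontal size
```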

I remade the validation data for v_sceneFromRGB and v_sceneReIllumination. With luck, these may now validate reliably on all machines.

If these validations now pass on Brian's machines, I think this issue can be closed.

2014b namespace collision

Matlab 2014b has added a routine called xyz2lab, whose calling conventions differ from isetbio's (and it may not work the same, I didn't look).

I updated PTB's PTBAndIsetbioColorimetryTest (it failing is how I found this) so that it explicitly cd's into the appropriate isetbio directory when it tests this routine.

Possible solutions: i) (quick and dirty, but brittle) put isetbio in front of Matlab's toolboxes on the path; ii) (less brittle but painful) change the name of the isetbio routine; iii) always call the isetbio routine through some sort of wrapper; iv) replace calls to isetbio's routine with calls to PTB's routine XYZToLab; v) write some sort of isetbio colorimetry object and make each current routine a method, so they would be called as (e.g.) isetbio.xyz2lab.
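As a sketch of option iii, a wrapper can temporarily cd into a given directory so that its copy of a function shadows the toolbox copy (the helper name and calling pattern here are made up; Nicolas's path-override solution may already handle this more elegantly):

```matlab
function varargout = callInDir(dirPath, functionName, varargin)
% Hypothetical wrapper: temporarily cd into dirPath so its version of
% functionName shadows any same-named Matlab toolbox routine.
    oldDir = cd(dirPath);
    restoreDir = onCleanup(@() cd(oldDir));   % restore directory even on error
    [varargout{1:nargout}] = feval(functionName, varargin{:});
end
```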

Tutorials, status

I've now been through all of the tutorials. Most ran fine or were easily fixed to run. These are all now filed in their subfolders under tutorials. I focussed mostly on making sure they ran, with a bit of tidying up.

I did not work hard on the comments or on merging together things that seemed redundant with each other. Some of that might be done at some point, but I think the set of tutorials is now in pretty good shape.

Tutorials that I couldn't or wasn't willing to fix are now in a top-level isetbio folder, xNeedFixingOrDeleting/tutorials. I put a note at the top of each (there are only about 5) as to what the issue was. I suspect several can simply be deleted without harm, but Brian will be the best judge of that. t_opticsImageFormation, however, is a really nice piece of code; it just needs the data it reads brought into isetbio format, but other than that it is ready to go.

I will move on to filtering through the scripts in the same way as I've done with the tutorials when I get a chance. And Nicolas and I will work on an autopublish scheme for the tutorials, as well as writing something that makes it easy to verify in the future that they all run without error.

OTF and pupil size

The following panel assemblies depict the human OTF as a function of pupil diameter from 1.0 mm to 6.0 mm. The code used to generate these panels is attached at the end.

(1) Perhaps the code is not set up correctly, but if it is, I think that the built-in OTF does not do a very good job at low frequencies. It needs to be computed on a much finer grid. Which brings me to my earlier issue (#26) of how to specify a fine support grid for the OTF.

(2) Also, assuming I did not do something stupid in the code, there appears to be significant noise in the OTF for certain pupil sizes (e.g., diam = 3.0 mm). Is this expected? I think this is also due to the coarse support grid.

(3) Finally, shouldn't the amplitude of the human OTF be much higher than what is shown here? I am comparing these curves to those in Fig. 2 of Watson's JOV (2013) 13(6) paper
(http://www.journalofvision.org/content/13/6/18.short?related-urls=yes&legid=jov;13/6/18)
Is this attenuation due to another optical element, like a diffuser? The diffuser method is shown as 'skip'.

Nicolas

testotf_pupildiam_1to3mm

testotf_pupildiam_3.5to6mm

Here is the code:

function testOTF
    s_initISET;

    h = figure(1);
    set(h, 'Position', [100 100 650 760]);
    clf;

    % Pupil diameters to test. Run once with each of the two ranges below
    % to produce the two panel assemblies (the second assignment
    % overrides the first).
    pupilDiametersInMillimeters = (1:0.5:3.0);
    pupilDiametersInMillimeters = (3.5:0.5:6);

    for pupilSizeIndex = 1:numel(pupilDiametersInMillimeters)

        %% Retrieve examined pupil radius 
        pupilRadiusInMeters = pupilDiametersInMillimeters(pupilSizeIndex)/2.0/1000.0;

        %% Create human optics with given pupil radius
        optics = opticsCreate('human', pupilRadiusInMeters);

        %% Initialize optical image with above optics
        oi = oiCreate('human');
        oi = oiSet(oi, 'optics', optics);

        %% Compute optical image for given scene
        scene = sceneCreate('macbethd65');
        %% Make the scene angular size = 1 deg and place it at a distance = 1.0 m
        sceneAngularSizeInDeg = 0.5;
        sceneDistanceInMeters = 1.0;
        scene = sceneSet(scene,'wangular', sceneAngularSizeInDeg);
        scene = sceneSet(scene,'distance', sceneDistanceInMeters);

        %% Compute optical image
        oi = oiCompute(scene,oi); 

        %% Compute RGB rendition of optical image
        opticalRGBImage = oiGet(oi, 'rgb image');

        %% Retrieve the full OTF
        optics = oiGet(oi, 'optics');
        OTF    = fftshift(abs(opticsGet(optics,'otf data')));

        %% Retrieve the wavelength axis
        OTFwavelengths = opticsGet(optics,'otf wave');

        %% Retrieve the spatial frequency support. This is in cycles/micron
        OTFsupport = opticsGet(optics,'otf support', 'um');

        otf_sfXInCyclesPerMicron = OTFsupport{1};
        otf_sfYInCyclesPerMicron = OTFsupport{2};

        %% Convert to cycles/deg.
        % In human retina, 1 deg of visual angle is about 288 microns
        micronsPerDegree = 288;
        otf_sfX = otf_sfXInCyclesPerMicron * micronsPerDegree;
        otf_sfY = otf_sfYInCyclesPerMicron * micronsPerDegree;

        %% Get the 2D slice at 550 nm
        [~,waveIndex]   = min(abs(OTFwavelengths - 550));
        OTF550 = squeeze(OTF(:,:,waveIndex));

        %% Get a 2D slice through origin
        [~, sfIndex] = min(abs(otf_sfY - 0));
        OTFslice = squeeze(OTF550(sfIndex,:));

        %% Generate plots
        plotWidth = 0.3;
        plotHeight = 0.85/numel(pupilDiametersInMillimeters);
        margin     = 0.021;

        %% Plot the 2D OTF
        %subplot(numel(pupilDiametersInMillimeters),3, (pupilSizeIndex-1)*3+1);
        subplot('Position', [0.03 1+margin/2-pupilSizeIndex*(plotHeight+margin) plotWidth plotHeight]);
        imagesc(otf_sfX, otf_sfY, OTF550);
        axis 'image'
        axis 'xy'
        if (pupilSizeIndex == numel(pupilDiametersInMillimeters))
           xlabel('cycles/deg');
        else
           xlabel(''); 
        end
        set(gca, 'XLim', [-50 50], 'YLim', [-50 50]);
        colormap(gray(256));

        %% Plot the 1D OTF slice
        %subplot(numel(pupilDiametersInMillimeters),3, (pupilSizeIndex-1)*3+2);
        subplot('Position', [0.06+plotWidth 1+margin/2-pupilSizeIndex*(plotHeight+margin) plotWidth plotHeight]);
        indices = find(otf_sfX >= 0);
        OTFsfX = otf_sfX(indices)+0.1;
        plot(OTFsfX, OTFslice(indices), 'rs-');
        if (pupilSizeIndex == numel(pupilDiametersInMillimeters))
           xlabel('cycles/deg');
        else
           xlabel(''); 
        end
        text(0.12, 0.005, sprintf('Pupil:%2.1f mm', pupilDiametersInMillimeters(pupilSizeIndex)), 'FontSize', 12, 'FontWeight', 'bold');
        set(gca, 'XLim', [0.1 100], 'YLim', [0.001 1]);
        set(gca, 'XScale', 'log', 'YScale', 'log', 'XTick', [0.1 1 2 5 10 20 50 100], 'YTick', [0.001 0.002 0.005 0.01 0.02 0.05 0.10 0.20 0.50 1.0]);
        set(gca, 'XTickLabel', [0.1 1 2 5 10 20 50 100], 'YTickLabel',  [0.001 0.002 0.005 0.01 0.02 0.05 0.10 0.20 0.50 1.0]);
        box on;
        grid on;
        colormap(gray(256));

        %% Plot the RGB rendered optical image
        %subplot(numel(pupilDiametersInMillimeters),3, (pupilSizeIndex-1)*3+3);
        subplot('Position', [0.09+2*plotWidth 1+margin/2-pupilSizeIndex*(plotHeight+margin) plotWidth plotHeight]);
        imshow(opticalRGBImage);
        axis 'image'

        drawnow; 
    end
end

isettools and isettools/data

I think that the isettools directory should be renamed 'code' and that 'isettools/data' should come out of 'isettools' and live on its own at the top level.

Then we will have code, data, scripts, tutorials, and validation/scripts all of which can contain parallel subdirectory structure. How cool is that.

This proposed change requires fixing up some functions that define where stuff lives in the isetbio directory. Brian indicated he is the one who can make these changes most efficiently, so we'll leave this in his lap for a rainy day activity.

ieLUTLinear behavior, not well defined in all cases of interest.

I read through ieLUTLinear. I wanted to make sure that expanding the columns of the inverse gamma table wouldn't make it behave badly. That part looks fine: the routine already expects three columns, and its first move is simply to expand a one-column inverse gamma table to three columns if one is passed. So that part is OK.

But I am quite worried about how it handles the dimensionality of the input RGB values to be linearized. It seems to expect these to be N by M by 3, and to convert each plane by the corresponding column of the inverse gamma table. When the input RGB is one-dimensional or two-dimensional, however, what it does may be unexpected. And there are no arg checks on the dimension of RGB, and the help text does not specify the form of the expected input.

This is not completely idle, since t_colorMatching and t_colorSpectrum exercise both of the worrisome cases. t_colorMatching passes a one-dimensional array to be linearized, and t_colorSpectrum passes an N by 3 array (with RGB across the columns). I think from my read of the code that in both these cases the first column of the inverse table will be used for the inversion. This might be good behavior for the one-dimensional case, but seems wrong for the N by 3 case. [For the one-d case, the other obvious option, which I slightly prefer, is to invert the passed single number three times so that the answer comes back as RGB, not as a single vector.]

I think this needs attention, but I don't want to touch this routine since a lot of other stuff may depend on it.
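If someone does take this on, the disambiguation might look something like the sketch below (the behavior choices here are suggestions, not the current implementation):

```matlab
% Sketch of input disambiguation for ieLUTLinear (suggested behavior,
% not the current implementation).
if ndims(RGB) == 3 && size(RGB,3) == 3
    % N x M x 3 image: invert each plane via its own gamma column (current case)
elseif isvector(RGB)
    % one-dimensional input: invert through all three columns, return N x 3
elseif ismatrix(RGB) && size(RGB,2) == 3
    % N x 3 calibration-style input: column k inverted via gamma column k
else
    error('ieLUTLinear: unexpected input dimensions %s', mat2str(size(RGB)));
end
```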

validation issue

validateAll works fine.

This one has some issue. I tried to debug, but then something else came up.

Thanks, B

validate_PTB_ISETBIO_Irradiance
Error using UnitTest/addProbe (line 11)
Argument 'onErrorReactBy' did not match any valid parameter of the parser.

Error in validate_PTB_ISETBIO_Irradiance (line 6)
unitTestOBJ.addProbe(...

Directory names in validation folder

I think if we have a validation folder, then the names validationscripts and validationdata should be shortened to just validation/scripts and validation/data. The additional validation is taken care of because the directories are in the validation folder.

I am sorry this will make more unit testing work for you. Even so, it seems validation/validationscripts is, well, ...

Use vcGetROIData in sceneGet and oiGet.

There is a vcGetROIData function. The roiXXX gets in scene and oi should probably flow through that function, if they don't already. We should check. When they look right, we can close this thread.

[This is, as far as I can tell, the only hanging part of issue #15. Reposting just it here after closing #15.]

ieLUTInvert second argument

HJ wrote as part of a previous issue:

"We may want to change the second parameter in ieLUTInvert to number of sampling steps or a vector of sampling positions instead of 'resolution'. Let me know which one is more intuitive or preferable."

A) I agree that number of sampling steps is more intuitive than the current factor argument. But whether this should be changed or not depends on how much other code relies on the current convention -- the arg would have to be changed everywhere it is currently used and that may be in many places.

B) Once we decide on the right argument, we should allow it to be passed to displayGet(d,'inverse table').
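Once the argument convention is settled, the plumbing might look like this sketch (the extra displayGet argument is a proposal, not current behavior, and nSteps is a hypothetical name):

```matlab
% Sketch of the proposed interface (argument semantics to be decided).
nSteps = 2048;                                      % proposed: number of sampling steps
invTable = displayGet(d, 'inverse table', nSteps);  % proposed extra argument
% internally, displayGet would forward the argument:
% invTable = ieLUTInvert(gammaTable, nSteps);
```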

Integrate Geisler model to ISETBIO

The code runs independently. We should be able to make some comparisons, though, and try to understand how that work would fit with what we are doing.

Extracting OTF data from optical image objects

Currently, to get the OTF data associated with an optical image object, we must call plotOI, which also plots the OTF in addition to returning the data. I believe there is no other way, but I could be wrong. If this is indeed the case, we should probably add oiGet(oi, 'otf data'), which will compute the OTF function and return it without any plotting. Objections, suggestions?
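For context, one accessor path does go through opticsGet (as used in the testOTF listing earlier); the proposal would add a parallel oiGet entry:

```matlab
% Current route to the OTF data (as in testOTF above):
optics = oiGet(oi, 'optics');
OTF = opticsGet(optics, 'otf data');

% Proposed, not yet implemented: compute and return without plotting
% OTF = oiGet(oi, 'otf data');
```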

v_Cones and related

v_Cones (in validation/Cones) is now updated into the UnitTest schema.

  1. Documentation for coneCreate/Set/Get (or somewhere) should let the user know what model of the cones you get when you ask for 'human'. I am not entirely sure where this information should best go, but I couldn't figure it out. The parameters, at least some of them, are just entered as numbers in coneCreate. And I am not sure where to find out what is stored inside the ie spectrum 'coneAbsorbance' which is read in by the routine.

  2. One approach to fixing this would be to take advantage of the PTB routines, which have well-defined parameter sets that you can access with well-defined names. See DefaultPhotoreceptors and FillInPhotoreceptors.

  3. The v_Cones script now runs. Would be good to explicitly validate the answers against the PTB computations.
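Point 3 might be done along the following lines, using the PTB routines named above. The coneGet string, the DefaultPhotoreceptors option, the field layout, and the tolerance are all assumptions that would need checking against the real APIs:

```matlab
% Sketch of validating isetbio cone data against PTB (details assumed).
cone = coneCreate('human');
isetbioAbsorbance = coneGet(cone, 'absorbance');       % assumed get string
photoreceptors = DefaultPhotoreceptors('CIE2Deg');     % 'CIE2Deg' is an assumed option
photoreceptors = FillInPhotoreceptors(photoreceptors);
ptbAbsorbance = photoreceptors.absorbance;             % assumed field layout
% after interpolating both to a common wavelength support:
% UnitTest.assertIsZero(max(abs(isetbioAbsorbance(:) - ptbAbsorbance(:))), ...
%     'isetbio and PTB cone absorbance agree', 1e-6);
```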
