
isetbio's Introduction

The Image Systems Engineering Toolbox for Biology - ISETBio - is a Matlab toolbox for calculating the properties of the front end of the visual system. The toolbox begins with a description of the scene radiance, how the radiance is transformed by the optics into the retinal image, the capture of light by the photopigments, and the photocurrent responses in the receptors. We are actively working on modeling how the photoreceptor signals are converted into retinal ganglion cell responses.

This repository includes a WIKI that describes the software, along with many examples of how to perform computations that calculate the visual encoding of light in the eye. The WIKI also describes tools to quantify, view, and analyze the information contained at different neural stages.

As of May 27, 2024, to run ISETBio you must download ISETCam and have it on your Matlab path. See installing ISETCam.

History

  • May 29, 2024 - See changes to ISETCam in the ReadMe for that repository. The validation routines for ISETBio have been moved into a new ISETValidations repository.

Ancient history

The ISETBIO code includes a portion of the Image Systems Engineering Toolbox (ISET) that is sold by Imageval Consulting, LLC. That code is designed to help industrial partners design novel image sensors. The ISETBIO portion of the ISET code is freely distributed for use in modeling image formation in biological systems.

ISETBIO also includes the WavefrontOptics code developed by David Brainard, Heidi Hofer and Brian Wandell. That code implements methods for taking adaptive optics data from wavefront sensors and calculating the optical blur as a function of wavelength for model human eyes. The toolbox relies on data collected by Thibos and colleagues. We also gratefully acknowledge important contributions from Jon Winawer.

isetbio's People

Contributors

benjamin-heasly, da5nsy, davidbrainard, elinekupers, fh862, fmrieke, gholder, hjiang36, jamesgolden1, jenmaxwell, jwinawer, npcottaris, render-toolbox, tlian7, wandell, zhenglyufelix


isetbio's Issues

Sample times - potential eye movement and cone adaptation incompatibilities

There are two places where we use discrete sample times that are shorter than the exposure time. These are

  1. sensorGet(sensor, 'sample time interval') - This is used in cone adaptation
  2. sensorGet(sensor, 'em sample time') - This controls the sample times of eye-movement updates.

Do we want to have both available, or should there really only be one sample time that must be smaller than the exposure time?

As currently used, we set both the em sample time and the sensor sample time to 1 ms, so no problems arise. But we could have problems in the future if we sample eye movements and adaptation at different rates; the current code will not support two different sampling times. For now, we could require them to be the same and allow them to co-exist. But for both to be present and meaningful, we need to write further routines that permit temporal interpolation and handle the cases in which the sample times differ between eye movements and adaptation.
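A minimal consistency guard might look like this (a sketch; it assumes the two gets listed above plus 'exp time' for the exposure duration):

dtAdapt = sensorGet(sensor, 'sample time interval');   % cone adaptation sampling
dtEM    = sensorGet(sensor, 'em sample time');         % eye-movement sampling
expTime = sensorGet(sensor, 'exp time');               % exposure duration
assert(dtAdapt == dtEM, 'Adaptation and eye-movement sample times must match');
assert(dtEM <= expTime, 'Sample times must not exceed the exposure time');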

This issue needs some discussion, testing, and so forth.

HJ/BW

Display object questions

A) There is a comment in displaySet that the spd is the "average not peak". I'm guessing this refers to the temporal average of the spd, but the comment is sufficiently terse that I'm not sure. It would be worth clarifying this in the code comment.

B) As far as we can tell, there is no way to get the number of pixels in a display out of the ISETBIO display object. Is this correct? Would that be useful information to store? Currently, ppi is stored (and assumed to be the same horizontally and vertically) but not the display size itself.

C) [This part is not actually a question, just a comment.] The design of the Display object appears to treat the black spd (which I tend to call the ambient spd) as what you get when you use the gamma table to find the output for RGB input [0 0 0]. I think this is a somewhat limiting choice. I have often (well, sometimes) had cases where there is enough light reflected from the faceplate of a display that the ambient spectral power distribution is not a linear combination of the display primaries. The way I handled this in the PTB calibration code is that there is an ambient spd stored explicitly as such, and the gamma functions by convention represent the incremental light produced by non-zero RGB values. The gamma table entries for RGB = [0 0 0] are then by definition [0 0 0]. The code that finds display settings always subtracts the ambient light (or tristimulus values) from the desired light, and then uses the gamma table to find the appropriate incremental settings. This approach has worked well for me, so I mention it here. It is not important for me that the ISETBIO approach be changed; I'm just mentioning it in case it is of interest. [This difference does mean that it is a bit hard to go back and forth between PTB and ISETBIO calibration objects and code for displays with non-zero ambient, but so be it. And note that if you use PTB routines to fit gamma functions, they typically enforce the 0-in, 0-out convention in the functional form of the fitting.]
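To make the convention concrete, here is a minimal sketch (the variable names are made up, and invertGamma stands in for whatever inverse gamma lookup is used): subtract the ambient spd first, then use the gamma table to map the desired incremental light to settings.

desiredIncr = targetSpd - ambientSpd;               % incremental light to produce
linearRGB   = primarySpds \ desiredIncr;            % linear weights on the primaries
settingsRGB = invertGamma(gammaTable, linearRGB);   % hypothetical inverse gamma lookup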

Units

I've been playing a little with some of the isetbio tutorials (and trying to keep up with the many emails). One general issue I wanted to bring up is units. How much work would it be to specify units in the relevant structures/objects? For example, there could be a units field associated with every data field. I think in the long run this could be very helpful as other users start to use things.
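One possible shape for this, purely as illustration (the field names here are made up):

scene.data.photons      = photons;                  % the data field
scene.data.photonsUnits = 'quanta/(s sr nm m^2)';   % its companion units field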

Tutorials needed

I spent some time writing documentation on the wiki this morning. I think someone would now have a fighting chance to know what to do to get started. I'm also sure more tuning would help.

We have tutorials called t_sceneIntroduction and t_oiIntroduction. These, and a tutorial that needs to be written (t_sensorIntroduction), seem like the obvious place for someone to start. As the mood strikes, putting some effort into making sure the first two are really clean (they may be, I didn't look) and writing the third seems like a good idea.

We can then think about t_pixelIntroduction, t_wavefrontIntroduction, t_displayIntroduction, etc. I think a good goal is one introductory tutorial for each data structure that has its own gets and sets.

Thinking about architecture

Following up on our Thursday conversation of last week, I had a go at creating some wiki pages that describe our current "objects" and how they are related. Start here.

https://github.com/isetbio/isetbio/wiki/ISETBIO-Architecture-Vision

I probably forgot some objects, which I invite others to add. Each object then has its own page with some comments about what it is supposed to do and some questions/action items if I had any to list.

This is an attempt at creating documentation for how we might make some gradual modifications to rationalize bits of the current code, without going completely overboard and starting from scratch. Ideally, this would become some sort of actual documentation as we proceed, but for the moment it is more conceptual.

If you have thoughts, ideas, questions, perhaps try adding them to the wiki pages I started.

coneTemp merge changed sensor image sizes?

Using the code before the merge commit, my optical images generated sensor images of size 164x164. Running the same script with the code after the merge results in sensor images of size 123x123.

I'm wondering whether this was due to some fix introduced in coneTemp or is an unintended bug.

photon rate in sensor

The photon rate in sensor appears to be discretized in a way I don't completely understand. Thus if I do something like

stimulus(1:100) = 0;
stimulus(1) = 1;
sensor = sensorSet(sensor, 'photon rate', stimulus);
foo = sensorGet(sensor, 'photon rate');

foo(1) is 0. I might have it wrong, but it seems like the rate is getting rounded so that we have discrete photon counts in each time window - but for the cone model the time windows will be 1 ms or less (for the DEQs to run stably), and a window that short can certainly contain only a few photons. And that will be worse for rods!

Fred

Human Eye Movement and Cone Mosaic Share RNG Seed

It seems that the functions emGenSequence and humanConeMosaic both make use of the same RNG seed field, 'rSeed', in the sensor object. emGenSequence also sets the 'rSeed' field in the sensor, while humanConeMosaic does not. As a result, coneAbsorptions always returns the same volt/photon data for a sensor that has been passed through emGenSequence. This is because coneAbsorptions calls sensorHumanResize, which calls sensorSet('size'), which in turn calls humanConeMosaic; humanConeMosaic uses the RNG seed saved in the sensor. The next call to noiseShot therefore always returns the same value for a given sensor, because randn always returns the same matrix. This makes it impossible to get different noise on the same image via coneAbsorptions. Additionally, the sensor object adds the same randn matrix to different optical images in the noiseShot function.

This does not occur in sensor objects without eye movement, because there the RNG seed is not stored in the sensor object, so randn returns different values. I was wondering if this behavior is intended, and whether there is any intended relationship between eye movement and the cone mosaic.
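One workaround sketch (it assumes 'rSeed' can be written through sensorSet, which may not be the case): force a fresh seed before each draw so that repeated calls produce independent noise.

sensor = sensorSet(sensor, 'rSeed', sum(100*clock));   % assumption: 'rSeed' is settable
sensor = coneAbsorptions(sensor, oi);                  % should now draw fresh shot noise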

What type of classifier to use for computational observers

If James and I are reading svmClassify.m correctly, it uses an svm with a radial basis kernel by default. I'm not sure this is what we want as the default. Maybe using linear classifiers as a point of departure would be wise?
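For instance, something along these lines (a sketch that assumes the Statistics Toolbox is available; features and labels are placeholders for the sensor response data):

svm = fitcsvm(features, labels, 'KernelFunction', 'linear', 'Standardize', true);
cv  = crossval(svm, 'KFold', 10);         % 10-fold cross-validation
pctCorrect = 100 * (1 - kfoldLoss(cv));   % percent correct classification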

Getting data to the right place

I think we should move isettools/data -> data. This requires fixing up the functions that know how to load that data so they look in the correct place. Then we can make my dreams come true by renaming "isettools" to "code" without breaking anything.

When we do this, it would be a good moment to think about how we want to handle big data that is stored elsewhere (I guess, on the Stanford server). Probably we want some preferences that say where to look for data, and some search order. The default should be to look in the data folder with the distribution.

I don't think every last detail needs to be worked out before moving the data dir, but thinking about the long-range plan while looking through this, and getting the data read/write routines well factored as a starting point, seems clever.
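A preference scheme along the lines described might look like this (a sketch; the preference group, isetbioRootPath, and the rdataFetch fallback are all assumptions, with benchHDR.mat borrowed from the remote-data timing issue below):

if ~ispref('isetbio', 'dataDir')
    setpref('isetbio', 'dataDir', fullfile(isetbioRootPath, 'data'));
end
fname = fullfile(getpref('isetbio', 'dataDir'), 'scene', 'benchHDR.mat');
if ~exist(fname, 'file')
    fname = rdataFetch('scene/benchHDR.mat');   % hypothetical remote fallback
end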

How does eye movement work in coneAbsorptions.m?

I've read through the coneAbsorptions code. It seems like the sensor is padded to account for the furthest possible location that the eye movement will bring it. Then, the LMS cone absorptions are calculated for all the cones in this enlarged cone mosaic. Afterwards, the original cone mosaic and the eye movement positions are used to get isomerizations due to eye movement.

Scaling the tremor amplitude by a large factor causes the movie to appear different from what I expected.

Here is some code:

fov = 2;
params.freq = 6; params.contrast = 1;
params.ph  = 0;  params.ang = 0;
params.row = 256; params.col = 256;
params.GaborFlag = 0.2; % standard deviation of the Gaussian window

% Set up scene and oi
scene = sceneCreate('harmonic', params);
scene = sceneSet(scene, 'h fov', fov);
oi = oiCreate('human');
oi = oiCompute(scene, oi);

% Create the eye movement object
em = emCreate;

sensor = sensorCreate('human');
sensor = sensorSetSizeToFOV(sensor, fov, scene, oi);
sensor = sensorSet(sensor, 'eyemove', em);
sensor = sensorSet(sensor, 'positions', zeros(500, 2));  % This is for 500 ms
sensor = sensorSet(sensor, 'exp time', 0.001);
sensor = sensorSet(sensor, 'time interval', 0.001);

% Magnify the amplitude to get a better visualization later
amp = emGet(em,'tremor amplitude');
em  = emSet(em,'tremor amplitude',100*amp);

sensor = sensorSet(sensor,'eye movement',em);
sensor = emGenSequence(sensor);

sensor = coneAbsorptions(sensor, oi);
photons = sensorGet(sensor, 'photons');
implay(photons/max(photons(:)));

The resulting movie looks like the image is floating around the window. Looking at the coneAbsorptions.m code, I would have thought it should look like the eye panning around the scene.
Where am I misunderstanding the code?

Xiaomao

plotFoo should be fooPlot

I think it was bad of me to have plotFoo instead of fooPlot. Just like there is a sensorGet/Set/Create there should be sensorPlot, and so forth for the other main structures. Live and learn.

I have been meaning to simply convert all the plotFoo to fooPlot in ISET. I think we should do it here, too. The path is:

  1. Copy the existing routines to the new names, and put them in the right directory.
  2. Change the old routines to call the new routines.
  3. Add a warning in the old routines that the wrong routine was called and that the calling code should be fixed.
  4. Wait until the problem is solved.
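Steps 2 and 3 together could look like this (a sketch, using the sensorPlot rename as the example):

function fig = plotSensor(varargin)
% Deprecated wrapper: forward to the new name and nag the caller.
warning('plotSensor is deprecated; call sensorPlot instead.');
fig = sensorPlot(varargin{:});
end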

coneAbsorptions, noise flag

We noticed that coneAbsorptions ignored the setting of the sensor's noise flag. We modified it so that it does not add noise when the noise flag is 0.

If this is fine, you can close this issue. If it's not, then we need to find another way to get noise-free cone responses. David and Xiaomao
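The modification amounts to a guard like this (a sketch; the noiseShot call pattern follows its description elsewhere in these issues):

if sensorGet(sensor, 'noise flag') ~= 0
    [noisyImage, theNoise] = noiseShot(sensor);        % add shot noise
    sensor = sensorSet(sensor, 'volts', noisyImage);
end
% When the noise flag is 0, the noise-free absorptions are left untouched.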

plotOI('ls wavelength') bug - around line 826

See the comment in the code around line 826

    % The spaceSamp and nSamp parameters are not clearly enough
    % defined.  The reason we care is because the code is broken when
    % spaceSamp is not 40.  No idea why.  Let's come back and clean it
    % up.

Where should full validation data go?

The full validation dataset is currently about 150 MB. It will probably grow.

Nicolas and I have a copy on dropbox. That may not be the optimal place, although it would be easy to share with a few people.

We could make a separate gitHub repository that has the validation data. This would be OK for now and easy to point people to and maintain. If you put a gun to my head, that is what I would choose at this instant.

There might be a better idea.

Branches?

There appear to be a number of branches of isetbio now -- one is called coneTemp and one is called spectrum.

I think we decided a while back that branches should not exist for too long, just so we don't get really confused. But in any case, I'm simply wondering what these branches are and whether I need to think about them.

unit test ...

Asking for help from the pros, here. I am up to date on the UnitTestToolbox and on the master branch of isetbio.

[ 1] 'v_Colorimetry'
Internal validation : PASSED
Run-time status : no exception raised
Reference to non-existent field 'generateGroundTruthDataIfNotFound'.

Error in UnitTest/validate>doFastValidation (line 388)
if (projectSpecificPreferences.generateGroundTruthDataIfNotFound)

Error in UnitTest/validate (line 249)
abortValidationSession = doFastValidation(obj, fastLocalGroundTruthHistoryDataFile, validationData,
projectSpecificPreferences, smallScriptName);

Error in UnitTest.runValidationSession>validateFast (line 80)
abortValidationSession = UnitTestOBJ.validate(vScriptsList);

Error in UnitTest.runValidationSession (line 27)
validateFast(vScriptsList);

Error in validateFastAll (line 40)
UnitTest.runValidationSession(vScriptsList, 'FAST');

Field consistency

I fear we need a really careful review of object precedence and field consistency.

I discovered that sensorSet('cone type') didn't update the cfa.pattern field, so that two places where the same information could be stored in the sensor structure could get out of whack with each other. This caused some surprising results.

I updated sensorSet('cone type') so that it also updates the cfa pattern. I'm viewing this as a quick patch. We will run into the same problem if one sets the cfa or cfa pattern -- I didn't fix it in that case because there won't always be a cone type field, and I am charging on with other things.

My sense is that we shouldn't really have both a cone type and cfa.pattern field in the sensor struct, and that setting/getting cone type should just operate as synonyms for setting/getting the cfa pattern. But there may be reasons the second field crept in.

More generally, as structures get set as part of other structures more and more (cone, pixel, wvf, etc.), who is in charge of redundant parameters probably needs to be carefully specified and implemented.

Deprecated old coneAdapt code

I moved isettools/coneAdapt.m and isettools/coneAdaptation.m to isettools/deprecated, and tutorials/coneAdaptTutorial.m to tutorials/deprecated. This is because those contain the old code that we decided we no longer want. Just letting everyone know where it went, and will now close this as an issue.

If doing this creates an issue, someone else can reopen it and comment.

Buglet in noiseShot

In the noiseShot function, where it calculates the Poisson sampling for means below the poissonCriterion, theNoise and noisyImage are set to the same value, vn. It appears that noisyImage should be vn, and theNoise should instead be vn - electronImage. Can this be confirmed?

Another possible issue is that the noisyImage is not guarded against containing negative values. Should this be a concern? I've added a few lines that check whether theNoise + electronImage < 0, and update the values in theNoise appropriately.

The help comments state that the function will sample using a Poisson distribution when the number of electrons is less than 20. However, in the function this threshold is set to 15. I believe the numbers should match but am not sure which one is preferred. I've also added a brief output-parameter section to the header to highlight that noisyImage is in units of volts whereas theNoise is in units of electrons.
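Putting the first two points together, the fix could look like this (a sketch working in electrons, with the volts conversion omitted; idx marks the below-criterion entries):

idx = electronImage < poissonCriterion;      % small-mean entries
vn  = iePoisson(electronImage(idx));         % Poisson draws at those means
noisyImage(idx) = vn;                        % the draw itself is the noisy value
theNoise(idx)   = vn - electronImage(idx);   % noise = draw minus mean
% In the Gaussian branch, guard against negative noisy values:
theNoise = max(theNoise, -electronImage);    % so noise + mean is never negative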

Validation documentation

The setIsetbioUnitTestPreferencesTemplate discussion and usage need some tweaks. We found it, ran it, and it worked fine for FastAll by default, but not for FullAll (of course). We will think about how to help people like us. We made some edits to the directions at the top.

This doesn't seem like a hugely problematic issue right now because we are the only ones running the validation code. Still ...

Also, there should be a version of this function in the UnitTestToolbox directory that guides setup for any project. Very useful tool!

oi validation now failing, looks like a change in row/col convention in OTF fields

Two validation scripts involving the optical image structure are now behaving differently than they once did, so they fail their comparison against the stored ground truth data.

It looks like (see validation dump below) this is because certain fields (e.g. OTF.fx, OTF.fy, OTF.wave) have been transposed relative to the way they once were.

Was this an intentional change? I can regenerate the ground truth data for the relevant validation scripts, but since the idea of the validation scripts is to catch unintended changes I thought I should check first about what is going on.

DB


Ground truth file : /Users1/Shared/Dropbox/ISETBIOFullValidationData/optics/v_oi_FullGroundTruthDataHistory.mat
Full validation : FAILED against ground truth data of 05-Feb-2015 14:12:32.
> Ground truth info : wal-isc-video012.isc-video.upenn.edu / MACI64, MATLAB 8.4.0.150421 (R2014b) by 'nicolas'
> Local host info : ray.PSYCH.UPENN.EDU / MACI64, MATLAB 8.4.0.150421 (R2014b) by 'dhb'
[data mismatch 1] : 'groundTruthData.humanOI.optics.OTF.fx' is a [ 1 60 ] matrix whereas 'validationData.humanOI.optics.OTF.fx' is a [60 1 ] matrix.
[data mismatch 2] : 'groundTruthData.humanOI.optics.OTF.fy' is a [ 1 60 ] matrix whereas 'validationData.humanOI.optics.OTF.fy' is a [60 1 ] matrix.
[data mismatch 3] : 'groundTruthData.humanOI.optics.OTF.wave' is a [ 1 31 ] matrix whereas 'validationData.humanOI.optics.OTF.wave' is a [31 1 ] matrix.
[data mismatch 4] : 'groundTruthData.humanOIFromScene.optics.OTF.fx' is a [ 1 60 ] matrix whereas 'validationData.humanOIFromScene.optics.OTF.fx' is a [60 1 ] matrix.
[data mismatch 5] : 'groundTruthData.humanOIFromScene.optics.OTF.fy' is a [ 1 60 ] matrix whereas 'validationData.humanOIFromScene.optics.OTF.fy' is a [60 1 ] matrix.
[data mismatch 6] : 'groundTruthData.humanOIFromScene.optics.OTF.wave' is a [ 1 31 ] matrix whereas 'validationData.humanOIFromScene.optics.OTF.wave' is a [31 1 ] matrix.

wave -- is it a row or column vector (or both)

This was raised at the end of #13, which was mainly about validation scripts failing.

I closed that issue and am opening this one as a reminder that we haven't dealt with this issue (which is what caused the validations to fail).

Cone model

I just added some updates to the cone model. The place to start is with s_coneModelValidate, which should run through. I added some data - which for the time being I placed in a folder within scripts. That could get moved to the Stanford data server as a test (it's very small, but a way to test that approach).

Surprising (to me) side effects of displaying scenes

Making a call to show a scene (e.g., vcAddAndSelectObject(scene); sceneWindow;) has the unexpected side effect of changing the data in the scene. Pasted below is a short function that illustrates the behavior.

This is problematic not just because it is surprising. Our validation scripts control whether things are displayed, to run faster under some circumstances but allow useful debugging plots under others. This side effect causes validation to fail if the data are generated with plots turned on and then checked with them turned off, or vice versa.

The problem seems to be related to a truncation of wavelength outside of 400-700 nm that happens in one case and not the other. There must also be some global variables hidden somewhere, presumably as part of what happens in ieInit.


function displaySideEffects
%
% Script to demonstrate side effects of displaying
% a scene in a window.

%% Init
close all; ieInit;

%% Create isetbio display
displayToTest = 'LCD-Apple';
d = displayCreate(displayToTest);
gammaTable = displayGet(d, 'gamma table');
if (size(gammaTable,2) ~= 3)
    error('Cannot deal with a display that has other than 3 primaries for this test.');
end
nInputLevels = size(gammaTable,1);

% Create a scene using the display
RGBToTest = [0.3 0.73 0.42]';
theRGBImage = ones(20,20,3);
for i1 = 1:3
    theRGBImage(:,:,i1) = round((nInputLevels-1)*RGBToTest(i1));
end
sceneDegrees = 10; % need large field
scene = sceneFromFile(theRGBImage,'rgb',[],d);
scene = sceneSet(scene,'fov', sceneDegrees);
sceneSize = sceneGet(scene,'size');

%% Compute optical image, for delta function optics.
oi = oiCreate('human');
optics = oiGet(oi,'optics');
optics = opticsSet(optics,'off axis method','skip');
optics = opticsSet(optics,'otf method','skip otf');
oi = oiSet(oi,'optics',optics);
oi = oiCompute(scene,oi);
oi = oiSet(oi,'fov',sceneDegrees);

%% Extract irradiance
roiPixels = 10;
rect = [sceneSize(2)/2,sceneSize(1)/2,roiPixels,roiPixels];
roiRoiLocs = ieRoi2Locs(rect);
foo = oiGet(oi,'roi mean photons', roiRoiLocs);

%% Display scene, that's all. Should be harmless
%
% But commenting this line in and out affects the
% irradiance contained in the optical image.
vcAddAndSelectObject(scene); sceneWindow;

%% Do the oi thing again
oi = oiCreate('human');
optics = oiGet(oi,'optics');
optics = opticsSet(optics,'off axis method','skip');
optics = opticsSet(optics,'otf method','skip otf');
oi = oiSet(oi,'optics',optics);
oi = oiCompute(scene,oi);
oi = oiSet(oi,'fov',sceneDegrees);

%% Get the irradiance from the new one. It's different.
%
% Seems to have to do with wavelength truncation
fee = oiGet(oi,'roi mean photons', roiRoiLocs);
faa = foo(:)-fee(:);
max(abs(faa(:)))
figure; clf; hold on
plot(foo,'r');
plot(fee,'g');

end

Validation script v_DIsplayLUTinversion.m

This script requires the Matlab Optimization Toolbox. Maybe that's OK. But if you don't have it, the validation script

  • tries to run PTB FitGammaPow, which relies on the Optimization Toolbox
  • throws an error

Not sure what (if anything) should be done about this. At this point it just means that the validation comparison between PTB and ISETBIO on LUT inversion doesn't run. But if you say it is fine on a machine that has the Optimization Toolbox, then that's good enough for me.

In which case, we could remove this validation script.

Tweak to ieInit

I put a "clear global" into ieInit. I think this makes things clear out a bit better. Not 100% sure, but it hasn't done any apparent harm to me yet.

I also made s_ieInit simply call ieInit, since we are trying to make s_ieInit go away. But I left everything that was there, just commented out.

Eye movement buglet

When we set the eye movement sample time to 50 msec (0.05 s) right after the call to eye movement create in t_eyemovement, it crashes. The error is thrown from the emGenSequence function.

Made new isetbio repository

This cleared out a lot of space. The master branch is what was the dev branch. The wiki and gh-pages also moved over. You will want to check out a clean copy of this, as well as the gh-pages and wiki repositories if you are using them.

Doing this cleared all history and issues. They are still available in isetbio_v0.1.

There are some image files referred to by the wiki that were not in gh-pages. These had to do with the coneAdapt documentation. I will email Haomiao with a list.

I pulled out a bunch of large data files. I will email Brian and Haomiao to see if we want any of them.

This should be good to go.

I am not sure what will happen to people we don't know about who were using the old isetbio repository and who try to check something out.

Remote data, load times

I brought v_rdata into the UnitTestToolbox world and also did some timing on remote loading. I compared three ways of loading a .mat file. One is using rdata. The second is reading it via a mounted version of Crimson, which I mounted using SSHFS/FUSE on Mac OS X. The third is reading a local copy from my desktop. The mounted disk appears to do some sort of local caching.

For a 10.6 MB file (scene/benchHDR.mat, variable scene):

  • SSHFS/FUSE, 2nd time and thereafter: 0.33 seconds
  • Local: 0.07 seconds
  • rdata: 1.7 seconds

I also tried a 583.3 MB file (slantedBar/slantedBarMultiLF80Diffract.mat):

  • SSHFS/FUSE, 1st time: 395 seconds
  • SSHFS/FUSE, 2nd time and thereafter: 6.5 seconds
  • Local: 6.4 seconds
  • rdata: 57 seconds

I think rdata is going to be too slow for us. The cached mounted disk might work. We could certainly set up some sort of sync program so that we keep a local copy of the Crimson disk and keep it up to date when we change things.

But I'm curious how these times look at Stanford, where there isn't as much internet between you and the server as there is for us.

sensorDemosaicCones

I've had a read through and done some testing on sensorDemosaicCones.

It was giving slightly different answers than the griddata version I wrote. I tracked this down to two places where x and y were being treated differently in the two pieces of code. I rewrote the code in sensorDemosaicCones to match what I had done previously, with fairly extensive comments and with the old code still there but commented out (it is two lines: the call to meshgrid and the call to ind2sub). You might think the two reversals in x and y would simply cancel each other, but this is not exactly true. I think the way I left it matches what the documentation says to do, but I am not 100% confident and could be convinced otherwise. Haomiao, perhaps you can take a careful look over this.

The other buglet is that the code was scaling the LMS values as would be needed to make the cone sensitivities have a peak of unity, which is not how the quantal efficiencies are scaled. This seems like the right idea, but I think the scale factors should be computed with the cone sensitivities in energy rather than quantal units. I added a conversion to energy units. Again commented, and again worth a careful read.

Finally, I added a second return argument, which is the demosaic'd isomerizations array. Sometimes one wants this rather than just sRGB. I did not pass this through any scaling or fixing for dichromats, as I think the raw form is most useful when this array is wanted.

I am still getting a difference between the sRGB images I create outside isetbio and the one created by this routine, which I have yet to track down. It is mysterious because in many other instances the two ways of doing it agree well. But I think this is as likely a bug in my test code as in sensorDemosaicCones.

s_humanLSF - error related to wave, I think

This is a wavelength interpolation error ... one of you probably knows about this already, and it may even be in the Issues somewhere. I added it here because I couldn't find the Issue immediately.

Also ... a number of the s_<> scripts are not working now.

Error in interp1 (line 191)
F = griddedInterpolant(X,V,method);

Error in oiSet (line 253)
p = interp1(oiGet(oi, 'wave')', p', val, 'linear*', 0);

Error in opticsSICompute (line 20)
oi = oiSet(oi, 'wave', sceneGet(scene, 'wave'));

Error in oiCompute (line 75)
oi = opticsSICompute(scene,oi);

Error in s_humanLSF (line 35)
oi = oiCompute(scene550,oi);

Help with unit test validation

Validation has stopped working for me.

I did re-run: setIsetbioUnitTestPreferencesTemplate

The error (which I have not tried to debug) reads

UnitTest will use preferences for the 'xxyyzzprojectname' project

A set of existing preferences for project 'xxyyzzprojectname' was NOT found.
Error using UnitTest.usePreferencesForProject>initializePrefs (line 32)

ProjectSpecificPreferences do not exist for project 'xxyyzzprojectname'. Did you run the
'setProjectSpecificUnitTestPreferences.m' script for 'xxyyzzprojectname'?

Error in UnitTest.usePreferencesForProject (line 21)
initializePrefs(initMode);

Error in validateFastAll (line 10)
UnitTest.usePreferencesForProject('xxyyzzprojectname', 'reset');

Here are a few variables ...

ib = getpref('isetbioValidation')

ans =

       projectSpecificPreferences: [1x1 struct]
           onRunTimeErrorBehavior: 'rethrowExceptionAndAbort'
                    generatePlots: 0
                  closeFigsOnInit: 1
                        verbosity: 'low'
                 numericTolerance: 1.1102e-13
              graphMismatchedData: 1
              compareStringFields: 0
                      projectName: 'isetbioValidation'
                validationRootDir: '/Users/wandell/Github/isetbio/validation'
             alternateFastDataDir: ''
             alternateFullDataDir: '/Users1/Shared/Dropbox/ISETBIOFullValidationData'
               clonedWikiLocation: '/Users/Shared/Matlab/Toolboxes/ISETBIO_Wiki/isetbio.wiki'
            clonedGhPagesLocation: '/Users/Shared/Matlab/Toolboxes/ISETBIO_GhPages/isetbio'
                    githubRepoURL: 'http://isetbio.github.io/isetbio'
generateGroundTruthDataIfNotFound: 1
                    listingScript: 'validateListAllValidationDirs'

ib.projectSpecificPreferences

ans =

                      projectName: 'isetbioValidation'
                validationRootDir: '/Users/wandell/Github/isetbio/validation'
             alternateFastDataDir: ''
             alternateFullDataDir: '/Users1/Shared/Dropbox/ISETBIOFullValidationData'
               clonedWikiLocation: '/Users/Shared/Matlab/Toolboxes/ISETBIO_Wiki/isetbio.wiki'
            clonedGhPagesLocation: '/Users/Shared/Matlab/Toolboxes/ISETBIO_GhPages/isetbio'
                    githubRepoURL: 'http://isetbio.github.io/isetbio'
generateGroundTruthDataIfNotFound: 1
                    listingScript: 'validateListAllValidationDirs'

Cone adapt mysteries

There was code in sensorGet that allows one to get adapted data. The help text pointed to it incorrectly, and once we called it correctly (using sensorGet('adapted data')) it crashed. That's because the code itself called sensorGet('adapted volts'), which doesn't exist. We changed that line to sensorGet('adaptation offset'), because the left-hand side had a variable named offset. But we aren't sure this is right. We will push this change up to gitHub once Xiaomao's permissions get sorted out (we should be able to deal with that).

Then we realized that we are very confused, because the cone adaptation tutorial itself runs. It calls a function named coneAdapt. We haven't yet traced through all of this, because we figure you guys can tell us what is known to be stale and where we should be looking.

Display Structure Change - Ambient Field Added

A new field called ambient has been added to the display structure. It stores the spd of the display when it is set to black, in units of energy, as a vector with the same length as the display wavelength samples (d.wave).

With this change, the elements of the first row of the gamma table are all zero, and displayGet(d, 'dark level') has been removed. To get the ambient spd, use displayGet(d, 'ambient spd').
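For example (a sketch, using the LCD-Apple display that appears elsewhere in these issues):

d = displayCreate('LCD-Apple');
ambient = displayGet(d, 'ambient spd');   % energy units, one entry per sample of d.wave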

  1. Do we need to change displayGet for 'white xyz', 'white spd', 'white xy', etc. to add the ambient to the white?
  2. We need to change the validation and PTB-ISETBIO conversion scripts.

I hope this does not break something. If you find any bugs, let me know.
Thanks

ValidateFullOne, ValidateFullAndPublishOne

I would like two more top-level validation scripts, as indicated in the issue title. These should each list all of the validations and ask the user which one to run.

This would be very useful when developing/debugging a new validation script, so that you could develop and publish without having to run all the scripts.

If it is hard to publish just one, I'd settle for ValidateFullOne. But sometimes it requires some fussing to get the published version of a script to look right, so it would be nice to be able to iterate on fixing without having to do them all.

What I imagine might be hard is rewriting the wiki page with just one script, especially if it wasn't there before. So I would also be happy if ValidateFullAndPublishOne said it couldn't publish unless the script had already been published at least once, but would update the published html version so that one could refine it.
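The core of ValidateFullOne could be as simple as this (a sketch; it leans on the listing script named in the validation preferences and the session runner that appears in the traces above, whose exact calling conventions may differ):

vScripts = validateListAllValidationDirs();   % the listing script from the prefs
for k = 1:numel(vScripts)
    fprintf('%2d: %s\n', k, vScripts{k});
end
choice = input('Which validation should I run? ');
UnitTest.runValidationSession(vScripts(choice), 'FULL');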

cone mosaic plotting updates

In response to David's and Brian's emails regarding duplicate cone plotting functions, I made a couple of additions to the existing conePlot.m and sensorConePlot.m to enhance their functionality.

  1. I added a fifth input parameter, whiteBackground, which is a boolean. If it is not passed, or if it is set to false, the generated plot has the old appearance (black background, with cones having a Gaussian profile). If whiteBackground is set to true, the generated plot has a white background and the cones look like disks.
  2. I added a fourth output parameter, coneMosaicImage. If it is not requested, both functions behave as they did before: they plot the generated image. If coneMosaicImage is requested, both functions return the generated RGB image without plotting it.
  3. I added a helper function, called conePlotHelper.m, whose purpose is to help generate plots of cone sub-mosaic images. This function expects as inputs a sensor struct, a desired number of cones to plot, and a desired cone size (aperture diameter in pixels), and it returns the variables expected as input arguments by conePlot.m.

doc conePlotHelper for more info.

If there are no objections, I will close this issue.

Nicolas

Get rid of legacy p-files

Brian should replace the p-files with m-files. There is nothing secret in there. It runs now, but it is bad form to keep the p-files around, and when we finish we should probably be able to deprecate ieNotDefined() as well.

Ideas for outersegment class: Thursday skype meeting

We have a draft model of code for the outersegment class that I will put into a branch this evening. David and I had a brief discussion about the design of the class, and I've listed some notes from that below. We thought this could serve as a starting point for the Skype meeting on Thursday.

a) Are we happy with the set/get methods that call through set/get functions, so that both, e.g., os.set and os = outersegSet(...) work? (See the sketch after this list.)

b) Constructor syntax:
   1) Should it do something by default with no arguments, like sensor, oi, and scene?
   2) What syntax for passing arguments to the constructor?
      - key/value pairs (always)
      - defined optional arguments in fixed order
   3) What parameters do we want to be able to pass? For sure: the calculation parameters.

c) Should the outersegment object store the sensor object, or should it extract the data it needs?
   1) How does this interact with handles or structs? If you store the sensor, what happens when the sensor is changed?

d) What is the structure and syntax for handling the different calculation options?
   1) Use a structure for each algorithm's parameters.
   2) Define a subclass for each version of the calculation.
   3) Define a computational object for each algorithm which encapsulates the parameters it needs and the algorithm but nothing else, and attach this to the outersegment object.
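For (a), the pattern could look roughly like this (a sketch; the names are placeholders, and the classdef and the standalone setter would live in separate files):

% outerSegment.m (sketch)
classdef outerSegment < handle
    properties
        noiseFlag = 1;
    end
    methods
        function obj = set(obj, param, val)
            % Delegate to the standalone setter, so that both
            % os.set(...) and os = outersegSet(os, ...) work.
            obj = outersegSet(obj, param, val);
        end
    end
end

% outersegSet.m (sketch)
function obj = outersegSet(obj, param, val)
switch lower(param)
    case 'noiseflag'
        obj.noiseFlag = val;
    otherwise
        error('Unknown parameter: %s', param);
end
end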

Let's make a function that classifies on subregions of the sensor - and related honks

For running the SVM classification we will want to pick out little bits of the larger sensor array. This is mainly because we can't efficiently run SVMs on a very big array (e.g., 500 x 500). For about 1 deg, we are OK.

So, we will use either a 'saliency' criterion or a 'random sample' criterion to pull out 1-deg regions and check the classification multiple times. If we never see a difference, say after 5 or so samples, we will declare that the two sensor responses are not discriminable.

So, we need a function that pulls out the regions in a systematic way; a sketch follows.
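Such a function might be little more than a crop of the cone absorptions (a sketch; the helper name and arguments are made up):

function patch = sensorExtractRegion(sensor, center, nPix)
% Crop an nPix x nPix patch of cone absorptions centered at center = [row, col].
photons = sensorGet(sensor, 'photons');
rows = center(1) - floor(nPix/2) + (1:nPix);
cols = center(2) - floor(nPix/2) + (1:nPix);
patch = photons(rows, cols, :);
end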

This also raises the question of whether we should be putting the classifiers in ISETBIO - the classification code is currently in computationaleyebrain - or whether we should make a parallel classification tool directory. And how we will oversee the demise of computationaleyebrain in favor of BL and BL.

iePoisson always returns same answer for a given lambda

The iePoisson function always returns the same answer when the iePoissrnd mex function is used. This is because the C++ file for the mex function does not set the seed for the number generator it uses. Nicolas and I addressed this by adding a third, optional parameter to iePoissrnd that allows a seed to be set in the C++ file.

Locally, I am setting the seed to rand * (an arbitrarily large prime number). This makes the behavior of iePoissrnd consistent with poissrnd from the Statistics Toolbox, where multiple calls to poissrnd with the same lambda return different answers. I want to make sure this is the intended behavior of iePoisson. Additionally, is there a need to add a flag in the parameters for iePoisson to dictate whether or not this random seed is set?

lms2xyz

The function lms2xyz appears to have a bug in handling matrices (XW format). For example, this produces an error:

cone = coneCreate('human');
xyz = lms2xyz(cone.absorbance);

The problem is that a transpose on lms is on the wrong line (line 24 rather than line 22).

Question about noiseShot and iePoisson

The header comments of noiseShot state that, when the mean is large, the function approximates the Poisson distribution with a normal distribution whose standard deviation is the square root of the mean. This is done to save computation time.

However, when iePoisson uses the MEX Poisson sampler, it is quite fast. In fact, I ran a few tests in which the MEX sampler was faster than what noiseShot currently does. I was therefore wondering whether there is still a need to approximate the Poisson distribution in noiseShot, as opposed to sampling entirely from the MEX Poisson sampler.
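A quick way to reproduce the comparison (a sketch; iePoisson's exact calling convention may differ):

lambda = 50 * ones(500);                              % a mean electron image
tic; vMex = iePoisson(lambda); tMex = toc;            % MEX Poisson sampler
tic; vApprox = lambda + sqrt(lambda) .* randn(size(lambda)); tApprox = toc;  % normal approximation
fprintf('MEX: %.3f s, normal approximation: %.3f s\n', tMex, tApprox);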

Sensor wavelength sampling

We hit a buglet: when you have sensor = sensorCreate('human'), it becomes a bit unclear how to change the wavelength sampling.

Doing a sensorSet of wavelength on the sensor object changes the wavelength sampling of that object. But there is also a cone object in there, which has its own wavelength sampling that isn't changed. This causes problems.

The question is, what is the desired behavior? Should the sensorSet of wavelength 'recurse' down and change everything? Or should this be made explicit, so that the user gets the cone object, changes its sampling, puts it back, and then changes the sensor wavelength sampling?
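The explicit route would look something like this (a sketch; the 'cone' parameter name and coneSet are assumptions):

cone   = sensorGet(sensor, 'cone');            % pull out the embedded cone object
cone   = coneSet(cone, 'wave', newWave);       % change its wavelength sampling
sensor = sensorSet(sensor, 'cone', cone);      % put it back
sensor = sensorSet(sensor, 'wave', newWave);   % then change the sensor sampling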

I think similar issues may arise with the wavefront optics stuff, where there is wavelength sampling in the wvf object that may differ from that in the oi object.

Thoughts?

Updates to coneAbsorptions and coneAdapt - possible new object to follow Sensor?

@fmrieke, @DavidBrainard, @xnieamo and I have begun to discuss changes to coneAbsorptions and coneAdapt. We think it may be worth creating a new object, downstream of the sensor object, that combines these aspects of the cone responses. The sensor object outputs isomerizations as a function of time, R*(t), and the new object could take R*(t) and output current as a function of time, pA(t).

The most difficult parts of this work have already been completed, in that Fred and @hjiang36 have designed models and written code for computing the cone responses in a number of different ways. This is a question about the architecture of isetbio: do we want to keep cone responses as part of the sensor object? In discussing this with David and Xiaomao, we thought this could be a good point to introduce a new object that handles cone responses ('coneOuterSegment' or 'photoTransductionCascade', perhaps).

Fred has a start on code for integrating eye movements with both the linear and nonlinear (differential equations) cone responses. He suggested creating a branch where he could post his code for us to explore.

We are open to suggestions and objections.

v_sceneExamples now fails, problem seems to be related to sceneGet(scene,'wave')

v_sceneExamples now crashes. It crashes when it tries to make the tungsten scene.

The problem seems to be that this code, in sceneCreate, produces an illuminant structure with one too many levels of indirection in the wavelength specification.

% Create the scene variable
if isempty(args) || length(args) < 2 || isempty(args{2})
    scene = initDefaultSpectrum(scene, 'hyperspectral');
else
    scene = sceneSet(scene, 'wave', args{2});
end
wave = sceneGet(scene, 'wave');

The returned wavelength is a structure with one field, wave. So to get the wavelength sampling, you have to ask for wave.wave. Code further below is then unhappy, because it expects the wavelength sampling to be accessed as just the variable wave. (That is what I would also expect.)

Any ideas of what change might have produced this new behavior?

[I did check out a fresh version of isetbio, because of Brian's note that he thought the DOS attack on gitHub may have screwed up his local copy. But that was not the problem.]

Merge coneTemp

The coneTemp branch has been merged into master. The only conflict was in the file coneAbsorptions. I hope I'm not breaking something.

I'll do more testing soon, and if you find any bugs there, let me know.
Thanks

HJ
