zapit-optostim / zapit
General purpose optostimulation system
License: GNU Lesser General Public License v2.1
e.g. if no input args should it present a random stimulus? What input options are needed?
Cache and write tests to ensure it works as expected.
see:
function startTesting(obj, trialPower, Point2, laserPower)
% take the coordinates of two points [x and y coords], and switch the laser between
% them at freqLaser for pulseDuration seconds, holding it at a given point for tOpen ms
% inputs: obj.chanSamples,
% CoordNum,
% LaserOn
% powerOption - if 1 send 2 mW, if 2 send 4 mW (mean)
Somehow we need to alter how this works. It's also not obvious, unless you read this method, what powerOption actually is.
How was @majaskret testing laser power before? It looks like it was done during scanning. Is that so? I am thinking of something more simple. Will it work? Why was the more complicated thing done before?
See lsmacq, which directly calls NI-DAQmx functions through MATLAB's .NET interface. People then need only install .NET along with DAQmx and we don't need ScanImage.
Calibrate with a piece of paper using a predefined grid. Show user the results so they know the thing worked.
Then before experiment load the points, scan through them and record beam position, display intended and actual positions.
So the user only re-calibrates if they need to.
The .scan and .light fields in the structure. One waveform is enough (84f83b8). The lghtChnl matrix may have stuff in it that we never use.
Once it's all running, the pointer (API) should be separated from the GUI components. That way we can finish work on the API and even provide API-only instructions so people can get started before a mature GUI is ready.
e.g. is it only the code folder, or are the examples and development folders in there too?
Existing system only delivers 4 mW max. Why?
ML tells me that it's important for laser power to ramp down at the end of a trial, and this must happen even if the trial ends early. This has supposedly been implemented but I cannot see how. There is a rampDown property in beamPointer but nothing uses it and it's an orphan.
I see in beamPointer.stopInactivation that there is:
voltChannel(:,1:2) = obj.chanSamples.light(:,[1 1],1); % just zeros
voltChannel(:,3:4) = obj.chanSamples.light(:,[1 1],1); % just zeros
obj.hTask.writeAnalogData(voltChannel);
But we are running at 1 kHz (it seems) so that is unlikely to be a smooth ramp down. Have to ask @majaskret
Document the new process. Everything from the start: how to set up the laser.
Border pixel setting for calibration does not seem to work as intended. It is not symmetrical: one side is bigger than the other.
It is becoming hard to work on many features without hardware now, so being able to simulate sufficiently to develop is becoming a concern. It's a lot of work, but it'll pay off...
I have beamPointer2.m and beamPointer_constantLight.m, in addition to beamPointer.m. What do the first two do? Establish how to get rid of them. ML says he only ever uses beamPointer.
In stimConfig.get.chanSamples we wait 1 ms to turn on the laser after the beam should have arrived at the correct location. Should verify that this is sufficient.
Around line 284
% allow 1 ms around halfcycle change to be 0 (in case scanners are not in the right spot)
MASK = ones(1,obj.numSamplesPerChannel);
sampleInterval = 1/obj.parent.DAQ.samplesPerSecond;
nSamplesInOneMS = 1E-3 / sampleInterval;
The waveforms being sent to the scanners are 2 cycles long. Maybe more. If we have just one cycle, then we can get a smoother decrease in laser power when we ramp down without having to resort to doing anything smarter than just changing the amplitude of the whole buffer to a single value.
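A minimal sketch of that amplitude-scaling idea, reusing the obj.chanSamples.light and obj.hTask names from the snippets above. The number of ramp steps and the pause duration are placeholder guesses, not values from the codebase:

```matlab
% Hypothetical ramp-down: rescale the whole one-cycle light waveform in steps.
% obj.chanSamples.light and obj.hTask follow the snippets above; nSteps and
% the pause are placeholders.
lightWaveform = obj.chanSamples.light(:,:,1);
nSteps = 20;
for scaleFactor = linspace(1, 0, nSteps)
    obj.hTask.writeAnalogData(lightWaveform * scaleFactor);
    pause(50E-3)  % ~1 s total ramp-down
end
```

With a one-cycle buffer each rescaled write is a clean cycle, so the power falls smoothly without any per-sample cleverness.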
There is a memory leak that happens seemingly at random. I don't know what triggers it or where it's happening. Last time it happened was after sending samples. One possibility is that it is related to the camera. If so, a workaround is to disable the camera feed during experiments:
>> hZP.cam.stopVideo
>> hZP.cam.startVideo
Whether the camera is running can be determined by hZP.cam.isrunning.
If you experience this issue, please reply here on whether the above works.
If the delay between frames is too short, the scanner calibration data can be bad. We have no way to test this automatically or to show it to the user.
Is DAQ.moveBeamXY in the right place? Should it perhaps be in pointer? setLaserInMW is in pointer...
It will be based on this: https://github.com/raacampbell/AllenAtlasTopDown
The code in controller.drawROI_Callback (f3989cc) is correct:
% Disable button until ROI has been drawn
obj.ROIButton.Enable='off';
obj.model.cam.stopVideo
% Draw box and get coords
imSize = obj.model.imSize;
borderPix = 30;
defaultPos = [borderPix/2, ...
              borderPix/2, ...
              imSize(1)-borderPix, ...
              imSize(2)-borderPix];
roi = images.roi.Rectangle('Parent',obj.hImAx,'Position',defaultPos);
roi.Label='Adjust then double-click';
% The only way I can find to move the label to the centre
roi.RotationAngle=1E-10;
But it spawns a new figure window. At the command line, this does not:
uax = uiaxes;
uax.XLim=[-10,110];
uax.YLim=[10,110]
roi = images.roi.Rectangle('Parent',uax,'Position',[10,10,80,80])
So I'm confused.
Need to get a minimal example that works. e.g. I know that beamPointer.sendSamples expects a structure as an input, but the example is bp.sendSamples(x(ii), 1); where x(ii) is a random number.
It is easier to encourage users to upgrade if they don't have any settings at all in the project folder. If they have these, then only a git client can easily upgrade; otherwise anyone can download a zip and unpack it. Plus, if the package is uploaded to the FEX, users can upgrade via that. Should explore this, as it would make updating a lot easier. If we move to this model then development will have to be more careful to avoid over-use of the master branch.
This is because the GUI is being drawn in the background. How to monitor it to know when it has been built?
Closing the window should not produce error messages.
The checks are not up to date.
Need to document the process for entering correct values for all critical settings like settings.camera.micronsPerPixel, settings.scanners.voltsPerPixel, and settings.laser.laserMinMax_mW. Then choose sensible default values in default_settings.m.
Use the XData and YData input args to imagesc, plus the number of microns per pixel, to plot the image in mm. That way it will be much easier to handle the display code for the sample transform.
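A sketch of the idea. The image and the micronsPerPixel value here are stand-ins (in the real code the value would come from settings.camera.micronsPerPixel):

```matlab
% Plot a camera image in mm by supplying XData/YData to imagesc.
% 'im' and 'micronsPerPixel' are placeholders for the real frame and setting.
micronsPerPixel = 19.3;          % placeholder; e.g. settings.camera.micronsPerPixel
im = rand(1024, 1280);           % stand-in for a camera frame
xDataMM = [1, size(im,2)] * micronsPerPixel * 1E-3;
yDataMM = [1, size(im,1)] * micronsPerPixel * 1E-3;
imagesc(xDataMM, yDataMM, im)
axis equal tight
xlabel('mm'), ylabel('mm')
```

All downstream overlay code (points, transforms) can then work directly in mm instead of pixels.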
See beamPointer. Finish implementing createNewTask2 (or decide to ditch it as suggested by Issue #2)
The user may well need to know which locations are associated with which index in the table. So we need a nice plot or table to bring that up.
There seems to be a lot of task creation and deletion happening. At the moment (d267d90) sendSamples works only the first time. After that it hangs and does not work. Need to look into exactly who is using the DAQ and when, and formalise all this. There are probably two modes: the first is unclocked, for set up; the second is for clocked stuff.
The first case is created in the constructor at the moment. It should have its own method and task name. In the second case, I don't understand why the task needs to be re-created each time sendSamples is run.
EDIT: there is also startTesting, which is used, I think, for measuring laser power. It also kills the task and creates its own.
Curve not linear with cheapo laser. Although I have fits that should convert from mW to control values based on sensor values, these are failing near zero. I ask for 0 mW but still get quite a bit of power. For now I therefore hard-code an if statement to set power to zero if the user asks for this (9d85c53).
I think the fits should be constrained to go through zero. Maybe I need to re-do the fit with the control values re-scaled according to the max/min of a power meter, then run one conversion instead of the two-conversion weirdness I do now.
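One way to constrain the fit through zero (a sketch, assuming the Curve Fitting Toolbox): use a custom model with no constant term, so 0 control input maps to 0 mW by construction. The data here is fabricated for illustration:

```matlab
% Hypothetical re-fit constrained through the origin: dropping the constant
% term means the curve must pass through (0,0). controlVals and powerInMW
% are placeholder calibration data, not real measurements.
controlVals = linspace(0, 1, 20).';
powerInMW = 4*controlVals.^2 + 0.5*controlVals;       % fake sensor readings
ft = fittype('a*x^2 + b*x');                          % no intercept term
f = fit(controlVals, powerInMW, ft, 'StartPoint', [1, 0]);
```

This avoids the hard-coded zero special case, since f(0) is exactly 0.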
We are close, though, regarding the commands themselves.
The calibration seems not to be working since about commit f3989cc. There is a transform present, it's just not very good. i.e. it has improved compared to before calibration. This is very weird. It looks to me like a scaling issue.
There was code for manually specifying an area that can be hit by the laser on a control trial. This has been removed (3078e3b) as it no longer will work. It can be added back into the GUI if needed later.
The image in the camera is currently inverted. We need a way of handling this.
At the moment there is no limit to the number of backup files that will be kept. Should fix this.
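A possible fix (a sketch only; the file pattern, settingsDir variable, and maxBackups value are all guesses, not names from the codebase):

```matlab
% Hypothetical pruning of old backup files, keeping only the newest maxBackups.
maxBackups = 10;                                       % placeholder limit
backups = dir(fullfile(settingsDir, '*_backup*.mat')); % placeholder pattern
if length(backups) > maxBackups
    [~, ind] = sort([backups.datenum], 'descend');     % newest first
    toDelete = backups(ind(maxBackups+1:end));
    for ii = 1:length(toDelete)
        delete(fullfile(toDelete(ii).folder, toDelete(ii).name))
    end
end
```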
At the moment we load the grid .mat files and three different variables appear. There should be one structure. Look into the formats of these variables too: one is a matrix and it's not obvious what it represents.
The beamPointer class contains three versions of createNewTask (createNewTask, createNewTask2, createNewTask3) and it's not clear why, although they do differ in terms of whether or not there is regen. The first is called internally by the class but the third is used in the README example.
Need to be able to hardware trigger after sendSamples rather than just play right away (see also Issue #44)
I don't understand the following in runAffineTransform:
if ~isempty(obj.transform)
% check if there are existing transformations already
numTform = size(obj.transform)+1;
obj.transform(numTform) = tform;
else
obj.transform = tform;
end
The comments state:
% it can be run repeatedly with each new mouse and it doesn't
% require scaling from the start (new transformation matrices
% are added on top of existing ones in function pixelToVolt)
But then in pixelToVolt we see:
if ~isempty(obj.transform)
for tformMat = 1:length(obj.transform)
[xPos, yPos] = transformPointsInverse(obj.transform(tformMat), xPos, yPos);
end
end
@majaskret is it not the case that xPos and yPos are simply based upon whatever the last transform was? Previous transforms are never reused, so shouldn't we just overwrite and always keep the most recent?
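If overwriting is indeed the right behaviour, the simplification could look like this (a sketch only, pending @majaskret's confirmation):

```matlab
% Hypothetical simplification: keep only the most recent calibration.
% In runAffineTransform, overwrite rather than append:
obj.transform = tform;

% ...and pixelToVolt then applies a single transform, no loop:
if ~isempty(obj.transform)
    [xPos, yPos] = transformPointsInverse(obj.transform, xPos, yPos);
end
```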
Can't neatly change laser power during clocked events. So either grey out the slider during these or actually make this possible (more work for not much payoff).
>> round(actualPixelCoords)
ans =
136 260 <--
136 260 <--
125 652
340 69
333 264
327 461
321 658
316 854
538 74
533 269 <--
533 269 <--
521 664
515 861
737 80
731 275 <--
731 275 <--
721 669
714 866 <--
714 866 <--
930 281 <--
930 281 <--
919 676
919 676
1127 287 <--
1127 287 <--
1117 679
These should not be there! They are duplicates and are actually incorrect values that are messing up the transform.
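As a guard, the duplicate rows could be filtered before fitting the transform (a sketch; actualPixelCoords is the matrix printed above):

```matlab
% Drop duplicate coordinate pairs before building the transform.
% 'stable' keeps the original row order; rounding matches the printout above.
[uniqueCoords, ia] = unique(round(actualPixelCoords), 'rows', 'stable');
if size(uniqueCoords,1) < size(actualPixelCoords,1)
    fprintf('Dropped %d duplicate calibration points\n', ...
        size(actualPixelCoords,1) - size(uniqueCoords,1))
end
actualPixelCoords = actualPixelCoords(ia,:);
```

Better still would be to find why the duplicates are recorded in the first place, since they are incorrect values rather than harmless repeats.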
The skeleton is here: 84320fd But the API does not reference the GUI so the drop down is not active for now.
The beamPointer class should simply implement the logic of controlling the hardware. At the moment it also contains input commands that ask the user for settings. These need to be removed, likely replaced by a GUI, and the whole thing refactored into a model/view framework.
There are two Arduinos in the existing setup. One can almost certainly go, as it is only there because the laser AI port was not switched to the correct impedance setting.
The second Arduino (Due) is being used to implement the laser ramp down. It could be possible to have a ramp down happen via DAQmx: https://forums.ni.com/t5/Multifunction-DAQ/Ramp-down-a-signal-before-stopping/m-p/4254367/highlight/false#M102874
Let's try. There are references to Arduinos in beamPointer so maybe changes to the code are needed. Confirm with @majaskret exactly what can be removed.
Rename. But this will mean altering code that uses the API.