
hummingbird's People

Contributors

anatoli-ulmer, andyofmelbourne, cnettel, daurer, duaneloh, egorsobolev, ekeberg, filipemaia, gijsschot, idalundholm, jbthayer, kartikayyer, louisdoctor, mhantke, sellberg, toonggg, yellowsub17


hummingbird's Issues

Interface does not run at SLAC

When starting the interface at SLAC, I get this error:

Traceback (most recent call last):
  File "./hummingbird.py", line 32, in <module>
    interface.start_interface()
  File "/reg/neh/home2/benedikt/software/hummingbird/src/interface/__init__.py", line 14, in start_interface
    mw = Interface()
  File "/reg/neh/home2/benedikt/software/hummingbird/src/interface/interface.py", line 33, in __init__
    5554,'login'))
  File "/reg/neh/home2/benedikt/software/hummingbird/src/interface/data_source.py", line 14, in __init__
    self.connect()
  File "/reg/neh/home2/benedikt/software/hummingbird/src/interface/data_source.py", line 30, in connect
    self._ctrl_socket.connect(addr, self._ssh_tunnel)
  File "/reg/neh/home2/benedikt/software/hummingbird/src/interface/zmqsocket.py", line 42, in connect
    ssh.tunnel_connection(self._socket, addr, tunnel)
  File "/reg/g/psdm/sw/releases/ana-current/arch/x86_64-rhel6-gcc44-opt/python/zmq/ssh/tunnel.py", line 133, in tunnel_connection
    new_url, tunnel = open_tunnel(addr, server, keyfile=keyfile, password=password, paramiko=paramiko, timeout=timeout)
  File "/reg/g/psdm/sw/releases/ana-current/arch/x86_64-rhel6-gcc44-opt/python/zmq/ssh/tunnel.py", line 161, in open_tunnel
    tunnel = tunnelf(lport, rport, server, remoteip=ip, keyfile=keyfile, password=password, timeout=timeout)
  File "/reg/g/psdm/sw/releases/ana-current/arch/x86_64-rhel6-gcc44-opt/python/zmq/ssh/tunnel.py", line 201, in openssh_tunnel
    raise ImportError("pexpect unavailable, use paramiko_tunnel")
ImportError: pexpect unavailable, use paramiko_tunnel
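
Not a fix inside the interface itself, but the workaround the error message points at can be sketched as a backend probe. `pick_tunnel_backend` is a hypothetical helper; the `paramiko=True` keyword of zmq's `ssh.tunnel_connection` is, to my understanding, the documented way to select `paramiko_tunnel` when pexpect is missing:

```python
import importlib.util

def pick_tunnel_backend(find_spec=importlib.util.find_spec):
    """Decide which SSH tunnel backend to ask zmq for.

    zmq's default openssh_tunnel needs the optional pexpect package;
    when it is missing, calling ssh.tunnel_connection(..., paramiko=True)
    switches zmq to paramiko_tunnel instead.
    """
    if find_spec("pexpect") is not None:
        return "openssh"      # default backend will work as-is
    if find_spec("paramiko") is not None:
        return "paramiko"     # pass paramiko=True to tunnel_connection
    raise ImportError("neither pexpect nor paramiko is installed")
```

On the SLAC ana-current release, installing pexpect into the user environment would be the other obvious resolution.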


Skype_Agenda_2015_02_13

Meeting Agenda for Single Particle Imaging

1. Key analysis goals - extraction of reducedData:

A. How much have we reduced the background? Background level(s) vs. motor position --> "heatmap" of background level vs. motor position.

  • EPICS parameters for motors.
  • CSPAD photon scores using psana's dark-subtraction + median-subtraction routines. For each 2x1, measure six values: photon average, median, std. dev., NumPix>Thresh0, NumPix>Thresh1, NumPix>Thresh2.
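
The heatmap in A amounts to binning background levels by motor position. A minimal sketch, assuming events arrive as (motor position, background level) pairs (`background_vs_motor` and the bin width are illustrative, not project code):

```python
from collections import defaultdict

def background_vs_motor(events, bin_width=0.5):
    """Average background level per motor-position bin.

    `events` is an iterable of (motor_position, background_level)
    pairs; returns {bin_start: mean background in that bin}, i.e. one
    row of the background-vs-motor "heatmap".
    """
    sums = defaultdict(float)
    counts = defaultdict(int)
    for motor_pos, bg_level in events:
        b = int(motor_pos // bin_width)   # which motor-position bin
        sums[b] += bg_level
        counts[b] += 1
    return {b * bin_width: sums[b] / counts[b] for b in sums}
```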

B1. Level of noise and sufficiency for SPI-hitfinding (perhaps studied in offline mode):

  • Convert patterns into average photon pattern so we can relate to SPI scattering strength.
  • Loss of resolution of an equivalent powder pattern given measured background. Measuring the fluctuation of the background pattern (e.g. std. deviation) informs us how well we can detect signal photons that appear over the background.
  • Which "regions" on the detector can we use for hit-finding?
  • Simulated hit-finding -- what is the smallest particle that we can detect (with xxx confidence) given the current measured background?
  • Detection of spurious outlier patterns.

B2. What are the key photon features in the pattern? Persistent average pattern (with periodic writes to a single file?):

  • ADUs that are below Thresh0.
  • ADUs between Thresh0 and Thresh1.
  • ADUs between Thresh1 and Thresh2.
  • ADUs above Thresh2.
  • Strong background/detector artifacts.
  • Identify dead pixels on the fly. py-psana's dark calibration already does this to some extent.
  • For the future: adaptive thresholding?
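
Sorting ADU values into the four bands above is a one-liner once the thresholds are ordered. A sketch (`adu_band` is a hypothetical helper, and the threshold values are placeholders, not calibrated numbers):

```python
import bisect

def adu_band(adu, thresholds=(10.0, 50.0, 200.0)):
    """Classify a pixel's ADU value into one of the four bands.

    thresholds = (Thresh0, Thresh1, Thresh2).  Returns 0 for below
    Thresh0, 1 for [Thresh0, Thresh1), 2 for [Thresh1, Thresh2),
    3 for at or above Thresh2.
    """
    return bisect.bisect_right(list(thresholds), adu)
```

Adaptive thresholding would then amount to updating the `thresholds` tuple from running statistics rather than fixing it in advance.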

C. Setting up a framework that will become useful for hit-finding in June.

  • How to integrate structure-aware hit-finding?
  • How to integrate simulation modules into data stream (e.g. sizing, sphericity etc).
  • We can discuss this in detail over time.

2. Py-psana framework and backbone-code for accessing DAQ datastream.

A. Should the backbone code be:

  • like Cheetah, where a main thread passes events to worker threads, or
  • have independent worker threads that separately access the datastream and write to their own time-tagged log files?

B. How similar should the online and offline analysis codes be?

  • Are the psana library and its dependencies portable across different machines?

C. Paired programming model:

  • "One person" works on the data-reduction code, "another person" works on data analysis. This way, we don't have to re-access XTCs just to rerun/debug analyses on reduced data.

3. Cheetah’s role:

A. Output pixel-wise histogramming.

  • Pixel-wise average and standard deviation.
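
Pixel-wise average and standard deviation can be accumulated in a single pass with Welford's streaming update, which avoids holding every frame in memory. A per-pixel sketch (not Cheetah's actual implementation; in practice this would run vectorized over the whole detector array):

```python
class PixelStats:
    """Streaming mean and standard deviation for one pixel (Welford)."""

    def __init__(self):
        self.n = 0        # number of frames seen
        self.mean = 0.0   # running mean
        self.m2 = 0.0     # running sum of squared deviations

    def update(self, x):
        self.n += 1
        delta = x - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (x - self.mean)

    def std(self):
        # Population standard deviation over the frames seen so far.
        return (self.m2 / self.n) ** 0.5 if self.n else 0.0
```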

B. Convert statistical outlier patterns to CXIDB format.

  • For the March 2015 beamtime this would be useful to identify streaks in the background (correlate with motor positions?) which might be missed if we only stared at histograms.

C. Photon-based hitfinding using running background subtraction?

  • Help find surprises in the background (e.g. streaking, or that we hit a fixed target!)

4. Feature request for AMI:

A. Histograms of ADU counts for certain pixels vs EPICS parameter. The cxiopr could use this for fast feedback for optimal motor positioning.

5. Offline (or quick-offline) analyses:

A. Pixel-by-pixel pedestal + gain calibration off Cheetah output?

B. Probability mass in one-photon peak vs that in zero-photon peak.

C. False positive hit rates given pixel-by-pixel histogram statistics and simulated scattering pattern.

6. Additional questions:

A. How many processing streams should we have during online analysis?

B. How are dark-cals computed in py-psana?

C. Matching simulated diffraction patterns? (e.g. diffraction pattern of fabricated shape)

D. Test data streams for us to play with before March beamtime.

E. Who will be going to the beamtime?

F. How should this document be shared with the Initiative?


analysis - Counting photons

A simple algorithm that integrates the signal on the detector in specified ROIs (rectangle or provided mask). If possible, implement an interactive feature in the interface to define the ROI (e.g. using a polygon) on top of images from the buffer.
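
The integration step itself is straightforward. A minimal sketch, assuming the image is a 2D array of per-pixel signal and the ROI comes in as either a boolean mask or a (row0, row1, col0, col1) rectangle (`integrate_roi` is a hypothetical name; a real implementation would use numpy arrays):

```python
def integrate_roi(image, mask=None, rect=None):
    """Sum detector signal inside an ROI.

    `image` is a list of rows; `rect` is (row0, row1, col0, col1) with
    half-open bounds; `mask` is a same-shaped boolean list of rows.
    """
    if rect is not None:
        r0, r1, c0, c1 = rect
        return sum(sum(row[c0:c1]) for row in image[r0:r1])
    if mask is not None:
        return sum(v for row, mrow in zip(image, mask)
                     for v, m in zip(row, mrow) if m)
    raise ValueError("provide either mask or rect")
```

An interactive polygon in the GUI would only need to rasterize the polygon into such a mask once and reuse it for every frame.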


Segfault when exiting interface

When closing the interface after I have listened to and received signals via ZMQ (even if I stopped listening before closing), I get a segfault.

Closing
[psnxserv01:05428] *** Process received signal ***
[psnxserv01:05428] Signal: Segmentation fault (11)
[psnxserv01:05428] Signal code: Address not mapped (1)
[psnxserv01:05428] Failing at address: 0x40
[psnxserv01:05428] [ 0] /lib64/libpthread.so.0[0x32b960f710]
[psnxserv01:05428] [ 1] /reg/g/psdm/sw/external/qt/4.8.5/x86_64-rhel6-gcc44-opt/lib/libQtCore.so.4(_ZNK7QObject6threadEv+0x4)[0x7fcff6b846e4]
[psnxserv01:05428] [ 2] /reg/g/psdm/sw/releases/ana-current/arch/x86_64-rhel6-gcc44-opt/python/PyQt4/QtGui.so(+0x41e13e)[0x7fcff7fe213e]
[psnxserv01:05428] [ 3] /reg/g/psdm/sw/releases/ana-current/arch/x86_64-rhel6-gcc44-opt/python/sip.so(+0x76be)[0x7fcff86446be]
[psnxserv01:05428] [ 4] /reg/g/psdm/sw/releases/ana-current/arch/x86_64-rhel6-gcc44-opt/python/sip.so(+0x7709)[0x7fcff8644709]
[psnxserv01:05428] [ 5] python[0x464c51]
[psnxserv01:05428] [ 6] python[0x42068b]
[psnxserv01:05428] [ 7] python(PyDict_Clear+0x142)[0x448b42]
[psnxserv01:05428] [ 8] python[0x448b59]
[psnxserv01:05428] [ 9] python[0x4d03c8]
[psnxserv01:05428] [10] python(PyGC_Collect+0x24)[0x4d0994]
[psnxserv01:05428] [11] python(Py_Finalize+0xf6)[0x4bcd76]
[psnxserv01:05428] [12] python[0x4bc83c]
[psnxserv01:05428] [13] python(PyErr_PrintEx+0x1a5)[0x4bca85]
[psnxserv01:05428] [14] python(PyRun_SimpleFileExFlags+0x12a)[0x4bd6aa]
[psnxserv01:05428] [15] python(Py_Main+0xa95)[0x414e65]
[psnxserv01:05428] [16] /lib64/libc.so.6(__libc_start_main+0xfd)[0x32b8e1ed5d]
[psnxserv01:05428] [17] python[0x413ff9]
[psnxserv01:05428] *** End of error message ***
Segmentation fault
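
The backtrace (PyGC_Collect inside Py_Finalize, reaching into QtGui via sip) suggests Qt/sip objects being destroyed in arbitrary garbage-collection order during interpreter shutdown. A common mitigation, sketched here as a generic helper rather than a confirmed fix for this crash, is to tear down sockets and windows explicitly in a known order before finalization:

```python
import atexit

class TeardownStack:
    """Run cleanup callbacks in reverse registration order at exit.

    Registering socket.close / window.close callbacks here gives a
    deterministic teardown order, instead of leaving ZMQ and Qt
    objects to be collected in arbitrary order during Py_Finalize.
    """

    def __init__(self):
        self._callbacks = []
        atexit.register(self.close_all)

    def register(self, cb):
        self._callbacks.append(cb)

    def close_all(self):
        while self._callbacks:
            self._callbacks.pop()()   # LIFO: last-created closed first
```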


ZMQ passing

What is the intended use case for sending data to the GUI?
Looking at the current framework we have:

Data flow:

#!python
    Data
      |
    LCLSTranslator
         \
       EventTranslator
         /
    LCLSTranslator
      |
    ipc.set_current_event --> broadcast.set_current_event
      |
    conf.onEvent
         \
       analysis.function
         /
    conf.onEvent   
    ...

Should I be calling ZMQ from the "onEvent" function through the broadcast module?

#!python
    Data
      |
    LCLSTranslator
         \
       EventTranslator
         /
    LCLSTranslator
      |
    ipc.set_current_event --> broadcast.set_current_event
      |
    conf.onEvent
         \
       analysis.function
         /
    conf.onEvent  --> ipc.set_data --> broadcast.set_data --> GUI
    ...
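
The second diagram can be stubbed out in a few lines to make the question concrete. Names below mirror the modules in the diagram (`ipc.set_data`, `broadcast`, `conf.onEvent`); the list standing in for the PUB socket and the `nrPhotons` key are illustrative, not the project's actual API:

```python
# Minimal stub of the second data-flow diagram.  In hummingbird the
# broadcast step would publish over a ZMQ PUB socket; here a list
# records what would be sent to the GUI.

class Broadcast:
    def __init__(self):
        self.sent = []                    # stand-in for a PUB socket

    def set_data(self, key, value):
        self.sent.append((key, value))

broadcast = Broadcast()

def set_data(key, value):                 # plays the role of ipc.set_data
    broadcast.set_data(key, value)

def on_event(event):                      # plays the role of conf.onEvent
    n_photons = sum(event["detector"])    # some analysis.function
    set_data("nrPhotons", n_photons)      # forwarded toward the GUI
```

So yes, under this reading the configuration's onEvent would call an ipc-level setter, and only the broadcast module would touch ZMQ.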

Allow users to control the update rate

Allow users to control the rate at which different plots are published from the frontend. One possibility is for the frontend to "request" plots it wants to show.
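
Either variant (a fixed per-plot interval set from the frontend, or explicit requests) can sit behind the same gate on the backend. A sketch under those assumptions (`PlotThrottle` is a hypothetical name; the injectable clock is only there to keep it testable):

```python
import time

class PlotThrottle:
    """Publish each named plot at most once per `interval` seconds.

    The frontend could raise or lower `interval` per plot, or a
    request-driven variant could reset the timestamp on demand.
    """

    def __init__(self, interval=1.0, clock=time.monotonic):
        self.interval = interval
        self.clock = clock
        self._last = {}   # plot name -> time of last publish

    def should_publish(self, plot_name):
        now = self.clock()
        if now - self._last.get(plot_name, float("-inf")) >= self.interval:
            self._last[plot_name] = now
            return True
        return False
```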

Sort data sources in interface

I could not spot a proper place in the code to trigger sorting of the data sources in the tables, nor how to deal with sources that appear and disappear. When should we resort?
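
One way to sidestep the "when to resort" question is to keep the source list sorted at insertion time, so appearing and disappearing sources never require a full resort. A sketch (`SortedSources` is a hypothetical helper, not existing interface code):

```python
import bisect

class SortedSources:
    """Maintain data-source names in sorted order incrementally."""

    def __init__(self):
        self._names = []

    def add(self, name):
        i = bisect.bisect_left(self._names, name)
        if i == len(self._names) or self._names[i] != name:
            self._names.insert(i, name)   # insert at its sorted slot

    def remove(self, name):
        i = bisect.bisect_left(self._names, name)
        if i < len(self._names) and self._names[i] == name:
            self._names.pop(i)

    def names(self):
        return list(self._names)
```

The table would then just rebuild its rows from `names()` whenever a source is added or removed.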

Message in ImageWindow is not properly changing with the content when browsing the buffer

I attempted to fix this by introducing a new ringbuffer for strings (in this case the messages), but I could not properly sync this with the main ringbuffer. I guess, having both images and messages in the same buffer makes sense. However, one problem here is, that the browsing feature of the image plot is an inherent feature of pyqtgraph. I would not how to integrate updating the messages into this, but maybe there is an easy solution for this?
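
The single-buffer idea can be sketched simply: store each image together with its message as one entry, so one index drives both. `PairedRingBuffer` is a hypothetical helper; hooking the message half into pyqtgraph's frame browsing (e.g. via its index-changed signal) is the part that would still need investigation:

```python
from collections import deque

class PairedRingBuffer:
    """Ring buffer of (image, message) pairs.

    Keeping both halves in one buffer means browsing can never drift
    out of sync: whatever index selects the image also selects its
    message.
    """

    def __init__(self, maxlen=100):
        self._buf = deque(maxlen=maxlen)  # oldest entries drop off

    def append(self, image, message):
        self._buf.append((image, message))

    def __getitem__(self, i):
        return self._buf[i]               # (image, message) pair

    def __len__(self):
        return len(self._buf)
```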

git clone

$ git clone [email protected]:spinitiative/hummingbird.git

doesn't work for me. I have my Bitbucket username set as my global git user.name, but no dice...

amorgan@amorgan-desktop:~/Physics/git_repos$ git clone [email protected]:spinitiative/hummingbird.git
Cloning into 'hummingbird'...
Warning: Permanently added the RSA host key for IP address '131.103.20.167' to the list of known hosts.
Permission denied (publickey).
fatal: Could not read from remote repository.

Please make sure you have the correct access rights
and the repository exists.

I used:
git clone https://[email protected]/spinitiative/hummingbird.git
successfully.


Clean up Image Window

The Image Window has become a bit of a mess. Clean it up so we can continue developing it.

