libamtrack / library
Core library code
License: GNU General Public License v3.0
Version should be extracted from the Git tag
(pyamtrack1) grzanka@grzanka-VirtualBox:~$ pip install pyamtrack
Collecting pyamtrack
Downloading https://files.pythonhosted.org/packages/27/ad/2c0e74c8b53fe8aa1e2c87b4002fb6b816bf83541c384c9c45551c7cdf3a/pyamtrack-0.1.1-py3-none-manylinux1_x86_64.whl (4.8MB)
|████████████████████████████████| 4.8MB 1.1MB/s
Collecting cffi>=1.13.0
Downloading https://files.pythonhosted.org/packages/16/cd/1f4ddf6be8300713c676bb9f3a2d3b8eb8accc0a6a24f57d4f6c4cd59d34/cffi-1.13.2-cp37-cp37m-manylinux1_x86_64.whl (398kB)
|████████████████████████████████| 399kB 53.8MB/s
Processing ./.cache/pip/wheels/f2/9a/90/de94f8556265ddc9d9c8b271b0f63e57b26fb1d67a45564511/pycparser-2.19-py2.py3-none-any.whl
Installing collected packages: pycparser, cffi, pyamtrack
Successfully installed cffi-1.13.2 pyamtrack-0.1.1 pycparser-2.19
(pyamtrack1) grzanka@grzanka-VirtualBox:~$ python
Python 3.7.6 (default, Jan 8 2020, 19:59:22)
[GCC 7.3.0] :: Anaconda, Inc. on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> from pyamtrack import libAT
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/home/grzanka/.conda/envs/pyamtrack1/lib/python3.7/site-packages/pyamtrack/libAT.py", line 2, in <module>
from .lib import _libAT
ImportError: cannot import name '_libAT' from 'pyamtrack.lib' (/home/grzanka/.conda/envs/pyamtrack1/lib/python3.7/site-packages/pyamtrack/lib/__init__.py)
>>>
(pyamtrack1) grzanka@grzanka-VirtualBox:~$ python -V
Python 3.7.6
tested with Anaconda
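One way to debug an ImportError like the one above is to list what the installed package actually ships; the compiled `_libAT` extension may simply be missing from the wheel. A minimal stdlib-only sketch (demonstrated on the stdlib `json` package, since a broken `pyamtrack` install may not import at all; the helper name is mine):

```python
import pkgutil
import json

def list_submodules(package):
    # Enumerate the submodules a package actually provides on disk.
    # Applied to pyamtrack.lib, this would reveal whether the compiled
    # _libAT extension module was packaged into the wheel at all.
    return sorted(name for _, name, _ in pkgutil.iter_modules(package.__path__))

print(list_submodules(json))
```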
http://ftpmirror.gnu.org as described in https://www.gnu.org/prep/ftp.html
I.e. *.o files.
See also
https://github.com/github/gitignore/blob/master/C.gitignore
After the migration to Git, creation of the R package may fail.
It looks like some scripts still rely on SVN to extract the version number:
https://github.com/libamtrack/library/blob/master/distributions/R/scripts/R.add.metainfo.R
See Thomas Bortfeld and Wolfgang Schlegel 1996 Phys. Med. Biol. 41 1331
https://doi.org/10.1088/0031-9155/41/8/006
It looks like the material is already specified; the method would simply rescale by its density.
It looks like the libtool binary has moved to the libtool-bin package in Ubuntu 16.04.
Fix READMEs, compilation instructions, etc.
See Thomas Bortfeld and Wolfgang Schlegel 1996 Phys. Med. Biol. 41 1331
https://doi.org/10.1088/0031-9155/41/8/006
First, #22 needs to be implemented.
Note that part of it is already there in AT_NumericalRoutines.h:
/**
* Computes the convolution of a term (R0 - z)^(ni - 1) with a Gaussian
* in z with variance sigma^2, i.e.
* F(z, R0) = 1/(2*pi*sigma) * int_{-inf}^{R0}[ (R0 - z)^(ni - 1) * exp(-(z - z')^2/(2*sigma^2)) * dz']
* that can be solved using the gamma function and the parabolic cylinder function:
* F(z, R0) = 1/(2*pi*sigma) * exp(-(R0 - z)^2/(4*sigma^2)) * sigma^ni * gamma(ni) * D[-ni](-(R0-z)/sigma)
* where D[-ni] is the parabolic cylinder function of order -ni
*
* The procedure is elucidated in Bortfeld, 1997, An analytical approximation of the Bragg curve for therapeutic
* proton beams, Med. Phys. 24(12), 2024ff., Appendix A, Eqs. A1, A6
*
* This function uses gamma_ and AT_Dyx.
*
* Cave: Be careful to give the correct ni (not ni - 1)!
*
* @param[in] z
* @param[in] R0
* @param[in] sigma
* @param[in] ni
* @return funs
*/
double AT_range_straggling_convolution( const double z,
const double R0,
const double sigma,
const double ni);
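The identity in the header can be checked numerically with nothing beyond the standard library, using a standard integral representation D[-ni](x) = exp(-x^2/4)/gamma(ni) * int_0^inf t^(ni-1) * exp(-t^2/2 - x*t) dt, valid for ni > 0. The 1/(2*pi*sigma) prefactor appears on both sides and cancels, so it is omitted below. A sketch (function names are mine, not libamtrack's; ni >= 1 assumed to avoid the endpoint singularity):

```python
import math

def trapz(f, a, b, n=200000):
    # Composite trapezoid rule.
    h = (b - a) / n
    s = 0.5 * (f(a) + f(b))
    for i in range(1, n):
        s += f(a + i * h)
    return s * h

def pcf_D_neg(ni, x):
    # Parabolic cylinder function D[-ni](x) for ni >= 1, via its
    # integral representation.
    f = lambda t: t ** (ni - 1.0) * math.exp(-0.5 * t * t - x * t)
    upper = 30.0 + 10.0 * max(0.0, -x)  # integrand is negligible beyond this
    return math.exp(-0.25 * x * x) / math.gamma(ni) * trapz(f, 0.0, upper)

def F_direct(z, R0, sigma, ni):
    # Left-hand side: (R0 - z')^(ni-1) convolved with the Gaussian up to R0.
    f = lambda zp: (R0 - zp) ** (ni - 1.0) * math.exp(-(z - zp) ** 2 / (2.0 * sigma ** 2))
    return trapz(f, min(z, R0) - 40.0 * sigma, R0)

def F_closed(z, R0, sigma, ni):
    # Right-hand side: sigma^ni * gamma(ni) * exp(-(R0-z)^2/(4*sigma^2)) * D[-ni](-(R0-z)/sigma)
    a = (R0 - z) / sigma
    return sigma ** ni * math.gamma(ni) * math.exp(-0.25 * a * a) * pcf_D_neg(ni, -a)

print(F_direct(1.0, 3.0, 0.5, 2.0), F_closed(1.0, 3.0, 0.5, 2.0))
```

Both evaluations agree to several digits, which also confirms the sign and the square in the exponential of the closed form.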
From https://edoc.ub.uni-muenchen.de/20516/1/Bellinzona_Valentina.pdf
[40] W. Ulmer and B. Schaffner. Foundation of an analytical proton beamlet model for inclusion in a general proton dose calculation system. Radiation Physics and Chemistry, 80(3):378-389, 2011.
Fresh windows with R Studio and R installed:
> install.packages('libamtrack', repos='http://R-Forge.R-project.org')
WARNING: Rtools is required to build R packages but is not currently installed. Please download and install the appropriate version of Rtools before proceeding:
https://cran.rstudio.com/bin/windows/Rtools/
Installing package into ‘C:/Users/Legion/Documents/R/win-library/3.6’
(as ‘lib’ is unspecified)
Package which is only available in source form, and may need
compilation of C/C++/Fortran: ‘libamtrack’
These will not be installed
Currently implemented formula:
const double b1_g_cm2 = 0.2335;
const double b2 = 1.209;
const double b3 = 1.78e-4;
const double b4 = 0.9891;
const double b5 = 3.01e-4;
const double b6 = 1.468;
const double b7 = 1.18e-2;
const double b8 = 1.232;
const double b9 = 0.109;
// constants...
*a1_g_cm2 = b1_g_cm2*average_A / pow(average_Z,b2);
*a2 = b3*average_A;
*a3 = b4 - b5*average_Z;
*a4 = b6 - b7*average_Z;
*a5 = b8 / pow(average_Z,b9);
In the original paper we find a different equation for a2:
See #22 (comment)
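For a quick numeric sanity check, the implemented parametrization can be transliterated to Python; the b-constants are copied from the C snippet above, while average_A = 16 and average_Z = 8 below are purely illustrative inputs, not values taken from libamtrack:

```python
def range_energy_coefficients(average_A, average_Z):
    # Transliteration of the C snippet above; b-constants as implemented.
    b1_g_cm2, b2, b3 = 0.2335, 1.209, 1.78e-4
    b4, b5, b6, b7 = 0.9891, 3.01e-4, 1.468, 1.18e-2
    b8, b9 = 1.232, 0.109
    a1_g_cm2 = b1_g_cm2 * average_A / average_Z ** b2
    a2 = b3 * average_A
    a3 = b4 - b5 * average_Z
    a4 = b6 - b7 * average_Z
    a5 = b8 / average_Z ** b9
    return a1_g_cm2, a2, a3, a4, a5

print(range_energy_coefficients(16.0, 8.0))
```

Comparing these outputs against the equation in the original paper would make the a2 discrepancy concrete.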
A list of functions needed for cBinder project.
It seems that PSTAR and libdedx use cubic splines instead of linear interpolation. See comments here libamtrack/web#87 (comment) and below.
Because of this, the results obtained from libamtrack and from PSTAR/libdedx may differ by up to a few percent.
It would be worth checking whether a C/C++ replacement exists for the CERNLIB functions we use in libamtrack.
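The size of such interpolation effects is easy to reproduce: sample a smooth, rapidly falling stopping-power-like curve on a coarse grid and compare linear interpolation at the midpoints against the exact values. A self-contained sketch (the 1/E curve and the grid are illustrative stand-ins, not the actual PSTAR tables):

```python
def lin_interp(xs, ys, x):
    # Piecewise-linear interpolation on a sorted grid.
    for i in range(len(xs) - 1):
        if xs[i] <= x <= xs[i + 1]:
            t = (x - xs[i]) / (xs[i + 1] - xs[i])
            return (1.0 - t) * ys[i] + t * ys[i + 1]
    raise ValueError("x outside grid")

grid = [1.0, 1.5, 2.25, 3.375, 5.0625]   # coarse geometric "energy" grid (illustrative)
vals = [1.0 / e for e in grid]           # convex, stopping-power-like curve ~ 1/E

# Relative error of linear interpolation at each segment midpoint:
errors = []
for lo, hi in zip(grid, grid[1:]):
    mid = 0.5 * (lo + hi)
    exact = 1.0 / mid
    errors.append(abs(lin_interp(grid, vals, mid) - exact) / exact)
print(max(errors))
```

On this grid the linear interpolant overshoots the convex curve by roughly 4 percent between nodes, the same order of magnitude as the libamtrack vs PSTAR/libdedx differences mentioned above.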
AT.CSDA.range.g.cm2 returns CSDA.range.cm2.g, which probably should be spelled CSDA.range.g.cm2.
I use it in R:
CSDA <- AT.CSDA.range.g.cm2(Energies,rep(0,N),rep(C12,N),M1)$CSDA.range.cm2.g
All functions and structs which are not exposed as part of the API can be safely moved from headers to *.c files.
In some files there are unclear error messages. For example, file AT_NumericalRoutines.c, lines 455-463:
fprintf(stderr,"Numerical Recipes run-time error...\n");
fprintf(stderr,"%s\n",error_text);
fprintf(stderr,"...now exiting to system...\n");
exit(1);
In file AT_PhysicsRoutines.c, in function double AT_beta_from_E_single( const double E_MeV_u ),
there is an assertion which checks that the given energy is greater than 0 (line 35). This function should also accept an energy value equal to 0.
This R code gives a result != 0:
library(libamtrack)
mom <- AT.momentum.MeV.c.u.from.E.MeV.u(150)$momentum.MeV.c
150 - AT.E.MeV.u.from.momentum.MeV.c.u(mom)$E.MeV.u
See comment here: libamtrack/web#87 (comment)
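For comparison, an exact round trip with the standard relativistic relations, pc = sqrt(T*(T + 2*m*c^2)) per nucleon with m*c^2 of about 931.494 MeV for one atomic mass unit, recovers the energy to floating-point precision; so a residual like the one above points at the implementation, not at rounding limits. A sketch (constants approximate; function names are mine, mirroring the R names, not libamtrack's API):

```python
import math

AMU_MEV = 931.494  # atomic mass unit in MeV/c^2 (approximate)

def momentum_MeV_c_u_from_E_MeV_u(T):
    # pc per nucleon from kinetic energy per nucleon: pc = sqrt(T*(T + 2m))
    return math.sqrt(T * (T + 2.0 * AMU_MEV))

def E_MeV_u_from_momentum_MeV_c_u(pc):
    # Exact inverse: T = sqrt((pc)^2 + m^2) - m
    return math.sqrt(pc * pc + AMU_MEV * AMU_MEV) - AMU_MEV

residual = 150.0 - E_MeV_u_from_momentum_MeV_c_u(momentum_MeV_c_u_from_E_MeV_u(150.0))
print(residual)
```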
It would be good to know the exact source of the stopping power data.
It is not clear where the energy binning comes from and why it differs between ICRU49 and PSTAR.
/io/library/src/AT_SuccessiveConvolutions.c: In function ‘AT_single_impact_local_dose_distrib’:
/io/library/src/AT_SuccessiveConvolutions.c:330:11: error: conflicting types for ‘j’
for (int j = 0; j < n_bins_f1; j++){
^
/io/library/src/AT_SuccessiveConvolutions.c:120:10: note: previous declaration of ‘j’ was here
long i, j;
^
/io/library/src/AT_SuccessiveConvolutions.c:330:2: error: ‘for’ loop initial declarations are only allowed in C99 mode
for (int j = 0; j < n_bins_f1; j++){
^
/io/library/src/AT_SuccessiveConvolutions.c:330:2: note: use option -std=c99 or -std=gnu99 to compile your code
/io/library/src/AT_SuccessiveConvolutions.c: In function ‘AT_n_bins_for_DSB_distribution’:
/io/library/src/AT_SuccessiveConvolutions.c:1043:4: error: ‘for’ loop initial declarations are only allowed in C99 mode
for(long i = 0; i < n_bins_f; i++){
^
/io/library/src/AT_SuccessiveConvolutions.c: In function ‘AT_get_DSB_distribution’:
/io/library/src/AT_SuccessiveConvolutions.c:1079:4: error: ‘for’ loop initial declarations are only allowed in C99 mode
for(long i = 0; i < n_bins_f; i++){
^
/io/library/src/AT_SuccessiveConvolutions.c:1091:13: error: redefinition of ‘i’
for(long i = 0; i < max_number_of_DSBs; i++){
^
/io/library/src/AT_SuccessiveConvolutions.c:1079:13: note: previous definition of ‘i’ was here
for(long i = 0; i < n_bins_f; i++){
^
/io/library/src/AT_SuccessiveConvolutions.c:1091:4: error: ‘for’ loop initial declarations are only allowed in C99 mode
for(long i = 0; i < max_number_of_DSBs; i++){
^
/io/library/src/AT_SuccessiveConvolutions.c:1094:5: error: ‘for’ loop initial declarations are only allowed in C99 mode
for(long j = 0; j < n_bins_f; j++){
^
/io/library/src/AT_SuccessiveConvolutions.c:1096:16: error: ‘NAN’ undeclared (first use in this function)
p_DSB[i] = NAN;
^
/io/library/src/AT_SuccessiveConvolutions.c:1096:16: note: each undeclared identifier is reported only once for each function it appears in
/io/library/src/AT_SuccessiveConvolutions.c: In function ‘AT_translate_dose_into_DSB_distribution’:
/io/library/src/AT_SuccessiveConvolutions.c:1173:5: error: ‘for’ loop initial declarations are only allowed in C99 mode
for(long i = 0; i < n_bins_DSB; i++){
^
Make it work with autotools; some hints here:
http://stackoverflow.com/questions/15013672/use-autotools-with-readme-md
Development within the main branches should have made this old (>5 years) branch obsolete.
Some functions are not correctly parsed:
funs+='"_AT_lambda_max_multi",'
funs+='"_AT_lambda_max_single",'
funs+='"_AT_lambda_Landau_Mode()",'
funs+='"_AT_lambda_Landau_Mean(cons",'
funs+='"_AT_lambda_Landau_FWHM_left()",'
funs+='"_AT_lambda_Landau_FWHM_right()",'
funs+='"_AT_lambda_Landau_FWHM()",'
funs+='"_AT_energy_loss_keV_Landau_FWHM(cons",'
funs+='"_AT_energy_loss_keV_Landau_Mode(cons",'
funs+='"_AT_energy_loss_from_lambda_landau_multi",'
funs+='"_AT_energy_loss_from_lambda_landau_single",'
funs+='"_AT_Landau_energy_loss_distribution",'
Maybe a more formal way of parsing the code should be considered (something like pycparser)?
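The truncated entries ("(cons") suggest the current extractor works line by line, so prototypes that wrap across lines get cut at the line break. Even before moving to a real parser such as pycparser, normalizing whitespace first fixes this class of failure. A minimal sketch (the header text is illustrative):

```python
import re

header = """
double AT_lambda_Landau_Mean( const double lambda_landau,
                              const double kappa);
double AT_lambda_Landau_Mode( void );
"""

# Collapse all whitespace (including newlines) so multi-line prototypes
# become single lines, then capture the identifier before the '('.
flat = re.sub(r"\s+", " ", header)
names = re.findall(r"\b(AT_\w+)\s*\(", flat)
print(names)
```

A proper C parser would still be more robust (macros, function pointers, comments), but the whitespace-collapsing step alone removes the "(cons" artifacts.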
python version: Python 3.6.9
pip version: pip 20.0.1 from /usr/local/lib/python3.6/dist-packages/pip (python 3.6)
Running:
pip install pyamtrack
gives:
ERROR: Could not find a version that satisfies the requirement pyamtrack (from versions: none)
ERROR: No matching distribution found for pyamtrack
cBinder project https://github.com/Tetrite/cBinder creates bindings in one of two available modes:
cBinder was tested on the libamtrack library; compilation on Linux was successful.
On Windows, however, it was not.
cBinder performs compilation through CFFI, which on Windows uses MSVC.
The following problems were encountered:
After fixing these problems compilation became possible: once the code fragments causing the errors were deleted, cBinder ran successfully.
Similar to AT_Vavilov_energy_loss_distribution
In https://arxiv.org/pdf/1410.1378.pdf one can find a comparison of stopping power calculated from the Bethe formula and from PSTAR:
It looks a bit different from what is now shown in the web interface:
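The Bethe side of such a comparison is easy to reproduce. A textbook Bethe stopping-power formula for protons in water (PDG constants; I of about 75 eV and Z/A of about 0.555 for water; no shell or density corrections, with W_max approximated by 2*m_e*c^2*beta^2*gamma^2) lands close to PSTAR's roughly 7.29 MeV cm^2/g at 100 MeV, so larger discrepancies in the plot would come from low-energy corrections or data sources rather than from the formula itself. A sketch, not libamtrack's implementation:

```python
import math

K = 0.307075       # MeV cm^2 / mol: 4*pi*N_A*r_e^2*m_e*c^2 (PDG)
ME_C2 = 0.510999   # electron rest energy, MeV
MP_C2 = 938.272    # proton rest energy, MeV

def bethe_water_MeV_cm2_g(T_MeV, Z_over_A=0.5551, I_MeV=75e-6):
    # Mass stopping power of water for protons with kinetic energy T_MeV,
    # simple Bethe form without shell/density corrections.
    gamma = 1.0 + T_MeV / MP_C2
    beta2 = 1.0 - 1.0 / gamma ** 2
    arg = 2.0 * ME_C2 * beta2 * gamma ** 2 / I_MeV
    return K * Z_over_A / beta2 * (math.log(arg) - beta2)

print(bethe_water_MeV_cm2_g(100.0))
```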
It seems that the order of the parameters (fluence, z) is wrong; see:
void AT_dose_Bortfeld_Gy_multi(const long n,
const double z_cm[],
const double fluence_cm2,
const double E_MeV_u,
const double sigma_E_MeV_u,
const long material_no,
const double eps,
double dose_Gy[]) {
long i;
for (i = 0; i < n; i++) {
dose_Gy[i] = AT_dose_Bortfeld_Gy_single(z_cm[i], E_MeV_u, fluence_cm2, sigma_E_MeV_u, material_no, eps);
}
}