

PyLHC Tools


This package provides tools for particle accelerator physics, complementing the optics measurement analysis tools of the omc3 package. It is a collection of useful scripts for the Optics Measurements and Corrections team (OMC) at CERN, with functionality for data analysis, correction calculations, simulation management and machine information extraction.

Documentation

Installing

This package is Python 3.7+ compatible, and can be installed through pip:

python -m pip install pylhc

After installing, scripts can be run with either python -m pylhc.SCRIPT --FLAG ARGUMENT or by calling the Python files directly.

For development purposes, we recommend creating a new virtual environment and installing from VCS in editable mode with all extra dependencies:

git clone https://github.com/pylhc/pylhc
cd pylhc
python -m pip install --editable ".[all]"

Note: Some scripts access functionality only available on the CERN Technical Network. To use those, you should make sure to install the relevant extra dependencies with python -m pip install "pylhc[cern]".

Functionality

  • Forced DA Analysis - Script to analyze forced DA. (forced_da_analysis.py)
  • Machine Settings Info - Prints an overview of the machine settings at a given time. (machine_settings_info.py)
  • KickGroup Information - Get information about KickGroups. (kickgroups.py)
  • BSRT Logger and BSRT Analysis - Saves data coming straight from the LHC BSRT FESA class and allows subsequent analysis. (bsrt_logger.py & bsrt_analysis.py)
  • BPM Calibration Factors - Compute the BPM calibration factors using ballistic optics. Two methods are available: using the beta function and using the dispersion. (bpm_calibration.py)

License

This project is licensed under the MIT License - see the LICENSE file for details.

Contributors

fsoubelet, joschd, mihofer


Issues

modify submit option to use a Jobs.tfs file

Currently, the set of jobs is always the product of all replace_dict parameters.
Alternatively, a Jobs.tfs file could be provided to specify which parameters should go together; the user is responsible for preparing such a file.
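As a hypothetical sketch of such a user-prepared file (the column names and values are illustrative, not from the actual submitter), each row would define one job, with columns matching the replace_dict keys. The TFS structure is written out with plain file I/O here so it stays visible:

```python
# Hypothetical Jobs.tfs sketch: one row per job, columns match replace_dict keys.
rows = [
    {"ENERGY": 6500, "TUNE_X": 62.31},
    {"ENERGY": 6500, "TUNE_X": 62.28},
    {"ENERGY": 7000, "TUNE_X": 62.31},
]
with open("Jobs.tfs", "w") as f:
    f.write("* ENERGY TUNE_X\n")   # TFS column names
    f.write("$ %d %le\n")          # TFS column types: integer, float
    for r in rows:
        f.write(f"{r['ENERGY']} {r['TUNE_X']}\n")
```

Only the three listed combinations would be run, instead of the full 2x2 product of the parameter values.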

allow for multiple input masks

Currently, only one mask file is supported.
As Roderik pointed out in the ABP CP, multiple input masks could be useful in some cases.

Autosix: Wrong calculation of Standard Deviation

Currently the standard deviation (std) over all seeds is calculated, like the min, max and mean, by taking the std of the per-seed stds, but it should instead be calculated over all individual instances.
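A minimal illustration of why the two disagree, with made-up values (two seeds, three instances each):

```python
from statistics import pstdev

# Two seeds with identical spread but different means.
seeds = [[1.0, 2.0, 3.0], [4.0, 5.0, 6.0]]

# Buggy: std of the per-seed stds. Both seeds have the same std,
# so the std of those two equal numbers is exactly zero.
wrong = pstdev([pstdev(s) for s in seeds])

# Correct: std over all six individual instances.
right = pstdev([x for s in seeds for x in s])

print(wrong, right)  # 0.0 vs ~1.708
```

Averaging (or taking the std of) per-seed stds discards the seed-to-seed variation of the means, which is exactly what the overall std is supposed to capture.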

Towards 1.0.0

Change

if njobs > HTCONDOR_JOBLIMIT:

to

if njobs > HTCONDOR_JOBLIMIT and not run_local and not dryrun:

as this limit only applies when submitting to HTCondor.
A warning should be issued instead if running locally or doing a dry run.
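A hedged sketch of the proposed behaviour; the names (njobs, run_local, dryrun, HTCONDOR_JOBLIMIT) follow the issue text, while the limit value and the warn/raise calls are illustrative:

```python
import logging

HTCONDOR_JOBLIMIT = 100_000  # illustrative value, not the actual constant

def check_job_limit(njobs: int, run_local: bool, dryrun: bool) -> None:
    """Reject oversized submissions, but only when actually submitting to HTCondor."""
    if njobs > HTCONDOR_JOBLIMIT:
        if run_local or dryrun:
            # The limit does not apply: no HTCondor submission will happen.
            logging.warning(
                "njobs exceeds the HTCondor job limit, "
                "but jobs run locally / dry run only."
            )
        else:
            raise AttributeError("Submitting too many jobs to HTCondor.")
```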

Cosmetic changes in the documentation

remove nr of jobs warning

Remove the max number of jobs warning and instead use the max_materialize option in the .sub file.
If more than X jobs are submitted, the user is informed that those will materialize at a slower pace so as not to kill the scheduler.
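For reference, max_materialize is an HTCondor submit command that throttles how many jobs of a cluster are materialized (instantiated in the queue) at once. A sketch of how it could appear in the generated queuehtc.sub (the executable name and the value 500 are illustrative):

```
# fragment of a hypothetical queuehtc.sub
executable      = Job.sh
max_materialize = 500
queue
```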

Fix write config

Currently, if the package is installed, the input parameters are saved as a config file in the lib/pylhc-submitter directory, whereas they should be saved in the working_directory.

add masked commandline option

Currently, only a mask file can be used for submission.
An alternative would be to allow a masked command line such as --commandparam1=%(param1)s --commandparam2=%(param2)s.
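Since the mask files already use Python %(name)s placeholders, the same expansion would work on a command-line string. A small sketch with illustrative parameter names:

```python
# The masked command line uses the same %(name)s placeholders as mask files.
mask = "--commandparam1=%(param1)s --commandparam2=%(param2)s"

# One job's entry from the replace_dict (values are made up).
job = {"param1": 6500, "param2": "flat_top"}

# Python's %-formatting with a mapping fills every placeholder in one pass.
command = mask % job
print(command)  # --commandparam1=6500 --commandparam2=flat_top
```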

make .sub file name an input

Currently, the sub file is always called queuehtc.sub.
In DAGMan workflows, one might want multiple .sub files created in the same directory, which is currently not possible.

add option to modify Job.sh

Currently, Job.sh files have to be created and then modified by hand afterwards to achieve more complicated behaviour.
An optional input could be a dict with 'pre' and 'post' keys that take any string and insert it before/after the execution.
Advantage: the mkdir Outputdata part could be added there too.
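A hypothetical sketch of such a hook mechanism (the function name, the madx execution line and the tar command are illustrative, not existing submitter code):

```python
from typing import Optional

def build_job_script(exec_line: str, hooks: Optional[dict] = None) -> str:
    """Assemble a Job.sh, inserting optional 'pre'/'post' strings around execution."""
    hooks = hooks or {}
    lines = ["#!/bin/bash"]
    if "pre" in hooks:
        lines.append(hooks["pre"])      # e.g. "mkdir Outputdata"
    lines.append(exec_line)
    if "post" in hooks:
        lines.append(hooks["post"])     # e.g. packing up the results
    return "\n".join(lines)

script = build_job_script(
    "madx job.madx",
    hooks={"pre": "mkdir Outputdata", "post": "tar czf out.tgz Outputdata"},
)
```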

[Request] Have Job Submitter create the submit file when given --dryrun

Currently, when given --dryrun, the job_submitter will create the proper folders and bash scripts, output a summary and exit. It would be nice to also create the submission file before giving the summary and exiting, without submitting to HTCondor of course.

It should be a simple change, maybe incorporating the following lines in the main function?

subfile = htcutils.make_subfile(
    cwd, job_df, output_dir=output_dir, duration=flavour, **additional_htc_arguments
)

Those lines are currently only in the _run_htc function.

The reason I'd like this is that while job_submitter makes life easy, it does not provide ways to do some of the wacky complicated things we can see in the batch docs. Going for a dry run and changing / adding things to the submit file would be an easy way to toy around. @JoschD what do you think?
