README

This is an attempt to create an open-source, MIT-license-friendly contest runner for HSPC-style and similar coding contests.

This project was born at RubyforGood 2017.

Submission Runners

A SubmissionRunner is used to compile and run code in the language of choice in order to score a Submission. SubmissionRunners::Base handles the logic for building the docker containers used to compile and run the submission. Each language's runner should inherit from Base and provide methods declaring the docker image, how to compile the language, and how to run it.
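
For illustration, a compiled-language runner might look like the sketch below. It uses the docker_run and source_file helpers described in the following sections; the `image` method name and the openjdk image tag are assumptions for the example, not the project's actual conventions.

```ruby
module SubmissionRunners
  # Hypothetical Java runner. The `image` method name and image tag are
  # assumptions; the README only says runners declare an image, a build
  # step, and a run step.
  class Java < Base
    def image
      "openjdk:8"
    end

    # Compile the submitted source inside the container.
    def build
      docker_run("javac", source_file.to_s)
    end

    # Run the compiled class (the source file name minus its extension).
    def run
      docker_run("java", source_file.without_extension.to_s)
    end
  end
end
```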

`docker_run`

We are using TTY::Command to interface with the docker command line. SubmissionRunners::Base offers a docker_run method, which a language runner's build and run methods should use. docker_run accepts a splatted command parameter, which is passed through to the docker container as its run command.

For example, a program Foobar is compiled via

$ javac Foobar.java

Acceptable inputs to docker_run might look like:

docker_run('javac', 'Foobar.java')

After compilation, running a class Foobar is accomplished with

$ java Foobar

Acceptable inputs might look like:

docker_run('java', 'Foobar')
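
Under the hood, a docker_run built on TTY::Command might look roughly like the sketch below. The volume mount, working directory, and `image` method are illustrative assumptions, not the actual implementation.

```ruby
require "tty-command"

# A minimal sketch, assuming the runner mounts the submission directory
# into the container; the mount path and `image` method are hypothetical.
def docker_run(*command)
  cmd = TTY::Command.new
  cmd.run(
    "docker", "run",
    "--rm",                             # discard the container when done
    "-v", "#{submission_dir}:/workdir", # hypothetical submission mount
    "-w", "/workdir",                   # run the command from the mount
    image,                              # the runner's declared docker image
    *command                            # e.g. "javac", "Foobar.java"
  )
end
```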

`source_file`

The #source_file method returns a SubmissionRunners::SourceFile instance (a Pathname subclass) bound to the basename of the participant-submitted source code file for this run. It responds to #without_extension, which returns an extension-less instance (helpful for compiler targets), as well as the entire Pathname interface.
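
In practice that means usage like the following (a hypothetical Java submission for illustration):

```ruby
source_file                   # => #<SubmissionRunners::SourceFile:Foobar.java>
source_file.without_extension # => #<SubmissionRunners::SourceFile:Foobar>
source_file.extname           # => ".java" (ordinary Pathname behavior)
```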

If your language doesn't need a build step (the vast majority of interpreted languages don't, for example), then you can simply omit the #build method from your runner.

Running tests

$ bundle exec rake                # all specs
$ bundle exec rake ci             # all specs + rubocop
$ bundle exec rake minus_docker   # non-docker specs
$ bundle exec rake dev_specs      # non-docker specs + rubocop

Contributors

benabik, bennacer860, bokmann, cflipse, danielpclark, mlpinit, openmailbox, wburns84, yarmiganosca

(contributors, please add your information here.)

Getting Started

  1. Install Ruby v2.4.1
  2. Install (and start) Redis
  3. mkdir -p /var/lib/milton - You must ensure that the user running the app has permissions in this directory.
  4. Install Docker: curl -fsSL https://get.docker.com/ | sh; sudo usermod -aG docker $(whoami)
  5. git clone https://github.com/rubyforgood/loudoun_codes.git
  6. cd loudoun_codes
  7. bundle install
  8. Pull the docker images: rake docker
  9. bundle exec rake db:setup
  10. bundle exec foreman start

If you are running the application for an actual competition, you probably want to use RAILS_ENV=production bundle exec foreman start in step 10.

After starting the application, you will see this line (or something similar) in your output:

14:25:13 web.1 | * Listening on tcp://0.0.0.0:5100

Browse to the provided address and you will see the web application. Browse to /sidekiq to see statistics about currently running jobs.

Future Deployment

The 'getting started' instructions above apply while this project is in active development. A near-future goal of this project is to make deployment as brain-dead simple as possible. We may, for instance, wrap this code in a minimal custom Linux distribution so it could be put on a thumb drive and take over a machine for the purpose of running the contest. We had previously considered a docker container mimicking the setup of the Jenkins docker container, but considering our use of docker containers for the submission judging, we believe the inception scenario there would hurt our heads too much.


Issues

Update README header about section

I'd like the README to have a clear depiction of what this project is about and who it is for, especially since this is a good reference for a resume.

Issues to remedy for building an Elixir runner

I don't program in Elixir, but just getting a script to run or compile should be easy… right? Well, as it turns out, there are limitations in the way we do things that don't agree with Elixir.

For just running an Elixir script we'd use iex. But iex currently has an issue with input being given too quickly (see elixir-lang/elixir#3741), so if we want to go that route we'd need to delay injecting input into standard input until it's ready.

If we want to compile and then run an executable, we have to write a custom project definition script for what Elixir calls escript. After doing that, the student team would have to write their code with the same module name as what is set up in the script, with a main method. In other words, we'd need to provide the students with a template starting script. For an excellent how-to on compiling Elixir via escript, see this post: http://ieftimov.com/writing-elixir-cli-apps

So this is doable, but some decisions need to be made on the implementation. This thread is open for discussion.

problem creation view (for admins)

Admins adding a new problem is probably our most complex view. We need to know if there are multiple input/output pairs, and we'll probably need some JS support if multiple pairs are to be supported (see the model sketch after the list below).

Things we do know:

  • problem name field
  • file upload for solution source file
  • file upload for printable assets
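
A rough sketch of how the underlying models could support multiple input/output pairs, using hypothetical Problem and TestCase model names and Rails' nested attributes (the JS support mentioned above would add and remove the nested fields):

```ruby
# Hypothetical models -- names and columns are assumptions for discussion.
class Problem < ApplicationRecord
  has_many :test_cases
  accepts_nested_attributes_for :test_cases, allow_destroy: true
end

class TestCase < ApplicationRecord
  belongs_to :problem
  # assumed columns: input (text), expected_output (text)
end
```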

Standard Problem for Testing Runners

Implementing new runners (especially writing the specs) would be easier if there were a standard test problem. I like the pig latin one Daniel's got in his tests, because it's dead simple to implement solutions for.
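
For reference, a pig latin solution fits in a few lines of Ruby. The exact transformation rules below (vowel-initial words get "ay"; otherwise the leading consonant cluster moves to the end, followed by "ay") are an assumption about the problem, not its actual spec:

```ruby
# Hypothetical reference solution: reads whitespace-separated words from
# STDIN and prints the pig latin form of each, one per line.
STDIN.read.split.each do |word|
  if word =~ /\A[aeiou]/i
    puts "#{word}ay"
  else
    head, tail = word.match(/\A([^aeiou]+)(.*)\z/i).captures
    puts "#{tail}#{head}ay"
  end
end
```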

CoC RFC

I see CoC 1.3.0 has been approved. If I may add, I really like the direction Matz went with the CoC discussion in choosing the PostgreSQL CoC (The Ruby Community Conduct Guideline), which has one important statement this CoC doesn't include:

When interpreting the words and actions of others, participants should always assume good intentions.

The current CoC here doesn't reflect "assuming the best" or give any statement as to trying to make amends. This CoC strictly focuses on negative aspects and enforcement. So whereas I'm okay with the policy itself, I don't feel it properly reflects a positive ideology for keeping the peace.

Distribution & Deployment

How do we actually want to go about:

  1. Making this available to people?
  2. Deploying it to actual systems?

Edge case: some false positives w/ Ruby/Python invalid code

Correct code, whether its results are good or bad, is evaluated just fine by the submission runner.

When the code to be executed is asdf, only comparing the output file to stdout will tell you that it failed; everything else says it passes. When I experiment on my computer the exit status is a non-zero number, but it's not coming back that way from the docker run.
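
One way to dig into this, assuming docker_run wraps TTY::Command: use run! (which, unlike run, does not raise on a non-zero exit) and inspect the result's exit status explicitly. Whether Base currently surfaces this status is part of the bug in question.

```ruby
require "tty-command"

cmd = TTY::Command.new
# run! does not raise on a non-zero exit, so we can inspect the status.
result = cmd.run!("docker", "run", "--rm", "ruby:2.4", "ruby", "-e", "asdf")
result.exit_status # expected to be non-zero when the program errors
result.failure?    # true if docker propagated the container's failure
```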

Permission to self assign issues

In case you weren't aware, I was joking when I asked about merging directly to master. Have no worries about me doing any such thing. I would, however, appreciate higher repo privileges.

Modular Output Comparisons

The job output comparison needs to be made modular. There are five options PCSquared supports. We should figure out what those are, and then implement them.
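
One possible shape for that modularity, with hypothetical comparator names (the actual five PCSquared modes still need to be enumerated):

```ruby
# Hypothetical strategy objects -- names are placeholders, not
# PCSquared's actual comparison modes.
module OutputComparators
  class Exact
    def match?(expected, actual)
      expected == actual
    end
  end

  class IgnoreTrailingWhitespace
    def match?(expected, actual)
      normalize = ->(s) { s.lines.map(&:rstrip).join("\n").rstrip }
      normalize.(expected) == normalize.(actual)
    end
  end
end

# A job could then look up its comparator by a per-problem setting:
COMPARATORS = {
  "exact"                      => OutputComparators::Exact.new,
  "ignore_trailing_whitespace" => OutputComparators::IgnoreTrailingWhitespace.new,
}
```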

Merge additional SubmissionRunners

The code works, it's clean, and any refactoring that needs to be done may affect multiple runners. So if the code base starts getting modified in refactors now, the merge-master madness will start all over again.

Move lib/submission_runners -> app/runners

I don't think there's a lot of benefit to keeping them out of app/, and putting them there gives us some minor advantages:

  1. less typing
  2. rake tasks already have them loaded
  3. more Railsy

Remove extraneous machinery left over from OmniBuilder

@danielpclark

None of Docker::EntryFile, Docker::InputFile, Docker::OutputFile, Docker::ResultFile, or Docker::IOFile is necessary. We don't need to build and/or run in a temp directory; building in the submission directory and persisting any compilation artifacts provides more of an audit trail, should that be necessary. Everything else these classes provide is redundant given that SubmissionRunner::Base#source_file returns a Pathname object. This is borne out by the fact that the Java runner, which uses none of this machinery, works.

Support::InterpretativeLanguage::MockResult is a good solution to the design flaw I left in SubmissionRunner::Base, and it should be moved into the runner namespace.

Support::InterpretiveLanguage itself strikes me as too much DRY too quickly. We've only had this domain model in our heads for a weekend. I think it's too early to completely remove the docker_run invocation from Runners. It might be a good idea down the road, but I don't think extracting it from only a certain set of Runners is the right move. There's probably a more comprehensive abstraction that lets us pull that invocation out of all the Runners, but we probably will be less likely to see it if we provide an abstraction to a subset now.

Add false input test per submission runner

If a student team knows the answer, they can simply print it to standard output. We should add a garbage-input test to make sure they don't give the originally expected answer in the output when the wrong input has been given.
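
A sketch of what such a spec might look like; the run_submission helper and the fixture names are placeholders, not existing spec support code:

```ruby
# Hypothetical spec -- run_submission and the fixtures are placeholders.
RSpec.describe "garbage input guard" do
  it "does not emit the expected answer when fed the wrong input" do
    output = run_submission(source: "hardcoded_answer.rb", input: "garbage")
    expect(output).not_to eq(expected_output_for("real_input"))
  end
end
```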

We should (configurably?) block network access to containers

Participant code could potentially access the internet from running containers if the host has a connection. We should block containers from using the host's internet connection, and potentially make this configurable per contest.
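
Docker supports this directly via the --network flag. If Base assembles the docker run invocation as sketched earlier, the change could be as small as the lines below; the per-contest toggle is hypothetical:

```ruby
# Run the container with networking disabled. `contest.network_allowed?`
# is a hypothetical per-contest setting.
network_args = contest.network_allowed? ? [] : ["--network", "none"]
cmd.run("docker", "run", "--rm", *network_args, image, *command)
```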
