Dispersed-Computing-Profiler (DRUPE)
====================================

DRUPE is a tool that collects information about the computational resources of compute nodes in a dispersed computing system, as well as the network links between them, and reports it to a central node. DRUPE consists of a network profiler and a resource profiler.

NETWORK PROFILER


  1. Description: automatically schedules and logs communication information for all links between nodes in the network, and computes the quadratic regression parameters of each link, which represent the corresponding communication cost. The quadratic function describes how file transfer time depends on file size (based on our empirical finding that a quadratic function is a good fit).
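
  As an illustration of the quadratic model, the sketch below (not part of DRUPE; the measurement values are made up) fits transfer time as a quadratic function of file size with NumPy, which is the kind of per-link fit the network profiler stores:

      import numpy as np

      # Hypothetical measurements: file sizes (KB) and observed transfer times (s).
      file_sizes = np.array([1.0, 10.0, 100.0, 1000.0, 10000.0])
      transfer_times = np.array([0.02, 0.05, 0.31, 2.9, 31.0])

      # Fit transfer_time ~ a*size^2 + b*size + c; DRUPE stores such (a, b, c)
      # parameters for each link as its communication-cost model.
      a, b, c = np.polyfit(file_sizes, transfer_times, deg=2)

      # Predict the transfer time for an unseen file size with the fitted model.
      predicted = np.polyval([a, b, c], 500.0)
      print(f"a={a:.3e}, b={b:.3e}, c={c:.3e}, predicted t(500 KB)={predicted:.2f} s")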

  2. Input

  • File central.txt stores the credential information of the central node:

      CENTRAL IP USERNAME PASSWORD
      IP0 USERNAME PASSWORD
  • File nodes.txt stores the credential information of the nodes:

      TAG NODE (username@IP) REGION
      node1 username@IP1 LOC1
      node2 username@IP2 LOC2
      node3 username@IP3 LOC3
  • File link_list.txt stores the links between nodes for which communication is to be logged:

      SOURCE(TAG) DESTINATION(TAG)
      node1 node2
      node1 node3
      node2 node1
      node2 node3
      node3 node1
      node3 node2
  • File generate_link_list.py is used to generate link_list.txt (all combinations of links) from the node list in nodes.txt; alternatively, users can write link_list.txt by hand.
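
  A minimal sketch of what generate_link_list.py does, assuming the nodes.txt and link_list.txt formats shown above (the actual script in the repository may differ):

      import itertools

      # Read node tags from nodes.txt, skipping the header line.
      with open("nodes.txt") as f:
          tags = [line.split()[0] for line in f.readlines()[1:] if line.strip()]

      # Write every ordered pair of distinct nodes, matching the link_list.txt example.
      with open("link_list.txt", "w") as f:
          f.write("SOURCE(TAG) DESTINATION(TAG)\n")
          for src, dst in itertools.permutations(tags, 2):
              f.write(f"{src} {dst}\n")
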
  3. Output: all quadratic regression parameters are stored in the local MongoDB server on the central node.
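
  For orientation, the stored parameters can be read back on the central node with pymongo; the database and collection names in this sketch are placeholders, since the README does not specify DRUPE's actual schema:

      from pymongo import MongoClient

      # Connect to the local MongoDB server running on the central node.
      client = MongoClient("mongodb://localhost:27017/")

      # "network_profiler" and "quadratic_parameters" are hypothetical names used
      # only for illustration; substitute the names DRUPE actually creates.
      for doc in client["network_profiler"]["quadratic_parameters"].find():
          print(doc)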

  4. Userguide (Non-dockerized version):

    • At the central network profiler:

      • Run the command ./central init to install the required libraries.
      • Inside the folder central_input, add information about the nodes and the links.
      • Run python3 central_scheduler.py to generate the scheduling files for each node, prepare the central database and collection, copy the scheduling information and network scripts to each node in the node list, and schedule an update of the central database every 10 minutes.
    • At the droplets:

      • The central network profiler copies all required scheduling files and network scripts to the online_profiler folder on each droplet.
      • Run the command ./droplet init to install the required libraries.
      • Run python3 automate_droplet.py to generate files of different sizes in preparation for the logging measurements, create the droplet database, schedule a logging measurement every minute, and schedule the regression every 10 minutes. (These parameters can be changed as needed; a sketch of the file-generation step is shown below.)
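
    As a rough illustration of the file-generation step mentioned above, the sketch below creates dummy files of increasing size that a droplet could send over each link to measure transfer times (file names and sizes are illustrative, not DRUPE's actual choices):

        import os

        # Sizes are illustrative; DRUPE's actual set of test-file sizes may differ.
        sizes_kb = [1, 10, 100, 1000, 10000]
        os.makedirs("generated_test_files", exist_ok=True)

        # Fill each file with random bytes so that compression along the path
        # does not skew the measured transfer time.
        for size in sizes_kb:
            path = os.path.join("generated_test_files", f"test_{size}KB.txt")
            with open(path, "wb") as f:
                f.write(os.urandom(size * 1024))
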
  5. Userguide (Dockerized version):

    • At the docker_online_profiler folder:

      • Modify the input files in the central_input folder (nodes.txt, link_list.txt) of central_network_profiler and update upload_docker_network accordingly (IP, PASSWORD, REG, link_list).
      • Run ./upload_docker_network to upload the code to all the nodes and the central node.
      • Example run: the scheduler (central node) is at IP0 and the droplets are at IP1, IP2, and IP3.
    • At the droplets, inside the droplet_network_profiler:

      • Build the Docker image: docker build -t droplet_network_profiler .
      • Run the containers:

      docker run --rm --name droplet_network_profiler -t -i -e DOCKER_HOST=IP1 -p 5100:22 -P droplet_network_profiler

      docker run --rm --name droplet_network_profiler -t -i -e DOCKER_HOST=IP2 -p 5100:22 -P droplet_network_profiler

      docker run --rm --name droplet_network_profiler -t -i -e DOCKER_HOST=IP3 -p 5100:22 -P droplet_network_profiler

    • At the central network profiler (IP0):

      • Build the Docker image: docker build -t central_network_profiler .
      • Run the container:

      docker run --rm --name central_network_profiler -i -t -e DOCKER_HOST=IP0 -p 5100:22 -P central_network_profiler

RESOURCE PROFILER


  1. Introduction: The resource profiler collects system utilization from node 1, node 2, and node 3. This information is then sent to the home node and stored in MongoDB.

The information includes the IP address of each node, the CPU utilization of each node, the memory utilization of each node, and the latest update time.
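
  A rough sketch of the kind of record the resource profiler could collect and store, using psutil for utilization and pymongo for storage; the field, database, and collection names here are illustrative, not DRUPE's actual schema:

      import socket
      import time

      import psutil
      from pymongo import MongoClient

      # Collect this node's IP address, CPU and memory utilization, and a timestamp.
      record = {
          "ip": socket.gethostbyname(socket.gethostname()),
          "cpu_percent": psutil.cpu_percent(interval=1),
          "memory_percent": psutil.virtual_memory().percent,
          "last_update": time.strftime("%Y-%m-%d %H:%M:%S"),
      }

      # Send the record to MongoDB on the home node; the host and names are placeholders.
      client = MongoClient("mongodb://HOME_NODE_IP:27017/")
      client["resource_profiler"]["utilization"].insert_one(record)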

  2. User Guide: For details, please check the README file inside each folder.

Acknowledgement

This material is based upon work supported by the Defense Advanced Research Projects Agency (DARPA) under Contract No. HR001117C0053. Any views, opinions, and/or findings expressed are those of the author(s) and should not be interpreted as representing the official views or policies of the Department of Defense or the U.S. Government.

