tmolteno / tart

Transient Array Radio Telescope

Home Page: https://tart.elec.ac.nz

License: GNU Lesser General Public License v3.0

Topics: astronomy, radio-astronomy, interferometry, telescope, fpga, hardware

tart's Introduction

Transient Array Radio Telescope


The Transient Array Radio Telescope (TART) is a low-cost 24-element aperture synthesis array developed by the Elec Research Group at the University of Otago.

All-sky image from a TART telescope

The TART software and hardware designs are released as open-source designs and are licensed under the GPL v3.

The authors of the first version of the TART are Tim Molteno, Charles Shaw, Max Scheel and Phil Brown. Enquiries about the TART should be addressed to Tim Molteno ([email protected]).

Connect to our development radio telescope here

Hardware

The TART hardware is designed using a combination of KiCAD (newest boards) and CadSoft's Eagle package. These designs are located in the hardware folder.

Firmware

The TART uses an FPGA to synchronize data from each receiver. The Verilog code for this is in the hardware/FPGA directory.

Installation

See the Installation Instructions file for more detail on installation.

Telescope API documentation

The Telescope API documentation describes how the TART is controlled via the network for configuration and data retrieval.

tart's People

Contributors

19173296, dependabot[bot], humanrikus, maxscheel, maxscheel-sr, milletf1, psuggate, shach353, sharvey71, tmolteno


tart's Issues

Antenna positions not uploaded correctly to (SA) TART

As part of the S.A. TART calibration process, I measured the antenna positions. This was done according to the Jupyter notebook listed in https://github.com/tmolteno/TART/tree/master/doc/calibration/positions.

I replaced all the measurements with our own. One thing I did change, however, was the antenna positions, which are entered into the following array (each row being the distances from an antenna to the reference poles x0, x1 and x2).

m[0,:] = [1563, 855, 2618]

I assumed the above example was incorrect, as columns 1 and 2 are interchanged, i.e. it should be:

m[0,:] = [855, 1563, 2618]

After entering our data, I generated the attached JSON file (note: I merely renamed it to .txt for the upload to GitHub): tart_positions_2019-02-14.txt

I then submitted the data to the TART, as follows:

/usr/local/bin/tart_upload_antenna_positions --api http://146.232.222.105 --pw changethispw --file tart_positions_2019-02-14.json

and received the response:

Response: {}

I then downloaded the antenna gains again from our TART (in order to verify that they are properly set), but they did not correspond to what I uploaded (see the attached file): antenna_positions.txt

However, after putting our TART in Diagnose mode, the status map of the antennas changed completely:

[screenshot of the antenna status map]

This is not correct.

I am probably doing something wrong here. Should I also add the antenna positions as in the example, i.e. [distance from antenna to x1, ..x0, x2 ] ?
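
For what it's worth, a minimal trilateration sketch (with hypothetical pole coordinates, since the actual reference positions from the notebook are not shown here) illustrates why the column order of the distance matrix matters:

# Hypothetical sketch: recover an antenna's (east, north) position from its
# measured distances to three reference poles x0, x1, x2.  The pole
# coordinates below are placeholders, not the real TART survey values.
import numpy as np

poles = np.array([[0.0, 0.0],      # x0 (east, north) in mm -- hypothetical
                  [1500.0, 0.0],   # x1 -- hypothetical
                  [0.0, 2000.0]])  # x2 -- hypothetical

def trilaterate(d):
    # d = [d0, d1, d2]: distances from the antenna to poles x0, x1, x2.
    p0, d0 = poles[0], d[0]
    # Subtracting the x0 sphere equation from the others gives a linear system.
    A = 2.0 * (poles[1:] - p0)
    b = d0**2 - np.asarray(d[1:])**2 + np.sum(poles[1:]**2, axis=1) - np.sum(p0**2)
    pos, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pos  # (east, north) in the pole coordinate frame

# Swapping the first two columns (x0 <-> x1) gives a different position,
# which is why the ordering in m[0,:] matters.
print(trilaterate([1563.0, 855.0, 2618.0]))
print(trilaterate([855.0, 1563.0, 2618.0]))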

Unable to load historic *.pkl files

I am trying to convert the attached *.pkl file that was generated with the Stellenbosch University TART, but I am unable to do so using the latest tart package (version 0.15.5).

I am running the following Python code and passing it the attached *.pkl file:

# ==================================================================================================
# Danie Ludick (2020) 
# [email protected]
#
# Description:
#    Converts TART raw data (in *.pkl format) to a MATLAB *.mat data object
#    Note, this assumes that you have installed the tart package: "sudo pip3 install tart"

import scipy.io
import pickle
from tart.operation import observation
import argparse

# =================================================================================================
# Main driver
# =================================================================================================
if __name__ == '__main__':

  PARSER = argparse.ArgumentParser(description='Convert raw data from the TART telescope to MATLAB compatible format')
  PARSER.add_argument('--file', required=True, help="The raw data *.pkl file ")
  
  ARGS = PARSER.parse_args()

  print("Convert tart raw data v1.0 from 2020-03-29")
  print("")
  
  # Load the Observation file (i.e. the PKL file)
  obs = observation.Observation_Load(ARGS.file)

  # Extract information from the *.pkl file
  print("  Sampling time-stamp   : %s " % str(obs.timestamp))
  print("  Number of Samples     : %d " % len(obs.data[ANTENNA_INDEX]))
  print("  Number of Antenna     : %d " % obs.config.get_num_antenna())
  fs =  obs.get_sampling_rate()
  print("  Sampling rate [Hz]    : %d " % fs)

  scipy.io.savemat('tart_matlab_data', mdict={'version':1, \
                                              'timestamp': str(obs.timestamp), \
                                              'sampling_rate': fs, \
                                              'number_of_antennas': obs.config.get_num_antenna(), \
                                              'antenna_channels_raw_data':obs.data})

I get the following error:

djludick@forecasts:~/scratch-dl/pkl_reader$ python3 convert_pkl_to_mat.py --file tartza_data.pkl
Convert tart raw data v1.0 from 2020-03-29

not gzipped
Traceback (most recent call last):
File "/usr/local/lib/python3.6/dist-packages/tart/operation/observation.py", line 83, in Observation_Load
d = pickle.load(load_data)
UnicodeDecodeError: 'ascii' codec can't decode byte 0xe4 in position 1: ordinal not in range(128)

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
File "convert_pkl_to_mat.py", line 28, in
obs = observation.Observation_Load(ARGS.file)
File "/usr/local/lib/python3.6/dist-packages/tart/operation/observation.py", line 87, in Observation_Load
d = pickle.load(load_data)
_pickle.UnpicklingError: invalid load key, '\x1f'.
djludick@forecasts:~/scratch-dl/pkl_reader$
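
For historic Python 2 pickles, a common workaround (a sketch only, not a confirmed fix for this particular file) is to open the file with gzip when it carries the gzip magic bytes and to pass encoding='latin1' to pickle.load:

# Sketch: load a Python-2-era, possibly gzipped, pickle under Python 3.
# This bypasses observation.Observation_Load and returns the raw pickled
# object; whether that object is directly usable depends on the file.
import gzip
import pickle

def load_legacy_pkl(path):
    with open(path, 'rb') as f:
        magic = f.read(2)
    # 0x1f 0x8b is the gzip magic number (note the "invalid load key, '\x1f'" error).
    opener = gzip.open if magic == b'\x1f\x8b' else open
    with opener(path, 'rb') as f:
        # latin1 lets Python 3 decode byte strings written by Python 2 pickles.
        return pickle.load(f, encoding='latin1')

d = load_legacy_pkl('tartza_data.pkl')
print(type(d))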

Status map - Component

Draw a projected (x, y) map of the telescope:

  • At each antenna position given in [east, north, up], draw a circle containing the antenna id [0-23].
  • Colour in standard Bootstrap alert colours based on antenna status.
  • Clicking on an antenna will show additional status information about that antenna.

The component will initially be hosted in diagnose mode.
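
A rough Python/matplotlib analogue of the intended component (antenna IDs, positions, radii and colour values below are illustrative, not the web app's actual data):

# Illustrative sketch of the status map: project antenna positions onto the
# (east, north) plane and draw a labelled, colour-coded circle per antenna.
import matplotlib.pyplot as plt

# Hypothetical positions [east, north, up] in metres and per-antenna status.
positions = [[0.0, 0.0, 0.0], [1.2, 0.5, 0.0], [-0.8, 1.5, 0.0]]
status = ['ok', 'warning', 'error']
colours = {'ok': '#28a745', 'warning': '#ffc107', 'error': '#dc3545'}  # Bootstrap-like

fig, ax = plt.subplots()
for ant_id, ((east, north, _up), s) in enumerate(zip(positions, status)):
    ax.add_patch(plt.Circle((east, north), 0.2, color=colours[s]))
    ax.text(east, north, str(ant_id), ha='center', va='center', color='white')

ax.set_xlabel('East [m]')
ax.set_ylabel('North [m]')
ax.set_xlim(-2, 2)
ax.set_ylim(-1, 2.5)
ax.set_aspect('equal')
plt.show()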

Image time is incorrect

The image time appears to be 12 hours early. At 09:14 on Monday 10 July 2017 NZST it displays:

Recorded: Sun Jul 09 2017 21:14:59 GMT+1200 (NZST).

Correct behaviour is for the API to return UTC, and the browser to display in local time.
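
A small sketch of the intended behaviour, assuming the timestamp is produced in Python on the server side: emit timezone-aware UTC in ISO 8601 and leave the conversion to local time to the client.

# Sketch: serve UTC timestamps so the browser can localise them itself.
from datetime import datetime, timezone

utc_now = datetime.now(timezone.utc)
# ISO 8601 with an explicit offset, e.g. '2017-07-09T09:14:59+00:00'.
print(utc_now.isoformat())

# Converting to local time is then the client's job, e.g. in Python:
print(utc_now.astimezone())  # uses the local timezone of the machine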

Add Visibility Monitor Component

  • Show in vis data mode
  • 'Visibility Monitor' Component:
    select the i-th antenna and the j-th antenna (perhaps via dropdowns);
    only allow j to be selected after i has been selected, and require j > i
  1. Display current magnitude and phase
  2. Plot Magnitude[1] in log10() and Phase [rad] vs. time

Reuse or share the vis data that is already requested by the imaging component.

misc:
mag = Math.sqrt(Math.pow(re,2) + Math.pow(im,2))
phase = Math.atan2(im, re)

Nice to have:
Set the time range for display, showing the last N (e.g. 10-500) magnitudes and phases.
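
A Python sketch of the magnitude/phase computation above (equivalent to the JavaScript snippets in the misc section):

# Sketch: convert a complex visibility (re, im) into the quantities the
# Visibility Monitor would plot: log10 magnitude and phase in radians.
import math

def vis_mag_phase(re, im):
    mag = math.hypot(re, im)      # sqrt(re^2 + im^2)
    phase = math.atan2(im, re)    # radians, in (-pi, pi]
    return mag, phase

mag, phase = vis_mag_phase(0.012, -0.005)
print(math.log10(mag), phase)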

SPI speed limitation to 7.8 MHz

Papilio Pro <-> RPi SPI speed is limited to 7.8 MHz.
Faster speeds cause issues because the Papilio latches to the SCLK provided by the RPi, and its 96 MHz clock is too slow to latch on both rising and falling edges. The SPI client needs a makeover.

get visibilities API request

The response should have a timestamp key and a data key containing a list of {'i': ant_id, 'j': ant_id, 're': vis.real, 'im': vis.imag} objects.
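
A sketch of a payload matching that shape, and of reading it back (the field names follow the description above; the endpoint URL itself is not shown here):

# Sketch: build and consume the proposed visibility payload.
import json

payload = {
    "timestamp": "2017-07-09T09:14:59+00:00",
    "data": [
        {"i": 0, "j": 1, "re": 0.012, "im": -0.005},
        {"i": 0, "j": 2, "re": -0.003, "im": 0.021},
    ],
}

doc = json.loads(json.dumps(payload))
for v in doc["data"]:
    vis = complex(v["re"], v["im"])
    print(v["i"], v["j"], abs(vis))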

Antenna Beam Modeling

This is an important part of enhanced calibration as well as new inference algorithms. This issue is a placeholder. Andre Offringa suggests EveryBeam as a package that can model beam patterns. MeerKAT uses FITS images; however, Andre says this is inefficient, as one needs a different image for each antenna, each frequency and each elevation. By expanding the beam in basis functions, one can interpolate in frequency space to calculate beams at different frequencies (see the sketch after the checklist below).

  • Choose a technique (FITS or Basis functions). Tim prefers the latter.
  • Be able to generate a FITS image from the basis functions, so that other packages can be used for imaging.
  • Allow modeling to determine beam packages (e.g. from Numerical EM packages) so reading output from FEKO might be useful.
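
As a rough illustration of the basis-function idea (the basis choice and coefficients below are made up for the example, not a proposed TART beam model):

# Sketch: represent a beam pattern as coefficients over a fixed set of basis
# functions (Legendre polynomials in cos(zenith angle) here), store the
# coefficients at a few sample frequencies, and interpolate them to evaluate
# the beam at an intermediate frequency.
import numpy as np
from numpy.polynomial.legendre import legval

freqs_mhz = np.array([1500.0, 1550.0, 1600.0])   # sample frequencies (made up)
coeffs = np.array([[1.00, -0.30, 0.05],          # basis coefficients per frequency
                   [0.95, -0.35, 0.07],          # (made up for illustration)
                   [0.90, -0.40, 0.10]])

def beam(theta_rad, f_mhz):
    # Interpolate each coefficient across frequency, then evaluate the expansion.
    c = np.array([np.interp(f_mhz, freqs_mhz, coeffs[:, k])
                  for k in range(coeffs.shape[1])])
    return legval(np.cos(theta_rad), c)

theta = np.linspace(0, np.pi / 2, 5)
print(beam(theta, 1575.42))   # beam values at an intermediate frequency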

Extracting individual antenna signals

Good day,

I would like to extract the individual (time-sampled) signals from each of the antennas for our setup in SA. I am logging in to my Raspberry Pi and I execute:

/git/TART/software/python/tart/tart/simulation $ python telescope_system.py

I am however getting the following error:

Traceback (most recent call last):
  File "telescope_system.py", line 13, in <module>
    rad = radio.Max2769B(sample_duration = 1e-5)
TypeError: __init__() got an unexpected keyword argument 'sample_duration'

The installation is also already up to date (I think). After pulling from the repo, I changed to /home/pi/git/TART/software/python/tart_web_api and issued: sudo python setup.py install

Perhaps there is another setup.py to run?

FPGA Documentation

A README.md and INSTALL.md are required in the FPGA section. This should include:

  • Instructions for programming an FPGA
  • A location (perhaps 'bin' inside the fpga directory) for prebuilt FPGA images.
  • Instructions for firmware modification via Xilinx tools.

Error in status map

On Firefox, when using the status map, I get "TypeError: i.ellipse is not a function".

Stacktrace:
.sTWA/s</e.prototype.drawAntennaPositions/<()
forEach()
.sTWA/s</e.prototype.drawAntennaPositions()
.sTWA/s</e.prototype.ngOnChanges()
.SMJ5/x</e.prototype.ngDoCheck()
.B3Ku/V</t.prototype.detectChangesInternal()

Default imaging interval should be 15

On some slow browsers (e.g. Firefox on Android) with slow connections, the 5 second refresh interval seems too fast - it takes longer than 5 s to render. Suggest 15?

TART-ZA (TART in South Africa) remains uncalibrated even after running continuous calibration script

After installing the latest object detection server and calibration software on my Ubuntu 18.04 PC, I am not having any luck with our TART system in SA.

For reference, the image produced looks as follows:

[screenshot of the produced all-sky image]

Just for reference, this calibration was run using the following:

sudo docker run --rm -d -e TART_LOGIN_PW=SAPWD -e TART_API=http://146.232.222.105 -v ~/calibration_results:/app --name=cal -it tart_calibration_server

Then I did:

docker exec -it cal bash

And when inside the bash attached to this docker machine, I executed the following:

sh /tart_calibrate.sh

The following output snippet was produced:

...
Calibration output is in /app/cal_2019_02_20_15_38_37.json
SUCCESS
Uploading new antenna positions
SUCCESS

There is a directory in my $HOME with the following content:

dludick@danie-ubuntu-desktop:~/calibration_results$ ls -als
total 28
4 drwxr-xr-x 5 dludick dludick 4096 Feb 21 07:00 .
4 drwxr-xr-x 55 dludick dludick 4096 Feb 18 09:22 ..
4 -rw-r--r-- 1 root root 3510 Feb 20 19:28 cal_2019_02_20_15_36_28.json
4 -rw-r--r-- 1 root root 3491 Feb 20 19:28 cal_2019_02_20_15_38_37.json
0 -rw-r--r-- 1 root root 0 Feb 21 07:00 calibrate.log
4 drwxr-xr-x 2 root root 4096 Feb 18 08:47 calibration_2019_02_18_06_47_55
4 drwxr-xr-x 2 root root 4096 Feb 18 08:50 calibration_2019_02_18_06_50_22
4 drwxr-xr-x 2 root root 4096 Feb 21 07:00 calibration_2019_02_21_05_00_37

I have attached now the file cal_2019_02_20_15_38_37.json (added .txt extension to get it on GitHub):

cal_2019_02_20_15_38_37.json.txt

Spectrum View

This needs to acquire raw data, which will be very slow via the API.
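
A minimal sketch of what the spectrum view would compute from one antenna's raw channel, assuming the samples and sampling rate are obtained as in the observation-loading snippet earlier in this document:

# Sketch: power spectrum of one antenna's raw sample stream.
import numpy as np

# Assumed inputs: x is a 1-D array of raw samples for one antenna
# (e.g. obs.data[0]) and fs is the sampling rate in Hz
# (e.g. obs.get_sampling_rate()).
def power_spectrum(x, fs):
    x = np.asarray(x, dtype=float)
    x = x - x.mean()                                # remove the DC offset
    spec = np.fft.rfft(x * np.hanning(len(x)))      # windowed FFT
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    power_db = 10.0 * np.log10(np.abs(spec) ** 2 + 1e-12)
    return freqs, power_db

# freqs, power_db = power_spectrum(obs.data[0], obs.get_sampling_rate())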

Save button in VIS mode

The save button in vis-data mode is currently linked to /acquire/raw/save/ but needs to be /acquire/vis/save/.

external paths still exist in web api

It currently requires a (hardcoded) folder called 24_ant_setup with:
a) telescope_config.json
b) calibrated_antenna_positions.json
This is also the path where the gains.db SQLite database is stored.

Changes need to be made in:
tart_web_api/config.py
tart_web_api/database.py
A hypothetical sketch of an environment-based config follows the steps below.

Make sure that the API can be installed and run in just two steps:

  1. sudo pip install tart_web_api
  2. LOGIN_PW=foobar FLASK_APP=tart_web_api flask run
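
One possible shape for removing the hardcoded path (a hypothetical sketch of tart_web_api/config.py, not the current code; the TART_CONFIG_DIR variable name is an assumption):

# Hypothetical sketch: read the telescope configuration directory from the
# environment instead of a hardcoded 24_ant_setup folder.
import os

CONFIG_DIR = os.environ.get('TART_CONFIG_DIR', '24_ant_setup')

TELESCOPE_CONFIG = os.path.join(CONFIG_DIR, 'telescope_config.json')
ANTENNA_POSITIONS = os.path.join(CONFIG_DIR, 'calibrated_antenna_positions.json')
GAINS_DB = os.path.join(CONFIG_DIR, 'gains.db')   # SQLite gains database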

Sanity check for raw data and visibility data

Compare visibilities calculated from raw data with visibilities from fpga.
On first examination, sequentially collected datasets are compatible; however, there currently seems to be a relationship:
vis_from_raw = -(vis_fpga)* (i.e. the negative complex conjugate)
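
A sketch of the comparison, assuming two aligned numpy arrays of complex visibilities (one computed offline from raw data, one read back from the FPGA):

# Sketch: test the observed relationship vis_from_raw == -conj(vis_fpga).
import numpy as np

def check_relationship(vis_from_raw, vis_fpga, tol=1e-3):
    vis_from_raw = np.asarray(vis_from_raw)
    vis_fpga = np.asarray(vis_fpga)
    direct = np.max(np.abs(vis_from_raw - vis_fpga))
    neg_conj = np.max(np.abs(vis_from_raw + np.conj(vis_fpga)))
    print("max |raw - fpga|       :", direct)
    print("max |raw + conj(fpga)| :", neg_conj)
    return neg_conj < tol

# Example with made-up numbers that satisfy the conjectured relation:
fpga = np.array([0.01 + 0.02j, -0.03 + 0.005j])
print(check_relationship(-np.conj(fpga), fpga))   # True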

Web-App Enhancements

Checklist of minor enhancements for the Web-GUI https://tart.elec.ac.nz

  • Add version number to bottom
  • Fix API documentation link
  • Add meta-information to the image view pane (date, number of baselines, integration time, etc.)
  • Add telescope information to the landing page (location, number of antennas, etc.)
  • Add a loading screen
  • Change title to "Radio Telescope Console" from TartApp

Animated GIF from sequence of images in vis mode

Use https://yahoo.github.io/gifshot/ to animate a sequence of telescope images. Ideally the last N images would be accumulated by the browser and displayed as a movie of the last N frames.

  • A slider for selecting N
  • Some way to store that last N Images (perhaps keeping a list of names as well as the images)
  • Some way to view the current movie.

Another library could be https://github.com/antimatter15/jsgif (but I'm unsure of how mature it is).
And then there's https://github.com/thenickdude/webm-writer-js (Chrome only ATM)

Use HDF5 for data storage

The use of python pickle to store Observation and Visibility objects is not a good long-term option. It is not robust across Python 2 and Python 3, so persistence will move to HDF5 files.

This will add a dependency to the 'tart' python package (it will depend on h5py). Some routines will need to be written to convert existing files; an illustrative sketch follows the checklist below.

  • Write new persistence methods, with tests.
  • Utilities to convert from pickle to .hdf (written in Python 2).
  • Update testbenches
  • Web app save HDF RAW files
  • Web app save HDF Vis files
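
An illustrative h5py sketch of the kind of file layout this implies (dataset and attribute names are placeholders, not the tart package's final schema):

# Illustrative sketch: persist an observation to HDF5 and read it back.
import h5py
import numpy as np

def save_observation(path, timestamp_iso, sampling_rate, data):
    with h5py.File(path, 'w') as f:
        f.attrs['version'] = 1              # checked on load for compatibility
        f.attrs['timestamp'] = timestamp_iso
        f.attrs['sampling_rate'] = sampling_rate
        f.create_dataset('raw_data', data=np.asarray(data), compression='gzip')

def load_observation(path):
    with h5py.File(path, 'r') as f:
        if f.attrs.get('version', 0) < 1:
            raise ValueError('unsupported file version')
        return dict(f.attrs), f['raw_data'][:]

# Example with made-up values (1-bit samples for 24 antennas):
save_observation('obs.h5', '2020-03-29T00:00:00+00:00', 1.0e6,
                 np.random.randint(0, 2, size=(24, 1000), dtype=np.uint8))
attrs, raw = load_observation('obs.h5')
print(attrs['timestamp'], raw.shape)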

TARTZA installation fails with docker-compose up

We are currently in the process of updating our TART in SA (i.e. TARTZA) to Python 3. After installing Raspbian v9 (stretch) on a Pi 3B+, we still cannot get the new telescope software to run.

We completed all steps detailed in the README but when we execute docker-compose up we get a range of errors, including the following:

api_container_1 | File "/usr/local/lib/python3.7/dist-packages/tart_hardware_interface-0.2.0b3-py3.7.egg/tart_hardware_interface/tartspi.py", line 8, in
api_container_1 | File "/usr/lib/python3/dist-packages/pkg_resources/init.py", line 1145, in resource_filename
api_container_1 | self, resource_name
api_container_1 | File "/usr/lib/python3/dist-packages/pkg_resources/init.py", line 1715, in get_resource_filename
api_container_1 | return self._extract_resource(manager, zip_path)
api_container_1 | File "/usr/lib/python3/dist-packages/pkg_resources/init.py", line 1736, in _extract_resource
api_container_1 | timestamp, size = self._get_date_and_size(self.zipinfo[zip_path])
api_container_1 | KeyError: 'tart_hardware_interface/permute.txt'

Do you have any recommendations for us?

Download Image Button

Add a button to the image to allow saving/downloading. The default name should be something to do with the date, i.e., radio_image_yyyy_mm_dd_hh_mm_ss.png. This will allow sorting of images by date (so one might make a movie).
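
A sketch of generating the proposed default filename:

# Sketch: build a date-sortable default filename for a downloaded image.
from datetime import datetime, timezone

def default_image_name(when=None):
    when = when or datetime.now(timezone.utc)
    return when.strftime('radio_image_%Y_%m_%d_%H_%M_%S.png')

print(default_image_name())   # e.g. radio_image_2017_07_09_09_14_59.png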

Store Calibration Data in the Raw Data File

The current state of the calibration data should be stored in the raw data file when it is gathered. Otherwise the raw data files cannot easily be imaged from later, without having downloaded this data via the API at the time the data file was generated.

  • Add the Antenna Positions (to the config)
  • Add the current Calibration Data to the config
  • Make sure the full config is in the RAW file
  • Make sure the full config is in the VIS file
  • Put a version into the HDF file, and check for this when loading files (to avoid trying to load calibration data if it isn't there). Required for backward compatibility.

Errors while building the Docker containers (calibration_server, object_position_server)

I am currently trying to set up a calibration server for our TART telescope in SA. First off, I want to get the object_position_server going and tested on an Ubuntu machine. I am getting the following error after following the README.md file contained in that directory:

$ docker-compose build
Docker machine "default" does not exist. Use "docker-machine ls" to list machines. Use "docker-machine create" to add a new one.

For reference, I am inexperienced with Docker and might not have installed everything that is required. I did the following:

sudo snap install docker
sudo apt install docker-compose

Is there anything I am missing?
