
oil_lamp's People

Contributors

bturep, tygamvrelis


oil_lamp's Issues

Playback is not smooth for 1 Hz sine wave with amplitude 10 degrees

During the first playback test, it was observed that a 1 Hz sine wave with amplitude 10 degrees was noticeably jerky. At 0.5 Hz, the jerkiness was far worse, and at 2 Hz it was very slight.

Furthermore, a 25-degree-amplitude sine wave looked very smooth at both 1 Hz and 2 Hz.

Test System!

Hopefully by early next week? I fly out on the Friday!

Add command to convert a .csv of pitch and roll data into a .wav

Is your feature request related to a problem? Please describe.
We have access to boat simulation data in the form of a .csv file (contains timestamps, pitch and roll). We would like to play this data back.

Describe the solution you'd like
Add a command to the Python program that loads a .csv file and creates .wav files from the pitch and roll data for playback.

Timeline
Say March 1st.

Additional information
Example .csv row:
00:00:00.044,-0.00288641901681988,-2.69711036988106E-12,-2.13693108005387E-05,6.69750048033119E-09,0,0.000583037041127682

The first column is the time stamp, the second is pitch, and the third is roll.
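
A minimal sketch of what this command could look like, assuming a fixed 100 Hz rate and a ±40-degree full scale (both assumptions, not values from the recording format); angles are scaled to int16 so standard audio tools can open the output:

    # Hypothetical sketch: convert the pitch/roll columns of a .csv into two .wav files.
    import csv
    import numpy as np
    from scipy.io import wavfile

    def csv_to_wav(csv_path, rate_hz=100, max_angle_deg=40.0):
        pitch, roll = [], []
        with open(csv_path) as f:
            for row in csv.reader(f):
                pitch.append(float(row[1]))  # second column is pitch
                roll.append(float(row[2]))   # third column is roll
        scale = 32767.0 / max_angle_deg      # map degrees onto the int16 range
        for name, data in (("pitch", pitch), ("roll", roll)):
            samples = np.clip(np.asarray(data) * scale, -32768, 32767).astype(np.int16)
            wavfile.write(csv_path.replace(".csv", "_" + name + ".wav"), rate_hz, samples)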

Feature request: add ability to slice recorded data over a specific time range

Is your feature request related to a problem? Please describe.
Sometimes we may be interested in only a specific part of a long recording (e.g. a nice calm part over which a baseline can be computed). Currently there is no way to split up a recording file other than manually editing the raw data.

Describe the solution you'd like
I'd like a new command for the Python script. Let us call this new command slice. slice will take 3 arguments: the filename, a start time, and an end time (--slice=filename,t_start,t_end). It will create a copy of the file, but only with the data between the specified times.

Start and end times need to be bounds-checked!

Example usage: python lamp.py --slice=lamp_data_0.dat,10,20 (creates a new file containing the measurements in lamp_data_0.dat between 10 seconds and 20 seconds).

The naming of the new file should follow some consistent convention. For example, it could be old_file_name + "_slice" + t_start + "to" + t_end, in which case we'd have "lamp_data_0_slice10to20.dat" for the example above.
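
A minimal sketch under two assumptions that may not match the real .dat format: one sample per line, recorded at a fixed 100 Hz. slice_recording is a hypothetical name for the command's implementation:

    def slice_recording(filename, t_start, t_end, rate_hz=100):
        with open(filename) as f:
            lines = f.readlines()
        i0, i1 = int(t_start * rate_hz), int(t_end * rate_hz)
        if not (0 <= i0 < i1 <= len(lines)):
            raise ValueError("slice bounds fall outside the recording")
        # follows the naming convention proposed above, e.g. lamp_data_0_slice10to20.dat
        out_name = filename.replace(".dat", "_slice%dto%d.dat" % (t_start, t_end))
        with open(out_name, "w") as f:
            f.writelines(lines[i0:i1])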

Timeline
Not urgent, but likely necessary for deriving baselines from most recordings (even dedicated baseline recordings often have a noisy section somewhere in the beginning, middle, or end).

Add way to set a new zero orientation for motors from PC

Is your feature request related to a problem? Please describe.
If we want to mount the lamp on a table, it will still have the same zero orientation as if it were mounted sideways on a boat. This is a problem.

Describe the solution you'd like
Should be able to configure the zero orientation directly from the PC interface. Ideally this would be programmable for both servos, but at the moment it’s only critical for the outer one.

This will require MCU changes, as the calibration offsets for each motor will now need to be programmable during runtime.

Thinking out loud: do we want the MCU to remember these changes between power cycles? Or should it always default to wall-mounted orientation?
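
For the PC side, a rough sketch of what such a command might look like over serial; the 0x5A command byte and the two-float packet layout are invented for illustration, not the project's actual protocol:

    import struct
    import serial  # pyserial

    def set_zero_offsets(port, outer_deg, inner_deg):
        # assumed command byte followed by two little-endian float offsets in degrees
        with serial.Serial(port, 115200, timeout=1) as ser:
            ser.write(b"\x5A" + struct.pack("<ff", outer_deg, inner_deg))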

Timeline
No rush

Considering remote sensing in "real-time"

Brandon mentioned the idea of having 2 copies of the lamp setup: one mounted on a boat in recording mode, and one mounted in a gallery in playback mode. Data would be sent from the one on the boat to the one in the gallery for actuation, hence linking the two setups.

Things to figure out:

  • How to set up the network send/receive for this? (A minimal sketch follows this list.)
  • What kind of latency and reliability can we expect from the network?
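
As a starting point for the first question, a bare-bones UDP sketch; the port number and packet format are assumptions, and a real link would also need to handle dropped packets, latency spikes, and authentication:

    import socket
    import struct

    PORT = 5005  # assumed

    def send_angles(sock, addr, pitch, roll):
        # 8-byte packet: two little-endian floats (pitch, roll)
        sock.sendto(struct.pack("<ff", pitch, roll), addr)

    def recv_angles(sock):
        data, _ = sock.recvfrom(8)
        return struct.unpack("<ff", data)

    # gallery side binds and waits; boat side calls send_angles at the sample rate
    rx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    rx.bind(("", PORT))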

Limiting is not smooth

Is your feature request related to a problem? Please describe.
Currently, the maximum deflection amplitude permitted is hard-coded to 40 degrees. Any motion beyond this amplitude is limited down to a deflection of 40 degrees, which sometimes causes playback to appear more abrupt than desired.

Describe the solution you'd like
Motion should gradually decrease as deflection reaches this limit, so that there is no loss of smoothness in the motion.
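
One way to get this behaviour is a soft-limiting curve such as tanh; the 40-degree limit comes from the current implementation, but the choice of tanh is only a suggestion:

    import math

    MAX_DEFLECTION_DEG = 40.0  # hard limit from the current implementation

    def soft_limit(angle_deg):
        # approaches +/-40 asymptotically instead of clipping abruptly at the limit
        return MAX_DEFLECTION_DEG * math.tanh(angle_deg / MAX_DEFLECTION_DEG)

For small deflections this is nearly the identity (10 degrees maps to about 9.8), and the output glides into the limit instead of hitting a hard corner.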

Timeline
Needed for playback eventually, but not critical at the time of this posting

Camera-data synchronization

The camera needs a way to synchronize with the data stream.

Two components to the solution (as discussed) are as follows:

  • Log time stamps in a text file (perhaps every second or so)
  • Flash an LED when the PC-side script starts. Consider putting an "LED" flag in the logs as well during this period (see the sketch after this list)
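
A sketch of the logging half, assuming the PC-side script owns the log file and marks the flash window with an "LED" suffix on the time-stamp line (the format is an assumption):

    import time

    def log_timestamp(log_file, led_on=False):
        now = time.time()
        stamp = time.strftime("%H:%M:%S", time.localtime(now)) + ".%03d" % int((now % 1) * 1000)
        log_file.write(stamp + (" LED" if led_on else "") + "\n")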

RPi UART

Is your feature request related to a problem? Please describe.
Using USB necessitates cables with rigid connectors. These cause an undesirable form factor for the lamp.

Describe the solution you'd like
Would be ideal to directly connect the MCU and RPi via UART.

Timeline
Ideally before March 4

Additional information
Documented some of the steps for setting up UART on the RPi here. We also did extensive testing today; we didn't get the link working, but we made important progress. Namely, we were able to transmit UART data out of the RPi successfully (measured on my logic analyzer), and the signalling was 3.3 V, as desired. The problem is that we weren't able to get the MCU to receive these transmissions. We tried connecting the TX of the RPi into the RX (and TX) lines of the ST-Link programmer, since this is where we can normally see the USB-to-serial activity.

One guess I have is that the ST-Link programmer is causing a bus conflict with the RPi's UART TX. Unfortunately, we cannot connect the RPi's UART TX into PA3/D0 unless we modify some solder bridges on the bottom of the MCU dev board.

I propose modifying the MCU code to perform PC interfacing over a different UART, e.g. USART6. Unlike USART2, the other UARTs have nothing to do with the ST-Link.
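
For reference, the RPi side of this link can be exercised with pyserial, assuming the serial console has already been disabled (e.g. via raspi-config) so the kernel releases the port:

    import serial  # pyserial

    # /dev/serial0 is the Pi's primary-UART alias
    ser = serial.Serial("/dev/serial0", baudrate=115200, timeout=1)
    ser.write(b"ping")   # placeholder payload for loopback/logic-analyzer checks
    print(ser.read(64))  # whatever the MCU sends back, if anything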

Implement playback mode in microcontroller

Steps identified:

  • Store angles in data table in MCU
  • Control thread that sends these angles to motors periodically
  • Interface for controlling motors
  • Consider adding IMU feedback to close motor control loop
  • Consider adding PC interface command to switch between recording and playback modes (control thread doesn't need to operate in recording mode, and IMU thread doesn't need to operate in playback mode)

Playback must begin automatically on PC-side

The Raspberry Pi should be able to boot up into a playback mode. In this mode, it will need to stream commands to the MCU to recreate the recorded motion, and it will also need to initialize the playback of a 4K video stream. These must evolve in lockstep.
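
A rough sketch of that PC-side loop, assuming a 100 Hz angle stream, a two-float serial packet, and omxplayer for the video (all assumptions):

    import struct
    import subprocess
    import time
    import serial  # pyserial

    def playback(port, angles, video_path):
        ser = serial.Serial(port, 115200)
        video = subprocess.Popen(["omxplayer", video_path])  # assumed player
        t0 = time.monotonic()
        for i, (pitch, roll) in enumerate(angles):
            ser.write(struct.pack("<ff", pitch, roll))
            # sleep until the next 10 ms tick; scheduling against t0 avoids drift
            time.sleep(max(0.0, t0 + (i + 1) * 0.01 - time.monotonic()))
        video.wait()

Scheduling each tick against the loop's start time, rather than sleeping a fixed 10 ms, keeps the angle stream and the video from drifting apart over a long recording.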

MCU startup issue?

Is your feature request related to a problem? Please describe.
Seems like when we try to auto-start playback, the MCU enters an unresponsive state and has to be reset before we can stream angles to it.

Describe the solution you'd like
Find the root cause of the issue.
Should also implement a reset command on the MCU that can be issued from the PC side. Useful to have in any case.
Also, maybe once the MCU receives a playback command, it should enable the servos. Might fix this bug.

Timeline
Next Monday.

Feature request: ability to generate plots for specific time range

Is your feature request related to a problem? Please describe.
Sometimes we may be interested in only a specific part of a long recording. Currently there is no way to plot just a portion of a recording.

Describe the solution you'd like
I'd like a new command for the Python script that accompanies the analyze option (--analyze). Let us call this new command plot_slice. plot_slice will take two arguments: a start time and an end time (--plot_slice=t_start,t_end). It will result in a plot being generated that only contains the data between t_start and t_end.

Start and end times need to be bounds-checked!

Example usage: python lamp.py --analyze=lamp_data_0.dat --plot_slice=10,20 (plots the data in lamp_data_0.dat between 10 and 20 seconds).

The naming of the plot should follow some consistent convention that shows it's only for a specific time range. For example, it could be old_file_name + "_from" + t_start + "to" + t_end, in which case we'd have something like "raw_base_lamp_data_0_from10to20.dat" for the example above (based on the default settings in the current version of the software).
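
A minimal sketch, assuming the angle data is available as an array once --analyze has parsed the file and that the sampling rate is a fixed 100 Hz (both assumptions):

    import numpy as np
    import matplotlib.pyplot as plt

    def plot_slice(angles, t_start, t_end, rate_hz=100, base_name="raw_base_lamp_data_0"):
        i0, i1 = int(t_start * rate_hz), int(t_end * rate_hz)
        if not (0 <= i0 < i1 <= len(angles)):
            raise ValueError("plot_slice bounds fall outside the recording")
        t = np.arange(i0, i1) / rate_hz
        plt.plot(t, angles[i0:i1])
        plt.xlabel("Time (s)")
        plt.ylabel("Angle (degrees)")
        # image output assumed; the naming convention above slots in the same way
        plt.savefig("%s_from%dto%d.png" % (base_name, t_start, t_end))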

Timeline
Needed to properly assess quality of recordings. Should be completed soon.

Base IMU data occasionally cuts out during recordings

Occasionally, the base IMU acceleration data in Z and Y will be -0.15 instead of their measured values.

Originally I thought this might have been an I2C issue, but upon some investigation this evening I now suspect that the data table is the issue (i.e. we might be overwriting some of the base data when we write in the lamp data). I tested this by modifying the write_table method so that if we are trying to write lamp data, we return instead. There are no issues with the base IMU data when I do this, so the problem is almost certainly in the table writes rather than the I2C modules.

I suspect this issue will go away when the data table is refactored for #24 to store generic bytes instead of imu_data_t, but nevertheless it needs to be solved.
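
Purely to illustrate the suspected overwrite pattern, here is a toy Python model; the offsets and sizes below are invented, not the MCU's actual table layout:

    table = bytearray(16)
    BASE_OFFSET = 0
    LAMP_OFFSET = 6   # deliberately overlaps the 8-byte base region

    def write_table(offset, payload):
        table[offset:offset + len(payload)] = payload

    write_table(BASE_OFFSET, b"\x01" * 8)  # base IMU sample (8 bytes, assumed)
    write_table(LAMP_OFFSET, b"\x02" * 8)  # lamp sample clobbers the last 2 base bytes
    print(table[:8])                       # base region no longer holds only base data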

Elapsed time in log is not consistent with number of samples times sampling rate

Describe the bug
Each data packet from the microcontroller is time-stamped with the local PC time. We can subtract the time stamp in the first packet from the time stamp in the last packet to get the elapsed time. On the other hand, given that the microcontroller sends back data 100 times per second, we can divide the total number of packets received by 100 to get the elapsed time. These two measures of the elapsed time should be very close to each other; however, based on the Oct 13th data, they can actually be quite different. We need to find out why.

To Reproduce
Steps to reproduce the behavior:

  1. Open pc\data\oct_13_2019\lamp_data_0.dat
  2. Observe the first time stamp: 14:52:43.724
  3. Observe the final time stamp: 15:32:01.645
  4. Run python lamp.py --analyze=oct_13_2019\lamp_data_0.dat --estimate=ind_angles --imu=both in the debugger. Print out the value of num_samples on line 137 in analyze.py. You will observe num_samples = 172966.
  5. Based on the time stamps, the elapsed time is 15:32:01.645 - 14:52:43.724 ≈ 39.3 minutes
  6. Based on the number of samples and sampling rate, elapsed time is 172966 / 100 / 60 = 28.8 minutes

39.3 does not equal 28.8, hence the discrepancy.
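
The discrepancy is easy to reproduce from the numbers above:

    from datetime import datetime

    t0 = datetime.strptime("14:52:43.724", "%H:%M:%S.%f")
    t1 = datetime.strptime("15:32:01.645", "%H:%M:%S.%f")
    wall_min = (t1 - t0).total_seconds() / 60  # ~39.3 minutes from the time stamps
    sample_min = 172966 / 100 / 60             # ~28.8 minutes from the sample count
    print(wall_min, sample_min)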

Expected behavior
The total number of samples should correspond to the real time elapsed.

Additional context
At this point, we cannot even know whether 39.3 or 28.8 is correct. We could determine this with an additional test where we run the recorder for about 30 minutes, and process the data file afterwards to see if it says that 30 minutes passed. There are 2 scenarios:

  1. Raspberry Pi time is not trustworthy, but samples are coming in at the expected rate (i.e. number of samples multiplied by sampling rate corresponds to true time). In this case, we completely neglect the time stamps on the recorded data and construct the time series based solely on sampling rate (this is what we're currently doing, since it's easiest)
  2. Raspberry Pi time is trustworthy, but samples are not being written to the log file at the expected rate (i.e. the timestamp interval corresponds to true time). In this case, we construct the time series based solely on the time stamps (let's say to within 10 ms of accuracy, with interpolation as needed). This case would also beg the question: why are samples not being written at the expected rate? Is it because the microcontroller is not (reliably) sending them at the expected rate? Is it because our time.sleep(0.001) call in receive() services another process for too long, and when we wake up we've timed out so we exit? Is it a problem with the serial data parser's logic in receive()? If it's a problem with the parser, does this mean the data in the serial buffer is stale by the time we've written it to the file (i.e. the serial buffer accumulates a bunch of data which is not pulled out of it on time)?

Regardless of which of these two cases it is (or maybe it's a third one which I did not anticipate!), samples are still coming in every 13.88 ms on average (72 Hz), which should still be sufficient for motion reconstruction. The main concern here is making sure we don't get a time drift relative to what would be recorded by a camera (this would happen if the reconstructed motion was slightly too fast or slightly too slow).

Logs should be human-readable

Brandon would like to stitch together different parts of the recordings by hand. In order to do this, he must be able to manipulate the data logs, e.g. copying and pasting parts of them into one "master" file.

A binary encoding scheme would make this difficult, hence the logs must be human-readable.

Composition tools

Is your feature request related to a problem? Please describe.
We do not have a straightforward way of manipulating the recordings to compose a final "edit".

Describe the solution you'd like

  1. Add a program that converts the 2 streams of angles from a recording into 2 .wav files
  2. Add a program that converts from .wav back to a stream of angles
    Consider .csv as an intermediate representation (see the sketch after this list).
  3. Add an option for playback that loads a .wav file instead of .dat
  4. Add a how-to page to the wiki
  5. Display time during playback
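
For item 2, a minimal sketch that mirrors the int16 scaling assumed in the csv-to-wav sketch earlier in this document; the ±40-degree full scale is an assumption, and the .wav is assumed to be mono:

    import numpy as np
    from scipy.io import wavfile

    def wav_to_csv(wav_path, csv_path, max_angle_deg=40.0):
        rate_hz, samples = wavfile.read(wav_path)
        angles = samples.astype(np.float64) * (max_angle_deg / 32767.0)
        with open(csv_path, "w") as f:
            for i, a in enumerate(angles):
                f.write("%.3f,%.6f\n" % (i / rate_hz, a))  # time (s), angle (deg)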

Timeline
Sooner is better - start of Dec is ideal

IMU acceleration values returned by MCU are off by a factor of -1

It turns out that all the IMU acceleration values are off by a factor of -1. This means that the embedded code needs to be changed to multiply ax, ay, az by -1, which will make the acceleration values received by the computer log consistent with the coordinate system described in the sensor datasheet. Once this change is made on the embedded side, the multiplication settings in settings.ini need to be changed to mult_z = -1, mult_y = -1, mult_x = 1, the negative of what they are right now. This will not solve any functional issues per se (currently, two wrongs are making a right!), but it's a good improvement for consistency.

Servos twitch spastically when electrically coupled to each other through their mounting heads

Undesired behaviour was observed when both servos were sent control signals while mounted to the lamp. This behaviour is not observed when they are sent control signals while dismounted. We noticed that the mounting heads of the servos are metallic, and figured it's possible that the servos are interfering with each other through the mounting mechanism.

We made the following observations:

Case 1: Both servos are mounted to the lamp and powered on, but are not given a control signal
No motion.

Case 2: Outer servo is mounted to the lamp, and is powered on and given a control signal. The inner servo is disconnected
Outer servo moves as desired.

Case 3: Inner servo is mounted to the lamp, and is powered on and given a control signal. The outer servo is disconnected
Inner servo moves as desired.

Case 4: Both servos are mounted to the lamp and powered on. They are both sent a control signal to go to 0 degrees
The inner servo was definitely going haywire in this case. I recall that the outer servo was fairly stable.

The root cause of this issue needs to be diagnosed. It would be ideal if we could solve it through circuitry (e.g. grounding the lamp) rather than something mechanical (e.g. using different mounting material). We did not try grounding the lamp, since we measured about 1.5 V on the mounting head of each servo (powered by a 5 V supply) and weren't sure whether the mounting heads are floating or being driven to that voltage by circuits inside the motors.

Raw data should be logged instead of angle estimates

When we log raw data, we can always tune the filters afterward. This way, the fidelity of the playback will be completely adjustable post-recording.

If, on the other hand, we only logged the angle estimates, we would be forced to use these angles and would not be able to understand the underlying processes as well. Hence, it would be difficult to compensate for any issues in the filters.

IMU use during playback

I am wondering what you think we might be able to use the IMUs for during the playback phase. If there is not a logical reason for them to be there, can you think of an adjustment that might require the IMUs?

I think that it is important not to remove anything that is put onto the lamp; the lamp should wear its history (the first modification I made to the lamp was stripping it of its factory varnish). I also think that it is important that the lamp maintains its functionality completely, so that it can always, at any time, with a few requirements, perform any of its three tasks: provide light, record its own movement, and play back that movement.
