tygamvrelis / oil_lamp
Gimballed oil lantern motion recording & playback
During the first playback test, it was observed that a 1 Hz sine wave with a 10 degree amplitude was noticeably jerky. At 0.5 Hz the jerk was far worse, while at 2 Hz it was very small.
Furthermore, a 25 degree amplitude sine wave looked very smooth at both 1 Hz and 2 Hz.
to account for:
hopefully by early next week? I fly out on the Friday!
Is your feature request related to a problem? Please describe.
We have access to boat simulation data in the form of a .csv file (contains timestamps, pitch and roll). We would like to play this data back.
Describe the solution you'd like
Add a command to the python program that loads a .csv file and creates .wav files from the pitch and roll data for playback.
Timeline
Say March 1st.
Additional information
Example .csv row:
00:00:00.044,-0.00288641901681988,-2.69711036988106E-12,-2.13693108005387E-05,6.69750048033119E-09,0,0.000583037041127682
First column is the time stamp. Second column is pitch. Third column is roll.
Is your feature request related to a problem? Please describe.
Sometimes we may be interested in only a specific part of a long recording (e.g. a nice calm part over which a baseline can be computed). Currently there is no way to split up a recording file other than manually editing the raw data.
Describe the solution you'd like
I'd like a new command for the Python script. Let us call this new command slice. slice will take 3 arguments: the filename, a start time, and an end time (--slice=filename,t_start,t_end). It will create a copy of the file, but only with the data between the specified times.
Start and end times need to be bounds-checked!
Example usage: python lamp.py --slice=lamp_data_0.dat,10,20 (creates a new file containing the measurements in lamp_data_0.dat between 10 seconds and 20 seconds).
The naming of the new file should follow some consistent convention. For example, it could be old_file_name + "_slice" + t_start + "to" + t_end, in which case we'd have "lamp_data_0_slice10to20.dat" for the example above.
Timeline
Not urgent, but likely necessary for deriving baselines from most recordings (even dedicated baseline recordings often have a noisy section somewhere in the beginning, middle, or end).
Is your feature request related to a problem? Please describe.
If we want to mount the lamp on a table, it will still have the same zero orientation as if it were mounted sideways on a boat. This is a problem.
Describe the solution you'd like
Should be able to configure the zero orientation directly from the PC interface. Ideally this would be programmable for both servos, but at the moment it's only critical for the outer one.
This will require MCU changes, as the calibration offsets for each motor will now need to be programmable during runtime.
Thinking out loud: do we want the MCU to remember these changes between power cycles? Or should it always default to wall-mounted orientation?
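On the PC side, setting the zero orientation might reduce to packing the two offsets into a command frame for the MCU. Everything below (the command ID, the frame layout, the range check) is hypothetical, since the runtime calibration protocol has not been designed yet:

```python
import struct

def encode_zero_offsets(outer_deg, inner_deg):
    """Pack calibration offsets for the two servos into a command frame.

    Frame layout (hypothetical): a one-byte command ID followed by two
    little-endian floats. The real MCU protocol must define its own IDs.
    """
    CMD_SET_ZERO = 0x5A  # placeholder command ID
    if not (-90.0 <= outer_deg <= 90.0 and -90.0 <= inner_deg <= 90.0):
        raise ValueError("offsets must be within +/-90 degrees")
    return struct.pack("<Bff", CMD_SET_ZERO, outer_deg, inner_deg)
```

Whether the MCU persists these offsets across power cycles (e.g. in flash) or resets to a wall-mounted default is the open question raised above.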
Timeline
No rush
Even if we have an RTC installed to keep accurate time, it wouldn't hurt to make sure file names are unique just in case.
Brandon mentioned the idea of having 2 copies of the lamp setup: one mounted on a boat in recording mode, and one mounted in a gallery in playback mode. Data would be sent from the one on the boat to the one in the gallery for actuation, hence linking the two setups.
Things to figure out:
Is your feature request related to a problem? Please describe.
Currently, the maximum deflection amplitude is hard-coded to 40 degrees. Any motion beyond this amplitude is clipped to a deflection of 40 degrees, which sometimes makes playback appear more abrupt than desired.
Describe the solution you'd like
Motion should taper off gradually as deflection approaches this limit, so that there is no loss of smoothness in the motion.
Timeline
Needed for playback eventually, but not critical at the time of this posting
A Raspberry Pi will be used to log the data. Upon startup, it must automatically start logging MCU data.
The camera needs a way to synchronize with the data stream.
Two components to the solution (as discussed) are as follows:
Is your feature request related to a problem? Please describe.
Using USB necessitates cables with rigid connectors. These cause an undesirable form factor for the lamp.
Describe the solution you'd like
Would be ideal to directly connect the MCU and RPi via UART.
Timeline
Ideally before March 4
Additional information
Documented some of the steps for setting up UART on the RPi here. Also, we did extensive testing today to no avail. However, we made important progress. Namely, we were able to transmit UART info out of the RPi successfully (measured on my logic analyzer). The signalling was 3.3 V, as desired. The problem is that we weren't able to get the MCU to receive these transmissions. We tried connecting the TX of the RPi into the RX (and TX) lines of the ST-Link programmer, since this is where we can normally see the USB-to-serial activity.
One guess I have is that the ST-Link programmer is causing a bus conflict with the RPi's UART TX. Unfortunately, we cannot connect the RPi's UART TX into PA3/D0 unless we modify some solder bridges on the bottom of the MCU dev board.
I propose modifying the MCU code to perform PC interfacing over a different UART, e.g. USART6. Unlike USART2, the other UARTs have nothing to do with the ST-Link.
Steps identified:
The Raspberry Pi should be able to boot up into a playback mode. In this mode, it will need to stream commands to the MCU to recreate the recorded motion, and it will also need to initialize the playback of a 4K video stream. These must evolve in lock step.
Is your feature request related to a problem? Please describe.
Seems like when we try to auto-start playback, the MCU enters an unresponsive state and has to be reset before we can stream angles to it.
Describe the solution you'd like
Find root cause of issue.
Should also implement a reset command on the MCU that can be issued from the PC side. Useful to have in any case.
Also, maybe the MCU should enable the servos once it receives a playback command; that might fix this bug.
Timeline
Next Monday.
Is your feature request related to a problem? Please describe.
Sometimes we may be interested in only a specific part of a long recording. Currently there is no way to plot just a portion of a recording.
Describe the solution you'd like
I'd like a new command for the Python script to accompany the analyze option (--analyze). Let us call this new command plot_slice. plot_slice will take two arguments, a start time and an end time (--plot_slice=t_start,t_end). It will result in a plot being generated that only contains the data between t_start and t_end.
Start and end times need to be bounds-checked!
Example usage: python lamp.py --analyze=lamp_data_0.dat --plot_slice=10,20 (plots the data in lamp_data_0.dat between 10 and 20 seconds).
The naming of the plot should follow some consistent convention that shows it's only for a specific time range. For example, it could be old_file_name + "_from" + t_start + "to" + t_end, in which case we'd have something like "raw_base_lamp_data_0_from10to20.dat" for the example above (based on the default settings in the current version of the software).
Timeline
Needed to properly assess quality of recordings. Should be completed soon.
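The windowing part of plot_slice could be as simple as the helper below (the bounds-check policy and argument shapes are illustrative); the selected window would then be handed to the existing plotting path and saved under the "_from10to20" naming convention:

```python
def select_window(times, values, t_start, t_end):
    """Return the (times, values) samples with t_start <= t <= t_end.

    Bounds-checked against the recording's span: the requested window
    must lie entirely within the recorded time range.
    """
    if not times or t_start < times[0] or t_end > times[-1] or t_start >= t_end:
        raise ValueError("window must lie within the recording")
    pairs = [(t, v) for t, v in zip(times, values) if t_start <= t <= t_end]
    ts, vs = zip(*pairs)
    return list(ts), list(vs)
```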
Occasionally, the base IMU acceleration data in Z and Y will be -0.15 instead of their measured values.
Originally I thought this might have been an I2C issue, but upon some investigation this evening I now suspect that the data table is the issue (i.e. we might be overwriting some of the base data when we write in the lamp data). I tested this by modifying the write_table method so that if we are trying to write lamp data, we return instead. There are no issues with the base IMU data when I do this, so the issue cannot be related to the I2C modules.
I suspect this issue will go away when the data table is refactored for #24 to store generic bytes instead of imu_data_t, but nevertheless it needs to be solved.
This will be used when the lamp is set up and recording and I am moving camera positions.
the required files I will need to corroborate our systems for testing.
Describe the bug
Each data packet from the microcontroller is time-stamped with the local PC time. We can subtract the time stamp in the first packet from the time stamp in the last packet to get the elapsed time. On the other hand, given that the microcontroller sends back data 100 times per second, we can divide the total number of packets received by 100 to get the elapsed time. These two measures of the elapsed time should be very close to each other; however, based on the Oct 13th data, they can actually be quite different. We need to find out why.
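The two estimates described above amount to the following (hypothetical helper, with time stamps already converted to seconds):

```python
def elapsed_time_estimates(timestamps, sample_rate_hz=100):
    """Two independent estimates of elapsed time for a capture.

    timestamps: PC-side time stamps (seconds) of the received packets.
    Returns (timestamp-based, count-based) elapsed times; for a healthy
    capture the two should agree closely.
    """
    ts_based = timestamps[-1] - timestamps[0]        # from PC time stamps
    count_based = len(timestamps) / sample_rate_hz   # from packet count
    return ts_based, count_based
```

Comparing the two on every capture (and warning when they diverge) would catch this class of bug immediately instead of during analysis.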
To Reproduce
Steps to reproduce the behavior:
Run python lamp.py --analyze=oct_13_2019\lamp_data_0.dat --estimate=ind_angles --imu=both in the debugger. Print out the value of num_samples on line 137 in analyze.py. You will observe num_samples = 172966; at 100 samples per second that corresponds to 28.8 minutes, which does not equal the 39.3 minutes implied by the PC time stamps, hence the discrepancy.
Expected behavior
The total number of samples should correspond to the real time elapsed.
Additional context
At this point, we cannot even know whether 39.3 or 28.8 is correct. We could determine this with an additional test where we run the recorder for about 30 minutes, and process the data file afterwards to see whether it reports that 30 minutes passed. There are 2 scenarios: does the time.sleep(0.001) call in receive() service another process for too long, so that by the time we wake up we have timed out and exit? Or is it a problem with the serial data parser's logic in receive()? If it's a problem with the parser, does this mean the data in the serial buffer is stale by the time we've written it to the file (i.e. the serial buffer accumulates a bunch of data which is not pulled out of it on time)?
Regardless of which of these two cases it is (or maybe it's a third one which I did not anticipate!), samples are still coming in every 13.88 ms on average (72 Hz), which should still be sufficient for motion reconstruction. The main concern here is making sure we don't get a time drift relative to what would be recorded by a camera (this would happen if the reconstructed motion was slightly too fast or slightly too slow).
Brandon would like to stitch together different parts of the recordings by hand. In order to do this, he must be able to manipulate the data logs, e.g. copying and pasting parts of them into one "master" file.
A binary encoding scheme would make this difficult, hence the logs must be human-readable.
Is your feature request related to a problem? Please describe.
We do not have a straightforward way of manipulating the recordings to compose a final "edit".
Describe the solution you'd like
Timeline
Sooner is better - start of Dec is ideal
It turns out that all the IMU acceleration values are off by a factor of -1. This means that there needs to be a change in the embedded code that multiplies ax, ay, and az by -1, which will make the acceleration values in the computer's log consistent with the coordinate system described in the sensor datasheet. Once this change is made in embedded, the multiplication settings in settings.ini need to be changed to mult_z = -1, mult_y = -1, mult_x = 1, which is the negative of what they are right now. This will not solve any functional issues per se (currently, two wrongs are making a right!), but it's a good improvement for consistency.
I have a specific size requirement, so that will determine it to some degree.
Wondering what specs I should be looking for here?
Building off #8, we now have the DS3231 RTC:
https://www.creatroninc.com/product/ds3231-real-time-clock-for-raspberry-pi/
https://www.creatroninc.com/upload/DS3231%20Datasheet.pdf
It can be integrated with the Pi by following the instructions here:
https://learn.adafruit.com/adding-a-real-time-clock-to-raspberry-pi?view=all
Undesired behaviour was observed when both servos were sent control signals while mounted to the lamp. This behaviour is not observed when they are sent control signals while dismounted. We noticed that the mounting heads of the servos are metallic, and figured it's possible that the servos are interfering with each other through the mounting mechanism.
We made the following observations:
Case 1: Both servos are mounted to the lamp and powered on, but are not given a control signal
No motion.
Case 2: Outer servo is mounted to the lamp, and is powered on and given a control signal. The inner servo is disconnected
Outer servo moves as desired.
Case 3: Inner servo is mounted to the lamp, and is powered on and given a control signal. The outer servo is disconnected
Inner servo moves as desired.
Case 4: Both servos are mounted to the lamp and powered on. They are both sent a control signal to go to 0 degrees
The inner servo was definitely going haywire in this case. I recall that the outer servo was fairly stable.
The root cause of this issue needs to be diagnosed. It would be ideal if we could solve it through circuitry (e.g. grounding the lamp) rather than something mechanical (e.g. using different mounting material). We did not try grounding the lamp since we measured a voltage of about 1.5 volts on the mounting head of each servo (powered by 5 V supply) and weren't sure if the mounting heads are floating or if they're being driven to that voltage by circuits inside the motors.
When we log raw data, we can always tune the filters afterward. This way, the fidelity of the playback will be completely adjustable post-recording.
If, on the other hand, we only logged the angle estimates, we would be forced to use these angles and would not be able to understand the underlying processes as well. Hence, it would be difficult to compensate for any issues in the filters.
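As a concrete example of post-recording tuning, angle estimates can be recomputed offline from the raw samples with a complementary filter. The blend factor alpha and the degree units below are assumptions for illustration, not the project's actual filter:

```python
def complementary_filter(accel_angles, gyro_rates, dt, alpha=0.98):
    """Offline pitch/roll estimation from raw IMU samples.

    accel_angles: angle (degrees) derived from the accelerometer each sample
    gyro_rates:   angular rate (degrees/s) from the gyroscope each sample
    dt:           sample period in seconds
    """
    angle = accel_angles[0]
    out = []
    for a, g in zip(accel_angles, gyro_rates):
        # blend integrated gyro (smooth but drifts) with accel (noisy but absolute)
        angle = alpha * (angle + g * dt) + (1 - alpha) * a
        out.append(angle)
    return out
```

Because the raw accelerometer and gyroscope streams are logged, alpha (and the filter structure itself) can be re-tuned after the fact, which is exactly the flexibility argued for above.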
I am wondering what you think we might be able to use the IMUs for during the playback phase. If there is no logical reason for them to be there, can you think of an adjustment that might require the IMUs?
I think that it is important to not remove anything that is put onto the lamp; the lamp should wear its history (the first modification I made to the lamp was stripping it of its factory varnish). And I also think that it is important that the lamp maintains its functionality completely, so it can always, at any time, with a few requirements, perform any of its three tasks: provide light, record its own movement, and play back that movement.