Problems with the Datalogger Streaming API

I'm using the Python Moku API with the Moku:Pro to get a voltage reading from a certain channel. My goal is to measure over a relatively short time with a comparably low sample rate (something like 100 ms at 100 Sa/s, or even 10 Sa/s).

Sadly, the Datalogger cannot record for shorter than one second, and one forum entry from an employee stated that this is because for shorter durations it is easier to use the Oscilloscope instrument. I don't agree with that, because the Datalogger allows a higher sample rate at certain time scales (e.g. at a 1 s time span the Oscilloscope samples at 16.3 kSa/s vs. 2.0 MSa/s for the Datalogger, as already stated in the original post), and the Oscilloscope does not allow setting the sampling rate independently of the time span. So if I want to record for 100 ms, I'm stuck with 162 kSa/s from the Oscilloscope.

This is why I tried the streaming API of the Datalogger as a replacement. However, I noticed a few problems. At low sample rates like 10 to 100 Sa/s, the `get_stream_data()` method gets very slow, to the point that I have to wait half a minute just to get three data points. For example, inserting a sampling rate of 10 Sa/s into `datalogger_streaming.py` breaks it, as the stream ends before the first `get_stream_data()` call can be executed. It also seems that calling `get_stream_data()` does not return all newly streamed data, only a certain amount. When I call `get_stream_data()`, then sleep for 5 s and call `get_stream_data()` again, I do not receive all points up to 5 s, which can be verified because the points come with the time since the start of the stream. This might actually be intended behaviour (e.g. returning only 512 points at a time, or something like that), but in that case I did not find it in the API documentation.

I also noticed that when using `get_stream_data()`, the program sometimes continues to run after the last line/command has executed and cannot be interrupted by Ctrl+C. I'm not sure what causes this, but my assumption is that when not all data from the stream has been read, some subprocess continues to run, which might block the program. In some rare cases I somehow managed to stop it and got the error:
```
Exception in thread Thread-3:
...
File "mokucli\converter\targets.py", line 62, in _check_logger_status
OverflowError: sleep length is too large
...
```
This might be unrelated though I am not sure.

I am using the newest versions of mokucli and the Moku Python library. Python itself, however, is not on the newest version. If anyone has an idea of what is going on, or whether I am just completely misunderstanding the API, I would appreciate some intel on that.

Thanks :slight_smile:

I believe these are two separate issues, but I’ll address both below!

If you are recording a signal over a 100 ms time span at a sampling rate of 100 Sa/s (or 10 Sa/s), then you will only be getting 10 samples (or 1 sample) total over the duration of your logging session. Is this desired for what you are measuring?

The issue you link to is sort of the opposite, I believe: they want to record over a similar ms time span but set the sampling rate very high, which our Oscilloscope currently doesn't allow since it sets the sampling rate based on your time span. If you are looking for a low sample rate over a small time span, you could downsample the data to fit your desired sampling rate. This would just be picking every 10th or 100th point from the Oscilloscope trace data and ignoring the rest.
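The pick-every-Nth-point approach can be sketched with plain NumPy slicing. This is only an illustration with placeholder data (the array `trace` and the 162 kSa/s figure stand in for a real Oscilloscope trace at a 100 ms span):

```python
import numpy as np

fs_scope = 162_000             # approx. Oscilloscope rate at a 100 ms span
fs_target = 100                # desired effective sample rate
step = fs_scope // fs_target   # keep every `step`-th raw sample

# Placeholder trace: 100 ms of a 5 Hz sine at 162 kSa/s (~16200 points).
t = np.linspace(0.0, 0.1, 16200)
trace = np.sin(2 * np.pi * 5 * t)

downsampled = trace[::step]
print(len(downsampled))  # 10 points for a 100 ms window at 100 Sa/s
```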


I have not seen this kind of issue before where a low sampling rate causes issues for the data stream. Would you mind messaging your code or emailing it to me?

Hey

The reason I would like such a low sampling rate is, more or less, that I want to let the Moku:Pro do the averaging. If I take the data from the Oscilloscope for 100 ms, I get my 162k samples, and you are correct that in principle I could discard all but a few points. But then my data would be unnecessarily noisy. Of course I could just average the data myself, and this is also my current workaround, but it takes some time on the Python side of things, and I would assume that letting the Moku:Pro hand me already-averaged data at a lower sample rate would be much faster. This is why I think such a relatively low sample rate is desired for certain applications.
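For anyone else using this workaround: the manual block averaging can be done quickly in NumPy with a reshape-and-mean rather than a Python loop. Again a sketch with placeholder data, assuming the raw trace length divides evenly into blocks (any remainder is trimmed):

```python
import numpy as np

fs_scope = 162_000
fs_target = 100
block = fs_scope // fs_target  # raw samples averaged into one output point

# Placeholder noisy trace (~16200 points for a 100 ms window).
trace = np.random.default_rng(0).normal(size=16200)

# Trim any remainder so the length divides evenly, then average each block.
n_blocks = len(trace) // block
averaged = trace[:n_blocks * block].reshape(n_blocks, block).mean(axis=1)
print(averaged.shape)  # one averaged point per block
```

Each output point is the mean of 1620 raw samples, so the noise is reduced by roughly a factor of sqrt(1620) compared to picking single points.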

Indeed, you are right that the issue I linked to is unrelated in the sense that they want a higher sample rate. It was merely meant to motivate my use of the Datalogger, as I also wanted to log for shorter periods of time, like the person in that issue.

Regarding the low-sampling-rate issue, I will send you an email. In case someone else with these problems comes across this, I'll add that I have investigated a bit more, and it seems the lower the sample rate, the longer the first `get_stream_data()` call takes. Subsequent calls are faster but do not give you all the data available so far. I also noticed that if you start a stream for, let's say, 100 s, read from it for 60 s, close the stream and then also end the Moku session, the Python program will only complete after waiting the full 100 s.