Replies: 1 comment 4 replies
-
@Darveesh Unfortunately, 800K rows is quite a lot of data and, if I were to estimate/extrapolate based on this performance table, it would take a few hours to complete, depending on the number of cores and the amount of memory available on your system. If we're talking about bus data, I'm not sure how much value there is in analyzing it at a 1-second granularity. Instead, I would try downsampling the data to something like 5-10 second intervals (and thereby decreasing your data size). At the end of the day, you are interested in path deviations, and you are unlikely to notice those deviations at the second level since they are likely to occur at the minute level. 13K rows (of minutely data) should be feasible. A quick test on my modest 2-core MBP shows:
I hope that helps.
-
In a separate discussion thread #305 I am trying to use `stumpy.mstump`. My data set is about ~4K "uneven" timestamped geolocations (lat/lon). After interpolating the data to 1-second increments, the data set becomes ~800K rows. Feeding that into `stumpy.mstump` is taking a long time, however. I started it about 40 minutes ago and it's still going. I'm running on a new/2019 MacBook Pro 16. Is there an option to run mstump on the onboard GPU to speed up the process, perhaps? It seems I have an AMD Radeon Pro 5300M, in case it matters.