Our full technical support staff does not monitor this forum. If you need assistance from a member of our staff, please submit your question from the Ask a Question page.



Priming the LoggerNet cache?


mjb May 16, 2011 10:14 PM

We use RTMC Pro, which gives us the ability to graph data from the log files created by LoggerNet. This is handy, because it is a pain when a datalogger's program needs to be updated to correct a small bug or to add a new feature in a new table. We are very careful not to change existing tables (number of values, what the values represent, etc.), but even so, LoggerNet usually decides it needs to re-create its cache for that logger, removing all history.

For this reason, we use RTMC Pro to read the data files to ensure we still have historical data graphed.

Unfortunately, RTMC is very processor-hungry when reading data files, especially when they get large. We've also found it somewhat unstable; it is most stable when graphing from the LoggerNet cache.

For this reason, we are wondering if it's possible to "prime" the LoggerNet cache somehow, so we can simply go back to graphing data from the cache rather than deal with the instability of reading data files.

Further to this: using SendData from a datalogger can sometimes generate a hole. When this hole is collected, it is inserted into the cache correctly and can be graphed correctly. But it is simply appended to the end of the associated data file being collected into, which RTMC cannot handle correctly when reading that data file. The graph ends up showing an anomaly where that entry is located, usually a flat section of the graph.
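(As a possible workaround for the appended hole records: if the data file is a TOA5-style ASCII file, a small script can re-sort the records by their TIMESTAMP column so RTMC sees them in order. This is a minimal sketch, not an official utility; it assumes a four-line TOA5 header, the timestamp in the first column, and hypothetical file names. Quoting of the rewritten fields may differ slightly from what LoggerNet writes.)

```python
import csv

def sort_toa5_records(in_path, out_path, header_lines=4):
    """Re-sort a TOA5-style data file by its first (TIMESTAMP) column,
    so late-arriving hole-filled records appear in chronological order."""
    with open(in_path, newline="") as f:
        rows = list(csv.reader(f))

    # Keep the header block intact; sort only the data records.
    header, records = rows[:header_lines], rows[header_lines:]

    # TOA5 timestamps ("2011-05-16 22:14:00") are fixed-width,
    # so a plain string sort orders them correctly.
    records.sort(key=lambda r: r[0])

    with open(out_path, "w", newline="") as f:
        csv.writer(f).writerows(header + records)

# Example (hypothetical file names):
# sort_toa5_records("Table1.dat", "Table1_sorted.dat")
```

Running this over a copy of the data file before pointing RTMC at it should remove the flat-section anomaly, at the cost of an extra processing step.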


aps May 17, 2011 10:14 AM

In the Setup screen of LoggerNet, under the Schedule settings for the logger in question, change the setting at the bottom from Automatically reset changed tables to Stop collection until manually updated.

Then, after loading a new program, use Get table definitions under the Data Files tab in the Setup screen and select the option to merge the table definitions. This should preserve as much data as possible.


mjb May 17, 2011 11:42 PM

Thanks aps,

That option does solve the issue of the cache being cleared, which may indeed help. But I take it there is no way to prime the cache with old data?

I imagine a small utility could be written to do so, given the format of the binary cache file. Is this information available?

Mike.


aps May 19, 2011 01:15 PM

That information is not available to me and there is no utility I know of. The author of the software does frequent this Forum so maybe he will comment.

* Last updated by: aps on 5/19/2011 @ 7:15 AM *
