
Split and Table Based Timestamps


Clayton Nov 10, 2011 10:58 PM

After transitioning to CR3000s from CR10Xs I'm struggling to create working Split files. There seems to be very little information in the help files regarding the syntax for dealing with the single field Table Based Timestamps and I just can't seem to make it work. Could someone give me an example of the start condition syntax to average the last 60 minutes of data?

Here's a sample line
"2011-11-10 14:00:00",55188,249.55,19.86,22.50,31.60,34.60

I've tried all sorts of combinations, such as
::::1[01]:

Start offset is set to -4000 so that I'm only looking at about the last 76 lines of data

Column width is set to 25

I'm still getting a red *2011-1

I'm sure it will be obvious once I've seen an example.

Thanks,
Alex Clayton


Dana Nov 11, 2011 08:00 PM

It sounds like the problem may be that the field displaying the date is not large enough to accommodate the date string for the CR3000 datalogger. You say you have set the column width to 25 -- did you set that width in the column that has the date?

The Width can be set for each column of data. The Default Column Width field at the top of the page affects all columns, and the maximum is 9 (to maintain backward compatibility with older versions of Split).

If you continue to get the red "Bad Data" message, search the Index of the Split help for Bad Data and read about the other things that can cause this message to occur.

Dana W.

* Last updated by: Dana on 11/11/2011 @ 1:03 PM *


Dana Nov 11, 2011 08:09 PM

[i]Here's a sample line
"2011-11-10 14:00:00",55188,249.55,19.86,22.50,31.60,34.60

I've tried all sorts of combinations, such as
::::1[01]:[/i]

Sorry, I didn't answer your original question. You didn't say when you were trying to start processing, but for example, the following line:

1[2011]:1[310]:1[0000]:

would begin processing data on November 6 (Julian Day 310), 2011, at midnight (0000).
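As a cross-check of that day-of-year value (a quick Python one-liner, nothing Split-specific):

```python
from datetime import date

# November 6, 2011 falls on day of year 310 (2011 is not a leap year).
print(date(2011, 11, 6).timetuple().tm_yday)  # 310
```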

Dana W.


Clayton Nov 13, 2011 06:27 PM

OK, now I see a bit more clearly. The year is defined as one element; month and day are the next element (defined by DOY); then hour and minute; and finally seconds. That's not at all intuitive; I would have assumed that each unit is an individual element in the table-based timestamp.

So is there any way to start processing at the top of the last hour? For example, at 10:00 (and 11:00, and 12:00, etc.) I want to average all of the minute data from 09:01 to 10:00. If the hour has to be included in the same element as the minutes, it doesn't seem like there is a way to write a start condition that would work hourly. Would I have to create 24 different PAR files and tasks?! Can a wildcard be used? For example:
::[*01]:
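For what it's worth, the four-element grouping described above can be illustrated in Python (this only mimics the decomposition; it is not Split's own parsing):

```python
from datetime import datetime

# How the single table-based timestamp decomposes into Split's four
# elements: year, day of year, hour+minute, seconds (an illustration only).
ts = datetime.strptime("2011-11-10 14:00:00", "%Y-%m-%d %H:%M:%S")
elements = (ts.year,
            ts.timetuple().tm_yday,     # month and day combined as DOY
            ts.hour * 100 + ts.minute,  # hour and minute as one element
            ts.second)
print(elements)  # (2011, 314, 1400, 0)
```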

By the way, I'm still getting the "Bad Data" indication, but it doesn't seem to affect the processing, except for the fact that it won't output the timestamp. Yes, I made the column wide enough, in fact I set them all to 25 just in case the timestamp wasn't getting placed where I thought.

Alex


Dana Nov 14, 2011 06:42 PM

I think this should work for you:

Start condition :1[-0]:1[-60,60]:
Stop condition :1[-0]:1[-0,60]:
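If it helps to see the window those conditions describe, here is a sketch of the same selection over hypothetical minute data (the Python only illustrates the interval being averaged; Split does the real work):

```python
from datetime import datetime, timedelta

# Hypothetical minute records (timestamp, value); in practice these would
# come from the datalogger's minute table.
records = [(datetime(2011, 11, 10, 9, 0) + timedelta(minutes=m), float(m))
           for m in range(1, 121)]  # 09:01 .. 11:00

now = datetime(2011, 11, 10, 10, 0)
stop = now.replace(minute=0, second=0)  # top of the current hour
start = stop - timedelta(minutes=60)    # 60 minutes back

# Keep records with start < t <= stop, i.e. 09:01 through 10:00,
# mirroring a start condition 60 minutes back and a stop at minute 0.
window = [v for t, v in records if start < t <= stop]
print(len(window), sum(window) / len(window))  # 60 30.5
```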


If it helps, I spent at least 40 minutes trying to get this to work with one of my data files. I always have to relearn how to use Split whenever I sit down to do something.

Many things can be said about Split. "Intuitive" is not one of them!

Dana


canineaussie Mar 19, 2013 10:25 PM

This is related but may be a more basic question: I want to have Split calculate totals for certain columns in the output file. The output file has the totals in it, but they are not lined up under the correct columns. I have a Time Series Heading of "Totals". It appears that the totals calculated from the two input files are separated by a carriage return. Is there a way to line up the totals under their respective columns?

Thanks,
Rick
