Our full technical support staff does not monitor this forum. If you need assistance from a member of our staff, please submit your question from the Ask a Question page.


Log in or register to post/reply in the forum.

GOES data retrieval


kmiddel Jul 25, 2012 04:55 PM

Hi All,

I'm wondering if anyone has experience with automating data retrieval from the GOES network. I recently installed a TX320 transmitter on a station and have data being transmitted through the GOES network, but so far the only method I've found for downloading the data is through the LRGS or DADDS DCS website, which requires interactive input of the DCP address. I'd like to automate this process so the data is downloaded automatically (ideally through a web script, so my computer doesn't have to be on and connected). If anyone has experience or advice on this, please let me know.

Thanks,
Kevin


kirving Jul 25, 2012 11:54 PM

We do this sort of thing, but it tends to be a moving target as the source weblinks change. We're currently using a shell script on a Linux box to put together a query URL, which is passed to the wget utility. The output is piped to awk to extract the header, and is also saved in raw form. The header is compared to the previous header: if it's the same, we give up, since nothing new has been received; if it's different, the content is stored and processed.

The basic idea: automate what you already know how to do manually. If you can build a query URL string that works by hand, then you just need to assemble that query from its parts, e.g., the server address, a query prefix or path, the platform identifier, and probably some time constraints.
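A minimal sketch of the approach described above. The server address, query path, and parameter names here are placeholders, not the real DADDS/LRGS endpoint — substitute whatever query URL works for you interactively:

```shell
#!/bin/sh
# Sketch of an automated GOES/DCS fetch. SERVER, the "/messages" path,
# and the "addr"/"since" parameter names are assumptions; replace them
# with the query that works for you on the LRGS/DADDS site.

SERVER="https://dcs.example.gov"   # placeholder server address
DCP_ADDR="0123ABCD"                # your platform (DCP) address
SINCE="now-1hour"                  # placeholder time-constraint syntax

# Assemble the query URL from its parts.
build_url() {
    printf '%s/messages?addr=%s&since=%s' "$1" "$2" "$3"
}

# Fetch with wget, save the raw copy, pull the first line out as the
# header with awk, and only store the message if the header changed.
fetch_and_store() {
    url=$(build_url "$SERVER" "$DCP_ADDR" "$SINCE")
    wget -q -O latest.raw "$url" || return 1
    awk 'NR==1' latest.raw > latest.hdr
    if cmp -s latest.hdr last.hdr; then
        echo "no new data"      # same header as last time: give up
    else
        cp latest.hdr last.hdr
        cp latest.raw "goes_$(date +%Y%m%d%H%M).raw"
        echo "stored new message"
    fi
}
```

Calling fetch_and_store from a cron entry every 15 minutes or so keeps the whole thing hands-off.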

Combining different utilities is what Linux/Unix shell scripting is all about, and it's not that hard if you dig into it. If you're saddled with a dumbed-down system that devalues scripting, then maybe you can get there by clicking, dragging, and dropping, but I have no idea how to do that. Good luck!


kmiddel Jul 27, 2012 12:47 PM

Thanks for the response. I'm wondering if you would be willing to share the shell script that you use to log in and retrieve the data (particularly the query URL). I just haven't been able to track down the correct syntax for hitting the server and getting the data I want. If I can get the data, I should be able to use R or something similar to decipher it and build a dynamic HTML page.

As a temporary solution, I got the DECODES software up and running yesterday and built a couple of scheduled tasks to run the DECODES download utility and post the decoded data to an HTML page. That page gets uploaded to my website via an FTP script. It works for now, but the DECODES output isn't very friendly, so I'd still like to get my own routine working if possible.
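For the upload step, a hedged sketch using curl's FTP support; the hostname, credentials, and remote directory below are placeholders for your own:

```shell
#!/bin/sh
# Sketch of pushing a generated HTML page to a web host over FTP.
# Host, username, password, and remote path are illustrative only.

# Build the FTP target URL from host and remote directory.
build_ftp_target() {
    printf 'ftp://%s/%s' "$1" "$2"
}

# Upload the page given as $1 to the placeholder host.
upload_page() {
    target=$(build_ftp_target "ftp.example.com" "public_html/")
    curl -s -T "$1" -u "username:password" "$target"
}
```

A scheduled task (or cron job) that runs the DECODES utility and then upload_page covers the whole round trip.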

Cheers,


kirving Aug 7, 2012 09:41 PM

Yes, I would be willing to share the code, but don't want to post it online for various reasons. If there's a way to send a user-to-user message on this forum, drop me a line and I'll be happy to respond by email.

Ken


kmiddel Aug 8, 2012 04:59 PM

Thanks for the offer Ken,

I've set up a temporary address: camsci.q 'at' wildliferesearch.ca. Feel free to contact me there.

Cheers,
Kevin
