Before I start, I've got to say that programming the site with internal tremors is really interesting. My goal is to produce more legitimate code than bugs. My fingers will either type two (or more) of the same letter, put the letters of a word out of order, or just type stuff that doesn't make any sense. This new endeavor is going to be a challenge.
Due to a recent redesign of Unisys Weather Data, the format of data retrieval is fairly different from before. Using the old site, you could drill down to the core data directly. With the new Unisys site, there will be a two- or three-stage process to retrieve the data. Examining the code of their site, I can understand why they did what they did.
They are assigning tokens to the data as opposed to hard directories. This allows them to store the data on a server separate from the web server... exactly what I am doing.
The process will require the following steps:
- Scrape the six oceans' main pages
- Process to find new/old storms
- Scrape the storm pages to obtain the token pointing to the data
- Scrape storm data
- Process data into maps, graphs, etc.
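The token-lookup stage might be sketched like this in Python. To be clear, this is only an illustration: the sample HTML, the variable name `dataToken`, and the token format are all assumptions, since the real Unisys markup isn't reproduced here.

```python
import re

# Hypothetical storm page markup -- the actual Unisys page structure,
# the "dataToken" name, and the token format are placeholders.
SAMPLE_STORM_PAGE = """
<html><body>
<script>var dataToken = "a1b2c3d4";</script>
</body></html>
"""

def extract_data_token(html):
    """Pull the data token out of a storm page (the third step above).

    The token, rather than a fixed directory path, is what points at
    the actual storm data on their back-end server.
    """
    match = re.search(r'dataToken\s*=\s*"([0-9a-f]+)"', html)
    return match.group(1) if match else None

token = extract_data_token(SAMPLE_STORM_PAGE)
print(token)
```

In the real pipeline, the fetched token would then be appended to a data URL for the fourth step, instead of drilling down to a hard-coded directory the way the old site allowed.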
Before, I could go directly to #4.
I'm hoping to get this done within the next few days. Then, the site will be tested and put back into service!
Thanks for reading,
Jay C. “Jazzy_J” Theriot