Here is what Unisys' actions did to my code.

Old code:

wget -N -nd http://weather.unisys.com/hurricane/atlantic/2017/IRMA/track.dat

New code:

# Grab the NHC text-only index page, then pull out the anchor lines
# that link to each Forecast Discussion.
wget -O nhc_basins_current_status.txt https://www.nhc.noaa.gov/?text
grep "Forecast Discussion" nhc_basins_current_status.txt > nhc_Forecast_Discussion_links.txt
# The href is the fourth double-quote-delimited field of each anchor line.
awk -F'"' '{print $4}' nhc_Forecast_Discussion_links.txt > nhc_end_links.txt
# Prepend the site root to each relative link. Redirecting the whole loop,
# instead of using ">" inside it (which overwrites the file on every pass
# and leaves only the last link), keeps all of them.
while read l; do
	echo "https://www.nhc.noaa.gov$l"
done < nhc_end_links.txt > nhc_links_to_get.txt
# Download each discussion page.
while read l; do
	wget -P /home/jay/data/nhc_basins/ "$l"
done < nhc_links_to_get.txt
# For each downloaded page, print the storm name from the page title,
# then the forecast-position block that starts at INIT.
for fname in /home/jay/data/nhc_basins/*; do
	grep "<title>" "$fname" | awk '{gsub("<title>", ""); print}' | awk '{gsub("</title>", ""); print}' | awk '{gsub("Forecast Discussion", ""); print}' | awk -F"(" '{print $1}'
	grep -A 8 "INIT" "$fname"
done

The two code blocks do the same thing: pull down the current data for each active storm.
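For anyone curious what that last grep is actually pulling out: each discussion ends with a forecast-position block whose lines (in 2017-era advisories, at least) look like "INIT  08/0900Z 16.0N  57.7W   55 KT  65 MPH". Here's a quick sketch of turning that block into comma-separated values, assuming that layout still holds (MIATCDAT1.shtml is just a stand-in for one of the downloaded pages):

# Print label, day/time, latitude, longitude, and wind in knots as CSV,
# one forecast point per line. NF >= 7 skips the header and trailer lines.
grep -A 8 "INIT" MIATCDAT1.shtml | awk 'NF >= 7 {print $1 "," $2 "," $3 "," $4 "," $5}'

That gives you lines like "INIT,08/0900Z,16.0N,57.7W,55", which is close enough to the old track.dat to feed the rest of the pipeline.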

I have things like this scattered throughout the data-acquisition side of the programming. When I redesigned the site this last year, I modularized most everything. That is, instead of one script with all the code in it, you have a bunch of small blocks of code called by a unifying script (see the sketch below). Thus, repairing the site should be a little more than trivial, but doable just the same.
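As a rough sketch of what I mean by modularized (the file names here are made up for illustration, not the site's actual layout):

#!/bin/bash
# get_all_data.sh -- hypothetical unifying script. Each data source lives
# in its own small script, so a dead feed only means fixing one file.
DATA_SCRIPTS=/home/jay/bin/data_sources
for src in "$DATA_SCRIPTS"/*.sh; do
	bash "$src" || echo "FAILED: $src" >> /home/jay/data/acquisition_errors.log
done

So when a feed like Unisys' disappears, the breakage stays contained to one small script instead of rippling through a monolith.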

I'm just really exhausted from visiting my mom yesterday.  I should have the code repaired and the site back up and running within a few days if my body doesn't disown me.

Jay C. "Jazzy_J" Theriot