
If anyone is interested in doing their own thing with weather data, check out MADIS [0]. There are various levels of access, some of which require NOAA approval. But if you're serious about making weather predictions, it's a good thread to pull on. I once set up a MADIS node, and our server was shut down very quickly by Amazon for "suspicious traffic", so beware of that - there's a lot of data that gets pushed through the system. If I remember correctly, it was kind of a pain in the ass to get set up/configured, but it was pretty cool.

[0] https://madis.noaa.gov/index.shtml



For those interested in more background on NOAA and making money from it, I highly recommend reading The Fifth Risk [0] by Moneyball author Michael Lewis [1]. It details how a couple of private companies make a lot of money using NOAA data in interesting ways (e.g., crop insurance; that company was later acquired by Monsanto). Another of those companies is AccuWeather, whose CEO was appointed to head NOAA by Trump [2].

P.S.: Anyone notice that monitoring "Climate" was absent in the government announcement?

[0] https://www.amazon.com/dp/1324002646

[1] https://en.wikipedia.org/wiki/Michael_Lewis

[2] https://oceanleadership.org/trump-taps-accuweather-ceo-head-...


Can you go into more detail about how to get this set up?


I honestly don't remember any more. At the time, we were working with NOAA, and I remember a problem that was solved by talking to an admin at NOAA (our IP needed to be on some official whitelist or something), but that may have been for a restricted data set. We didn't end up using it for long because the client decided against it.

But I dug around for some information to maybe get you started.

Installation: https://madis.ncep.noaa.gov/doc/INSTALL.unix

API: https://madis.ncep.noaa.gov/madis_api.shtml

Data restrictions: https://madis.ncep.noaa.gov/madis_restrictions.shtml

Another resource that may help: https://press3.mcs.anl.gov/forest/regional-models/global-dat...

When I was working on this stuff, I found that a DFS on various government subdomains (like MADIS) was the best way to find information. It was tedious, but it worked.

It's also helpful to put on your Fortran hat. For example, I once attended a Haskell meetup where someone had written a parser for binary files from NOAA. I was also in a meeting (with some NOAA folks) once where I was asked if I "would prefer an ASCII file, or a binary one". This is not a world that operates on JSON or XML. Expect binary blobs with flags (bits) that change the meaning of other flags in fun and exotic ways. The binary nature of the data can help with data throughput limits, but boy is it a pain to deal with.
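To make the "flags that change the meaning of other flags" point concrete, here is a minimal sketch in Python. The record layout is invented for illustration (it is not a real MADIS or NOAA format): a big-endian header word whose low bits control how the payload is interpreted.

```python
import struct

# Hypothetical record layout (invented, for illustration only):
#   4-byte big-endian header word, then a payload.
#   bit 0 of the header: payload is floating point (else a signed int)
#   bit 1: only meaningful when bit 0 is set -- double (else single)
def parse_record(buf: bytes):
    (header,) = struct.unpack_from(">I", buf, 0)
    is_float = header & 0x1
    is_double = header & 0x2   # ignored unless is_float is set
    if is_float and is_double:
        (value,) = struct.unpack_from(">d", buf, 4)
    elif is_float:
        (value,) = struct.unpack_from(">f", buf, 4)
    else:
        (value,) = struct.unpack_from(">i", buf, 4)
    return value

# A record carrying a 32-bit integer payload:
record = struct.pack(">Ii", 0x0, -40)
print(parse_record(record))   # -40
```

Real formats nest this several levels deep, but the shape of the code - unpack a word, branch on bits, reinterpret what follows - is the same.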


> This is not a world that operates on JSON or XML. Expect binary blobs with flags (bits) that change the meaning of other flags in fun and exotic ways.

That brings back memories... As a government contractor, I've had to work with sensor data (seismic, radar, etc.) in various formats that were developed well before the rise of XML and JSON :(

My favorite was a mixed ASCII and binary format, where each data record in a file had an ASCII header that described the format of the following block of binary, and pretty much anything could be different between records, even within the same data file (Time units? Integers? Floats? 16 bit integers? 64 bit? Big/Little Endian?).

I had to write a parser for that :'(
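A parser for that kind of file ends up looking something like the sketch below. The header syntax and field names are invented for illustration: each record starts with a newline-terminated ASCII header declaring the count, element type, and byte order of the binary block that follows, and any of those can differ between records.

```python
import struct

# Hypothetical mixed ASCII/binary file: each record is an ASCII header
# line such as "COUNT=2 TYPE=h ENDIAN=>" followed by COUNT values of the
# stated struct type and byte order (struct format characters).
def parse_records(data: bytes):
    records = []
    pos = 0
    while pos < len(data):
        end = data.index(b"\n", pos)
        header = dict(kv.split("=") for kv in
                      data[pos:end].decode("ascii").split())
        pos = end + 1
        fmt = header["ENDIAN"] + header["TYPE"] * int(header["COUNT"])
        size = struct.calcsize(fmt)
        records.append(struct.unpack(fmt, data[pos:pos + size]))
        pos += size
    return records

# Two records with different element types and byte orders in one file:
blob = (b"COUNT=2 TYPE=h ENDIAN=>\n" + struct.pack(">hh", 1, -2)
        + b"COUNT=1 TYPE=d ENDIAN=<\n" + struct.pack("<d", 3.5))
print(parse_records(blob))    # [(1, -2), (3.5,)]
```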


The most "fun" I've ever had was decoding command and telemetry from piece of equipment for a ground station. The box would spit out this massive frame of data. It was a very long ASCII string that you would turn into binary to break into 6bit BCD values (no clue why they didn't use 4bit...). There were random flags of odd bit lengths (sometimes just a single bit, sometimes 5bits) thrown in between numbers for arbitrary reasons rather than just having all the binary flags up front. My python script was this ugly mess of slicing up the frame to turn it all into a very nice struct I could pass to the rest of the system. The manual with this piece of hardware was some old scan that must have been xeroxed a million times over so some portions of the document were just unreadable and you had to guess what those bits did. Other parts of the frame were just undocumented. Commands were send one by one as single letter with the actual ASCII representation of the numerical command parameter.

When I started the project, I looked online to see if anyone had done any previous work on this thing. A vendor was selling a GUI for it for $2000; I scoffed at the price and started working on it myself. By the time I was done, it had probably cost my employer more than that, but at least we had our own code that could connect to whatever you wanted, rather than a GUI with no API.


Did you try to sell it too?


It was an internal project for a large company so no.


Rather than trying to set up MADIS, look into something like the siphon [0] Python module.

MADIS is a bit dated; data dissemination that uses OPeNDAP [1] or ERDDAP [2] is much friendlier.

[0] https://github.com/Unidata/siphon

[1] https://opendap.github.io/documentation/

[2] https://coastwatch.pfeg.noaa.gov/erddap/information.html
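Part of what makes ERDDAP friendlier is that a "tabledap" request is just an HTTP URL: dataset ID, file type, variables, and constraints. A small sketch of building one (the dataset ID and variable names below are placeholders, not a real dataset):

```python
from urllib.parse import quote

# Build an ERDDAP tabledap request URL of the form
#   {base}/tabledap/{datasetID}.{fileType}?{vars}&{constraint}&...
# Dataset ID and variables here are hypothetical placeholders.
def erddap_url(base, dataset_id, variables, constraints=(), filetype="csv"):
    query = ",".join(variables) + "".join("&" + c for c in constraints)
    return f"{base}/tabledap/{dataset_id}.{filetype}?" + quote(query, safe=",&><=")

url = erddap_url(
    "https://coastwatch.pfeg.noaa.gov/erddap",
    "exampleDatasetID",                        # placeholder dataset ID
    ["time", "latitude", "longitude", "sst"],
    ["time>=2024-01-01T00:00:00Z"],
)
print(url)
```

The response comes back as plain CSV (or NetCDF, JSON, etc., depending on the file type you ask for), so any HTTP client can consume it - no binary blob decoding required.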


How long ago was this? Surprising that AWS would shut down your node. Were you trying to run this on the free tier or something?


They probably thought he was mining cryptos



