Sramana Mitra: Could you take us through a couple of use cases of how your customers use your product?
John Plavan: The easiest one to explain is last winter, because it was such an extremely warm one. I think it was the warmest winter on record in the U.S.* Going into the winter, around October, the general consensus among meteorologists was that we were going to have another relatively cold, if not extremely cold, winter, as we saw in the winter of 2010-2011. The price of natural gas in October was around $4.40 per MMBtu. That price was relatively high given the supply and demand situation for gas at the time. But it was what you would expect if you were going to have an extremely cold winter: since there was going to be high demand for natural gas, the price should be relatively high.
Our product, however, was showing that the patterns necessary for the atmosphere to deliver very cold weather just weren’t materializing. These relationships aren’t well known, or at least aren’t well applied, without our software. We have done the research that quantifies those relationships out to 30 or 40 days. The average meteorologist running traditional weather models is not aware of this information. He or she has to wait until the ten-day range to have a forecast accurate enough to show that it would probably be cold, but by then [the forecast changed and it] looked like it was going to be warmer than the meteorologist had expected.
Our Temprisk subscribers saw that the conditions necessary to produce cold weather were not materializing, and that there were even higher risks of anomalous warm events. Our subscribers said that this market was mispriced. A fair price for natural gas given a warmer-than-expected weather pattern is not $4.40 per MMBtu; it is less. So, they were able to make trades that could be monetized once the nearer-term weather forecasts came in warmer than expected and the price of natural gas dropped. If you think back to last winter, weather forecasts kept saying, “OK, it has been warmer than we expected, but now we are finally seeing some forecasts that have cold weather in them.” But those forecasts just never materialized. Our clients stayed ahead of those forecasts, stayed ahead of that market, and made extremely profitable short trades on natural gas.
That is probably the strongest use case we can give, because it was such an extreme period of warm weather. Most of the time within a winter or a summer you have volatility, but last winter we just saw continuous, anomalous warmth, some of the warmest temperatures on record. Where our product became really valuable was that most of the forecasters in the longer lead range, in seasonal forecasting, didn’t call for that. They called for cold. We created a great opportunity for our clients to identify that mispriced market.
We have numerous other examples. Another big one that our clients really like is as follows: In February 2011 there was an extreme cold snap in the south central part of the U.S. People remember it because it threatened to shut down the Super Bowl in Dallas, Texas, with ice storms. But that cold snap was not forecast with traditional forecasting methods at all. Our Temprisk product picked up signals well ahead of that event, in the 30-day range, and we gave our clients a heads-up that the gas and power markets had a high risk of spiking in that period; the rest of the market, which wasn’t using our product, wasn’t aware of this. Both of these events had an extremely large impact on the economics of the energy markets. Having insight into these events is valuable for those who play in energy markets.
SM: Given what you see and the universe you are dealing with, could you point out to our audience some problems that big data technologies and principles could be applied to address? You may not be working on those problems, but entrepreneurs might find them interesting to look into.
JP: The methods we have pioneered in the extreme temperature realm hold a lot of promise for any kind of extreme weather event at these longer lead times: subseasonal, but still longer than the lead times of current forecast methods. Think about hurricane tracks and hurricane intensity, tropical storms, and the increasing interest in alternative energies such as wind and solar. Large-scale solar events, or at least events such as a lack of cloud cover, and large-scale wind events all have a dramatic impact on the output of those alternative energies.
We think there is still a lot of opportunity in research to employ some of the methods we use for temperatures to try to predict large-scale winds, extreme storms, floods, and drought. I think there is a huge human health and public welfare benefit in being able to predict an increased likelihood of flood or drought scenarios, not only in North America but also in South-Central Asia, for example. There is a huge loss of human life when there are unexpected floods. There is a lot of room out there for continued work to observe the globe as we see it and use those huge observational data sets to predict eventual major weather events.
We also think there is a lot of work to be done in using the huge data sets gathered by social networks and other types of sensors and machines, and then asking, “What might this predict?” We think the weather has as big an impact as, if not more than, any of those other types of predictors. The focus just hasn’t been as sharp on the weather data set because the millions of dollars governments spend on weather forecasting are being spent to improve the current methods of simulating the atmosphere, using huge supercomputers to simulate it better, whereas we think the predictive analytics angle is underused, and we hope to capitalize on that.
[*Editor's note: According to the NOAA, the winter of 2011-2012 was the fourth warmest in 117 years, and 2012 was the warmest year on record in the contiguous United States.]