I ran the numbers for myself on JBM, and sure enough it comes up as 12.7". This is fairly close to the SWAG I use to calculate drift in the field: wind speed (mph) times range in hundreds of yards, divided by a constant of 15 (good out to 500 yards; the constant reduces for each 100 yards beyond 500, out to 1000) = drift in MOA. This gives me 6 MOA, or about 12".

Something to consider is that the algorithm fails to take extraneous elements into account. Anyone who has watched a wind flag or a wind sock for any length of time knows that gusts are significantly stronger than the steady-state wind, and that the wind's direction varies within an arc of some degrees. When the wind is blowing hard, there is only one viable shooting position, and that is prone; otherwise you can't hold on target. But if you shoot from prone, then unless you're on an elevated platform or the crest of a hill, the bullet's path travels close to the ground, and wind along the ground tends to be slower than the wind a few feet above it. If you want a good head scratcher, watch snow blow along the ground in a different direction from the one the flag indicates.

JBM doesn't take a bullet's rotational velocity into account, so the assumption is that drift is equal whether the wind is from 90 degrees or 270. Both terrain and wind direction can create vertical dispersion in the expected trajectory, and practical marksmanship becomes a real challenge when you're being buffeted by a strong wind. So drift in a 45 mph wind is whatever you observe it to be. Predicting drift when wind speeds exceed 20 mph is exceedingly difficult. I would be hesitant to shoot at a live target that wasn't at 0 or 180 degrees to such a wind, and even then, it would have to be from prone or I'd pass up the shot.
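For anyone who wants to play with the field formula, here is a minimal sketch of it in Python. The post doesn't spell out exactly how the constant "reduces for each 100 yards beyond 500," so the schedule below (drop the constant by one per 100 yards, 15 at 500 down to 10 at 1000) is an assumption, and `swag_drift_moa` is just a name I made up:

```python
def swag_drift_moa(wind_mph, range_yd):
    """SWAG wind drift: wind (mph) x range (hundreds of yd) / constant."""
    if range_yd <= 500:
        constant = 15
    else:
        # Assumed schedule: constant drops by 1 per 100 yd past 500,
        # i.e. 14 at 600, 13 at 700, ... 10 at 1000.
        constant = 15 - (range_yd - 500) // 100
    return wind_mph * (range_yd / 100) / constant

# The example from the post: a 45 mph full-value wind at 200 yards.
moa = swag_drift_moa(45, 200)        # 6.0 MOA
inches = moa * 1.047 * (200 / 100)   # 1 MOA subtends ~1.047" per 100 yd
# ~12.6", in the same ballpark as the 12.7" JBM reports.
```

Reading the numbers back, 6 MOA at 200 yards works out to roughly 12.6 inches, which matches the "6 MOA or 12"" in the post.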