For the West, I was surprised to see Sun Valley at #2 and Big Sky at #13. I was equally surprised to see Aspen Mt, Jackson, and Steamboat that "low".
As for the East, I was surprised to see Smuggs at the top and Mt Snow at #3, and also to see Mad River at #6 and Loon at #10. I found it interesting that Stowe is down at #8 and Whiteface down at #13.
IMO it reinforces the joke these Ski Magazine rankings have become (perhaps they were always a joke). Now that multi-mountain passes are making some headway, it seems like the independents doled out the $$ to Ski Magazine. What does it tell you when the #1 resort in the East doesn't have a single high-speed chairlift and offers just about no après-ski options, not to mention that it feels like they haven't invested a single $ in the place in three decades? Let me know when Smuggs hosts a Winter Olympics, or two.
So, @TonyC , you're saying that the "overall" results are not the aggregation of the Overall Satisfaction questions?

No, they are not; they are an equal weighting of the individual categories, and this has been a known fact for well over a decade on these threads. Thus my editorial lead-in.
With at least half the categories being non-skiing-related, the SKI Magazine rankings will always inspire a knee-jerk reaction of derision among the avid ski community. When I was working on that project two years ago, I rounded up several old SKI Magazine surveys and used them as a guideline for some of the non-ski-related categories. However, I'm in philosophical agreement with Chris Steiner of ZRankings: if we can construct an objective measure for a category, we should try to do that.
The as-yet-undeveloped project had the right idea for overall rankings: let the end user choose the importance weightings.
So, what are they doing with the Overall Satisfaction ratings? (Maybe it's published in the magazine in fine print?) And you say they are giving all the categories equal weight, but they are asking how important each item is to you as well. What are they doing with that data? They've got two other methods for determining "best" overall: the Overall questions themselves, and use of the survey respondents' weightings. Why ask for the data if they don't use it?

I asked once why SKI Magazine doesn't use the respondents' weightings to determine "best overall" and never got an answer. I have had a fair amount of interaction with print media, and it's a VERY frustrating process. They do what they want to do, editing is done by people up the food chain from your contact person, and key points are routinely omitted or misstated. Pointed questions like yours are nearly always ignored.
Very happy to NOT see any of my regular stops on any of these lists. Nothing to see there... head on up to Wachusett!