Maanschijn Spin Cycle Verdelho 2017

3 April 2019


It will all come out in the wash.

Maanschijn is the label belonging to Paul Hoogwerf and Doug Mylrea, who make wine out of a barn between Stanford and Hermanus.

Spin Cycle Verdelho 2017 is from Voor-Paardeberg grapes; the winemaking involved a press partially constructed from old washing-machine parts, hence the name. Some 30% was fermented on the skins for two weeks and the wine is dull gold in colour. The nose is overtly oxidative with notes of cut apple, orange and spice, while the palate is rich and full (despite a low alcohol of 12%) with tangy acidity and a pithy finish – not exactly everyday drinking but very good in its style. Wine Cellar price: R245 a bottle.

Editor’s rating: 91/100.


Comments

12 comments


    Matt A | 4 April 2019

    Ignoring arguments re the 100-point system itself, scores given on this site do hold value for me. I am not so fussed about the variance between CE and anyone else; once you become familiar with how a particular critic scores and understand what they tend to reward, you have all the reference points you need. If you need to calibrate with your own taste, buy and taste some of the wines. Assuming consistency, scores also add an important dimension in comparing wines across vintages; a one-point difference on a wine can signal a lot in terms of what a critic thinks.

    BTW I think the spread of scores quoted above is actually quite impressive, but a top-ten sellers tasting could indeed be interesting…

    Jono Le Feuvre | 4 April 2019

    Tim James published a few articles on “supermarket wines” recently (on this site, I believe), so perhaps those will be more relevant to you, Colin. I’m not sure what price point you’re into; there is such a broad range.
    Perhaps a good idea for an article would be to score the top ten biggest-selling wines in South Africa (yes, Four Cousins included). It won’t be fun for the reviewer, but I guess it would nevertheless be interesting to engage with a list of wines that are encountered by the highest number of SA drinkers. (Perhaps not quite on brief for this blog, though.) Heck, I might go ahead and do that. Journalism in the public interest, right?

    Greg Sherwood MW | 3 April 2019

    Some good points.
    The most important is that the emphasis is on scoring interesting wines, the ones you should seek out and buy. I too am not in the game of spending long blog posts reviewing a wine that scores 83 or 84. Luckily I was clever enough to call my blog The FINE Wine Safari… not the Wine Safari. Mediocrity exists, but I too choose to use my time to write about wines worth tasting/buying, not those not worth tasting. This fact alone will skew score averages etc. It should be pointed out that the great writing talent that is Jancis Robinson also comes in for a lot of stick for scoring everything 16-16.5/20; I often have no idea whether she actually liked the wine. That is partly the fault of the 20-point scoring system. I’m perfectly happy using the 100-point scale and feel that, used properly and honestly, score inflation shouldn’t be a real issue.

      Francois | 8 April 2019

      Greg, out of interest, do you generally “like” the wines you score highly? Surely complete objectivity about the quality of a wine and the scoring thereof should be independent of whether the reviewer actually likes it or not? Although I do appreciate that scoring remains a bit of an art and the scorer will tend to favour wines they like a little more.

        Christian Eedes | 9 April 2019

        Hi Francois, Not wishing to speak for Greg, I increasingly think that individual commentators should not try to be “objective” but instead take a stand one way or the other. The overall quality of “fine wine” is now so high that being too impartial doesn’t really help anybody; leave that to panel tastings, where the effect of the median can play its helpful or not so helpful role. Great wines very often sit at stylistic extremes, and it often needs an individual taster (a “panel of one”) to draw attention to them.

    Colin Harris | 3 April 2019

    Hi Christian
    Let’s talk about scores. Particularly yours. I am not entirely sure why you still bother scoring wines.

    I did a quick count. Of the last 64 wines you reviewed, 6 were between 80 and 88; 8 were 89 points; 12 were 90 points; 13 were 91 points; 8 were 92 points; 6 were 93 points; 6 were 94 points and 5 were 95 points.

    In other words, you scored 51 wines out of 64 between 88 and 93 points. That’s 80% of everything you’ve reviewed. Or let’s say that 93 is still a score that perhaps says something – you’ve done 45 wines between 88 and 92 points, or 70%.

    Those are great fence-sitting scores – the type of score that tells me you don’t really have an opinion about the wine. By scoring just about everything you review in that band, you’re telling us average Joes absolutely nothing about the quality in the bottle. So why bother scoring them if they’re all scoring the same? Perhaps stick to just telling us about the wine and forget the scores?

      Neil | 3 April 2019

      The difference in quality between a score of 88 and one of 95 is vast. A score of 90 is excellent; 95 is world class. For me there is value in referencing scores, even if it is a subjective opinion/art. If I’d bought 64 wines “blind” and 50 of them scored 90 to 95, I’d be thrilled (depending on price, of course!).

        Duncan | 3 April 2019

        The problem seems to be that the 100 point scale makes very little sense to the uninitiated. Everyone understands the difference between, say, 3 and 5 stars. By contrast, 85 out of 100 sounds pretty darn good. Perhaps there ought to be a more prominent explanation of the rating system.

          Christian Eedes | 3 April 2019

          The matter of scores rears its ugly head again. My feedback in reply to some of the points raised above:

          1. Regarding my scores over time, I would suggest that they sit pretty much on a bell curve (by happy accident rather than design) and are hence quite defensible. There are wines below 88 that I encounter week to week, but I’ve tended not to capture these on the basis that I’m attempting to highlight what’s worth seeking out rather than what should be avoided. Oh, that there were more wines above 93…

          2. Regarding the accusation of “fence-sitting”, and expanding on my point above, I would argue that the overall standard of viticulture and winemaking globally, but especially in South Africa, has risen to such an extent in recent times that a sub-84 score even for the most basic entry-level wine would point to utter incompetence on the part of the winemaking team, and such wines are quickly ejected from the market in any event. Conversely, the closer a producer gets to a 100-point score and the “perfection” this implies, the harder quality gains are to come by, so very high scores rapidly tail off.

          The 100-point system was supposed to give us all more points to play with but has ironically led either to “bunching” (all manner of wines getting scores within a relatively narrow band) or to “score inflation” (wines being vastly over-rated as individual tasters seek to attract attention to themselves).

          3. Tasting notes need to be read in conjunction with the score, and the Maanschijn Spin Cycle Verdelho 2017 is a case in point – if you don’t have some affinity for/experience of “orange wines”, then it’s unlikely that you’re going to like this regardless of my score of 91/100.

          4. A rough guide to how I think about the different rating systems:
          18/20 = 94 points = gold = 5 Stars
          17/20 = 90-93 points = silver = 4.5 Stars
          16.5/20 = 88-89 points = bronze = strong 4 Stars
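          As a minimal sketch (assuming the bands above, with 94 points and up treated as the top band and anything below 88 left unmapped), the same rough guide expressed in Python:

              def rough_guide(score):
                  # Approximate mapping from a 100-point score to the 20-point,
                  # medal and star equivalents in the rough guide above.
                  if score >= 94:
                      return ("18/20", "gold", "5 Stars")
                  if 90 <= score <= 93:
                      return ("17/20", "silver", "4.5 Stars")
                  if 88 <= score <= 89:
                      return ("16.5/20", "bronze", "strong 4 Stars")
                  return None  # below 88: outside the guide

              print(rough_guide(91))  # e.g. the Spin Cycle Verdelho 2017 -> ('17/20', 'silver', '4.5 Stars')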

        Colin Harris | 3 April 2019

        Neil – I agree with you. But how on earth can you form an informed opinion if 70% of the wines all score the same? Price? Region? Variety? I have a limited budget and can’t afford to go out and buy 40-plus different wines to see if I might get lucky with one of them. So what do I do? I buy what I know and I don’t explore new wines, because Winemag’s scores aren’t a reference point – unless you go to the top end, where it does appear Christian often correlates price with quality. And bear in mind that the stats I used above were just a sample of the last 60-odd wines he’s done. If you have the inclination, go and check the rest of his scores.

      Jonathan Snashall | 4 April 2019

      Colin, there is no perfect palate and no perfect wine scoring or competition; critical appraisal on any one day is swayed by many things, so take scores as indicative rather than absolute. I would also argue that the 100-point system is effectively a 20-point system.

      Narkath | 4 April 2019

      Christian, the problem with this scoring system is that you provide your readers no baseline with which to set value in terms of palate. I always scroll to the worst review on most things, because this has become a common theme with movies, games, books, you name it: 70% is the new average, i.e. if everything is amazing, nothing is. If that’s the case, the scale should be adjusted so that 50% means pedestrian – drinkable but not worthy of remark.

      I would suggest reviewing 50-percenters and garbage every now and then so that there is something to hold better-tasting wines against. Personally, I think a good reviewer has trashed a goodly number of items and will continue to do so for credibility’s sake.
