Letter to the editor: Blind tasting reality check

25 July 2025

The following was received via email from Yair Safriel, a physician trained in neuroradiology. Now based between New York and Florida, he visits South Africa often. He has long had an interest in wine and makes his own under the Safriel House label.

A lot has been made both in the pages of Winemag (see here) and elsewhere about the relatively “low” scores attained by the wines on show at London’s “New Wave – Ten Years On” tasting in May.

A close look at the scores attained by these same wines in the US reveals an altogether different view. Most of the scores were spot-on or even overinflated. Before you make angry comments, let me tell you how the view from America may be different (as it often is).

The most influential wine publication in the US is still Wine Spectator. Why? Because it tastes everything blind. The “Ten Years On” tasting once again showed the power of blind judgment – echoing the closely aligned results you would see in Wine Spectator blind sessions. But here is the hard fact: no South African wine has ever scored above 95 in a blind tasting by a major US panel, and even that peak remains rare. James Suckling alluded to this in his report a few years ago when he noted that Hungary has more 95-point wines than South Africa.

That matters, because blind tasting levels the playing field. Unlike in sighted tastings, there is no winemaker prestige, brand story, or future access influencing the score. Sighted tastings often push top South African wines into the 95–98 bracket, and that reflects broader context and narrative – in other words, 94 points for the wine and 3 points for the story. On the flip side, sighted tastings are inherently conflicted. Consumers expect to read reviews of the best wines, and keeping access to these wines hinges on good reviews, making it hard to “bite the hand that feeds you.”

Take Samantha O’Keefe’s Lismore as a case in point: Platter’s lost access to those wines after giving one vintage a solid but unspectacular 3.5 stars some years ago. Lismore simply did not submit the wines again.

So here is the rub: when a wine earns a 92 in a blind tasting, it is not a failure – it is a signal that, under strict conditions, it meets world-class standards but is not yet beating the global heavyweights. Meanwhile, the 96+ from Atkin or Winemag? Those scores reflect more than bottle quality – they capture story, style, and signal.

Bottom line for South African producers: in high school, one of my teachers told me, “There is no comment line next to the score”. The problem for South African producers in the US is that few consumers know the story. The sole judge is the number on the shelf talker, often from Wine Spectator. Sighted tastings give you applause, narrative, and nuance. The Cape needs both – but if we want respect without context, blind judges must be heard, and let’s not pretend otherwise.

Comments

15 comment(s)

  • GillesP | 25 July 2025

Kudos to Yair Safriel. Well put.

  • Jos | 25 July 2025

While I don’t disagree with your overall point on blind tasting, using Wine Spectator to substantiate the low-score claim is a bit dubious, as they do not rate many SA wines in general. As an example, for the 2021 vintage they reviewed only 148 wines, so the sample size is a bit on the small side.

  • Kwispedoor | 25 July 2025

The way I look at it, adding 3-4 points for story, style, signal, vibe, or whatever is quite simply a scoring error. That’s like scoring the wine lower if you don’t like the winemaker, the weight of the bottle, the story, the cultivar or the price. Tasting notes are where one may mention some stuff that falls outside of the wine’s intrinsics, if needed. But the scoring of a wine specifically should always be about quality and taste alone. I’ll grant that this is a difficult thing to do (especially under various circumstances), but at the very least it should always be the goal, and I don’t think this should be negotiable for serious scorers either. The closer your sighted score and your blind score are to each other, the better you are at scoring.

    • Wessel Strydom | 27 July 2025

Kwispedoor, your response is noted and I, for one, agree 100% with your assessment.

    • Jos | 27 July 2025

I don’t agree with this. Sure, vibes or cultivar should not affect the score, but the weight of the bottle? That is directly linked to sustainability and thus to climate change. Needlessly using heavy bottles that negatively affect the environment matters to some consumers. Same with labour practices: some people care if you treat your workers poorly, and it would 100% affect their opinion of your product.

Wine is, after all, a product, and a review of said product should cover the full product – that includes the way it was made and the packaging it comes in.

Now, if a reviewer chooses to ignore those aspects and focus purely on what’s inside the bottle, then so be it. But I would certainly not call it a scoring error if they do include relevant aspects outside of the liquid in the bottle – provided they disclose it. There are, after all, consumers who care about those things as much as what’s in the bottle.

      • Wessel Strydom | 27 July 2025

Jos, do I understand this correctly? If the wine is packaged correctly, the workers are fairly remunerated and the bottle is eco-friendly, but the wine is only modest, then it will appeal to you?

        • Jos | 27 July 2025

Wessel, do I understand this correctly? If the wine tastes delicious but was made by a producer that underpays and abuses his workers, will it appeal to you?

          • Wessel Strydom | 27 July 2025

Jos, it might still appeal to me but I won’t purchase the product. But in the same breath, whatever bottle they use, the way the wine was made and the packaging it comes in (to use your words) will have absolutely zero effect on my purchasing decision. The only factor that comes into play is whether I like the wine or not.

            • Jos | 27 July 2025

And that’s fine; my point was that it’s also fine if someone else does consider those factors as material to whether they purchase the wine and expects them to form part of a review. We can have both…

      • Kwispedoor | 28 July 2025

        Hi Jos.

I take many factors into account when purchasing wine – as far as I know about all of them, of course; extensive info about every wine’s finer details is not always readily available. These factors can include a wine’s score (preferably my own, otherwise those of people whose palates align somewhat with mine), price, closure, sustainability, fair labour practices, age of vines, clonal info, terroir, production methods, track record, cultivar(s), analysis, etc. For instance, if I’m aware that a producer is truly treating its workers unfairly, I won’t buy its wine, no matter what the quality (and I won’t let other factors like the age of vines or track record influence that). But having any of these other factors influence a wine’s score simply makes no sense to me. It’s a bit like letting a wine’s price influence the way you look at their labour practices. These are all separate factors, all of which everyone is free to consider before making a purchasing decision.

Perhaps there’s merit in devising such multi-factored scores (a quality & sustainability score, a quality & price score, etc.), or even a “Complete Score” for wine that would include all of these factors, but I think we can all see how that would be fraught with major obstacles. In the absence of such a system, scores should remain all about quality, in my opinion. I didn’t suggest that reviewers disregard all the other factors. Sure, give as much info as possible, but make it part of the review, not the score. The score itself should remain untainted.

  • keith | 27 July 2025

I am sure some critics who review SA wines dish out high scores like confetti so that they can flog more of their labels!! Nothing like a 97pt TA etc. etc.
Rarely do I take any notice of such inflated scores, and only really appreciate ratings when they are blind and by critics I trust.

  • Tim James | 28 July 2025

    It would be interesting to probe those Wine Speculator – sorry, Spectator – scores a bit, and the methodology. It’s not a magazine I know these days, but are the wines referred to always tasted fully blind? That is, are they tasted blind as “South African” wines? Or just in the category of “chenins” or “syrahs” from around the world? I’m not saying I know the answer – I don’t, but I wonder. Prejudice can work in various ways. I’d guess, for example, that a tasting of wines from Hungary would turn up a lot of high scoring for sweet wines (perfectly legitimately, probably, but helped by the reputation of Tokaji).

    • Aaron Meeker | 28 July 2025

My understanding with WS is that they taste blind by country/category, so South African Chenins would all be tasted as a group rather than as part of the entire lineup for the day. They do open, rolling submissions (but the submissions must be approved first).

I’d say the blind tasting is a real thing, but so are the invoices for advertisements, online features, etc. There is a reason that the Top 100 each year contains many of the usual suspects (and while the quality of the wines is high, I’m not sure it is as high as the rankings contend).

    • Kwispedoor | 28 July 2025

      Absolutely. The moment you have a category or theme for a tasting, it’s not fully blind (sometimes even far from it). Imagine expecting fair scores from a group of British wine journalists for a “South African Pinotage” tasting two decades ago (or even now).

  • Donald Griffiths | 28 July 2025

    James Suckling never comes to South Africa.
