Showing posts with label rant. Show all posts

Thursday, March 14, 2013

A Note About Making Baked Mac & Cheese with Pre-Shredded Cheese

Just a brief note to anyone making baked macaroni and cheese: if you're wondering whether you can use pre-shredded cheese in the cheesy sauce, don't do it! Prepackaged shredded cheeses have anti-clumping agents like potato starch added. These extra ingredients are usually disclosed on the label, but it's safe to assume that all pre-shredded (not block) cheese contains them. Their purpose is to keep the shreds from sticking together so the cheese pours freely out of the bag. As a side effect, they give baked macaroni and cheese sauce a terrible texture.

The potato starch will make your cheese sauce gritty. If you prefer to sprinkle cheese over the top, it will also affect how that cheese melts and browns in the oven during baking. While you will still be able to eat the resulting dish, be prepared for a gritty (potato-like) feel in your mouth with every bite. It completely ruins the creamy, smooth, cheesy goodness that baked macaroni and cheese should be. I discovered this the hard way. Lesson learned. Do yourself a favor and get a nice block of sharp cheddar. No prepackaged cheese.

Monday, August 27, 2012

Are you sick of the Kohl's "sale" pricing strategy?

Isn't it remarkable how Kohl's is always having a sale? My local Kohl's store, for example, is always mailing out flyers advertising 50% off the entire stock of this or that. I'm not knocking it - there's clearly some value in the strategy, since Kohl's has stuck with it for so long - but let's consider how the deals might not be as good as they appear. Kohl's sale pricing creates a remarkable sense of brand loyalty and the feeling of getting a good deal by exploiting the whole concept of a sale, along with some careful emotional marketing.

It's no secret that shoppers (specifically women shoppers) love sales. They get the feeling that their dollar goes farther (Walmart does a great job of marketing this to customers), and the sensation of getting a good deal is exhilarating and addictive. Plus, people don't mind buying things when they're on sale. They automatically conclude that the item they're buying is less expensive right now than it was previously or will be in the future. That's the whole idea behind a sale in the first place - some kind of tangible savings. For example: last week this product was $4.00, now it's $3.00, so I should buy it.

Kohl's has done an amazing job of capitalizing on our automatic sale logic. Here's how I suspect it works: they price the majority of their items at an artificially high "original" price. Then they put the item on sale for a massive discount, say 40% off. On top of that, customers who use the store card (a powerful loyalty device in and of itself, not to mention a pleasant profit booster) get an additional coupon via direct mail for another 20% off. At checkout, the cashier announces how much money you saved, circles it on the receipt, and sends you on your way. It's a strategy designed to exploit our understanding of "sales." They tell you an item is on sale. They tell you you saved more than you spent. And by the time you leave, even you are convinced that you have just saved more than you spent - never mind that the same item is $20.00 cheaper at Amazon, or that a comparable-quality pillow is hitting a lower price point at Target. From at least one viewpoint, Kohl's has just told you a little bit of a lie.
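To see how the receipt can honestly claim you "saved more than you spent," here's a quick sketch of the stacked-discount arithmetic. The dollar figures are made up for illustration - they're not actual Kohl's prices - and the key detail is that the coupon applies to the sale price, not the original:

```python
# Hypothetical numbers to illustrate the stacked-discount math described above.
original_price = 100.00   # artificially high "original" price (our assumption)
sale_discount = 0.40      # 40% off sale
coupon_discount = 0.20    # additional 20% cardholder coupon

# Discounts stack multiplicatively: the coupon is taken off the sale price.
sale_price = original_price * (1 - sale_discount)    # $60.00
final_price = sale_price * (1 - coupon_discount)     # $48.00
claimed_savings = original_price - final_price       # $52.00

print(f"You paid ${final_price:.2f} and 'saved' ${claimed_savings:.2f}")
```

So on a $100 "original" price, you pay $48 and the receipt says you saved $52 - more than you spent - which is perfectly true relative to an anchor price that may never have been a real selling price.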

I'm sure you're wondering how this white lie of "you just saved x amount of dollars" is possible under today's advertising regulations. The answer is in the fine print, which appears on every Kohl's mailing piece and on their website, and which discloses what is going on in very clear language - here's a direct quote from kohls.com:
“Sale” prices and percentage savings offered by Kohl’s are discounts from Kohl’s “Regular” or “Original” prices. The “Regular” or “Original” price of an item is the former or future offered price for the item or a comparable item by Kohl’s or another retailer. Actual sales may not have been made at the “Regular” or “Original” prices, and intermediate markdowns may have been taken. “Original” prices may not have been in effect during the past 90 days or in all trade areas.

What do you think of the Kohl's pricing strategy? Would you agree that the whole thing is a little deceptive? Or do you like the feeling, and enjoy aggressively shopping the multitude of sales to get what you agree is a great price? I would love to hear from you in the comments!


Wednesday, September 14, 2011

Why Consumer Reports Sucks

When was the last time you picked up a Consumer Reports magazine? I don't know anyone who still subscribes to this popular "reviewed by consumers, for consumers" magazine except my local library and maybe a doctor's office or two. There are plenty of conspiracy theories out there claiming that Consumer Reports isn't the gold standard of unbiased reviews it boldly claims to be - industry payouts, manufacturer kickbacks, pharmaceutical company sponsorship - but when I recently glanced through a copy, I was dismayed with the publication for completely different reasons, and felt compelled to review the reviewers at Consumer Reports. Here's what I noticed.

No Quantitative Data!
I saw plenty of comparisons that caught my eye in the issue I was looking at, but what struck me was that none of them mentioned any numbers. For example, the June 2011 issue has a half-page article about electric razors. Thirteen different razors were tested; of those, six were recommended and one was a Best Buy. That's all lovely, except the table compares things like "Noise," "Battery Life," and "Features" without ever explaining exactly how these variables were measured. I'm sure Consumer Reports has their readers' best interests in mind, but to make a truly informed decision about which products are best, wouldn't it be nice to have a few extra details?

For example, how did they test the battery life? Did they just turn each electric razor on and run it down until it quit? Did each tester use their assigned razor for a few minutes each day to shave normally, introducing variables like beard toughness and shaving technique? Or did they simulate actual shaving in a lab setting, cycling each razor on and off for ten minutes at a time over the course of several days to see how well the battery held its charge? Maybe I'm splitting hairs here, but I'd like to see some concrete statistics from Consumer Reports, like "Razor B1 had a tested battery life of 3 hours in our rundown test, while razor B2 lasted 2 hours and 45 minutes, so we rated B1 excellent and B2 very good." That way the average Joe consumer could look at the data and make a better-informed buying decision, especially if there's a significant price difference. I don't know about you, but I can live with a few minutes less shaving time for a few dollars less. Yet Consumer Reports insists on hiding the reality behind their results with those annoying circle icons. Are they trying to appease the manufacturers by never publishing actual performance numbers - numbers that would show just how far ahead of the pack a product is, or how little variation there actually is among products?

Another example that concerned me was an in-depth article covering washing machine testing. The lab rated things like "gentleness," "noise," and "vibration," which all seem to be good things to consider when deciding which washing machine to purchase. But if you want to know exactly how Consumer Reports tested each machine for gentleness, you'll have to look elsewhere, because there's no explanation in the article. And it's not just the more subjective measurements that are missing - a standard like noise could easily be measured with a decibel meter, and some real numbers printed in the article to show consumers what the tested differences were. Of course your mileage may vary, since I doubt anyone is washing a test-sized load at home and noise levels would differ depending on what you were washing - but why not do a little more science and a little less touting the ratings as gospel?

Re-Using Data in "Summary" Articles
I'm sure Consumer Reports isn't the only magazine that does this, but I noticed that the June 2011 issue has an in-depth (well, okay, as in-depth as you can get without actually mentioning any data) report on gas grills. They test and rate something like 65 gas grills in a three-page article with their pleasantly mysterious circles. Then in September 2011 there's a whole page devoted to "Great Grills" that shows much of the same information. We see the top performers from the first grilling test with their results re-printed for readers to digest again, along with strikingly similar headlines: "Think safety" becomes "Give it a safety check," and "Focus on features" turns into "Choose features you'll use." And judging by last year's "Home and Yard Products roundup" edition, readers can plan on seeing the same tests again in the future.

I'm not sure I would be comfortable paying for a magazine that claims to be the ultimate buying guide for consumers but really just feeds you the same content over and over again, even if it's only 3% recycled material. Just because their reader base is aging does not mean that they are also forgetting what they read three months ago! Maybe it's time to do a little comparison of buying guides and give ratings magazines and websites some pretty circles of their own. And doesn't it scare you just a tiny bit when a magazine puts a little burst on their next-to-last page saying "Please remember CU in your will"? (This is at the bottom of a page-long list of names of individuals involved in their organization! Why not run another review there instead of all those names!?)

All Reliability Data Comes From Readers
I'm sure Consumer Reports readers are nice people. But judging from the heavily targeted advertising in their magazines - life insurance, medical supplements, Medicare plans, and promotions reminding readers that the Consumer Reports website is available 24/7 (just like the rest of the internet!) - I get the impression that their readers fall into a specific demographic that could cast a shadow on some of their reliability reports. I'm talking about the mature demographic, over age 50 or so. A quick glance at Alexa.com's audience report confirms that most visitors to consumerreports.com are over the age of 50.

The point I'm getting at is that when you ask your average 55-year-old how their dishwasher has been working, for example, to try and determine the most reliable brands, you're going to get an answer that's skewed by their age group's habits. I can hear them saying things like, "Well, we only run it twice a week." So how useful is that data to your average family of four? Two loads a week is not two loads a day. As another example, think about where this generation of readers is driving their cars: to work, to the store, back home, and to church on weekends. They're not driving to work, to school, to soccer practice to pick up Johnny, then to the music clinic to pick up Jenny, then to Walmart, then to the grocery store, then home, then to the park and back. I'm not saying the data is faulty - I'm sure Consumer Reports would say that the real-world mileage of their reliability reports will vary immensely - but wouldn't you like to know a few more details?

In keeping with the "just take our word for it" theme of the entire publication, Consumer Reports doesn't bother to tell you how many readers they surveyed to get their data, or what the margin of error on that survey is, so who's to say they aren't just calling a few extra readers until they get the results they're looking (or getting paid) for?
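Why does the sample size matter so much? Here's a quick sketch using the standard worst-case margin-of-error formula for a simple random sample at 95% confidence. The sample sizes below are hypothetical - Consumer Reports doesn't publish theirs, which is exactly the complaint:

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Worst-case (p=0.5) margin of error at 95% confidence, in percentage points."""
    return z * math.sqrt(p * (1 - p) / n) * 100

# Hypothetical sample sizes -- not actual Consumer Reports survey data.
for n in (100, 1000, 10000):
    print(f"n={n:>5}: +/- {margin_of_error(n):.1f} points")
```

With 100 respondents, a reported reliability percentage could be off by nearly 10 points either way; with 10,000 it shrinks to about 1 point. Without knowing which end of that range a survey sits on, a reader can't tell a meaningful brand difference from statistical noise.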

The Bottom Line
Maybe I'm just spoiled by this generation's fascination with user reviews and the full range of review websites out there that take great pride in exposing all the gory technical details of the products they work so hard to review accurately and completely. But the classic Excellent, Very Good, Good, Fair, and Poor circles of Consumer Reports are starting to feel outdated without any real substance to back them up. Their attitude of "trust us, we're Consumer Reports" doesn't inspire confidence, especially when they're feeding readers the same data multiple times and mining an increasingly narrow reader base for reliability info that may not reflect real-world results. What's your take on Consumer Reports? As always, I'm willing to consider any viewpoint in your comments, provided you can back it up.