Edit: I have added a more concise review with photos; find it here: Aldi's Milk Review - Revisited (With Photos)
Aldi vs. The World in the Milk Arena?
What exactly makes different brands of milk different? Have you ever asked
yourself that question? For this review, I purchased a gallon of 2% Aldi
milk (sold under their Friendly Farms label) for $2.49 and compared it to a
gallon of 2% store-brand milk from another local supermarket that cost $3.99.
The same supermarket also carried gallons of 2% milk from nationally
recognized dairy brand Kemps for $4.09. I had been thinking I was saving a
few pennies by purchasing the store-brand milk, but after a trip to Aldi I
was concerned I was getting scammed at my local supermarket. If I could
justify switching to Aldi's milk - a savings of $1.50 per gallon - and I
consumed a gallon per week, I could trim about $78 a year from my grocery
bill without doing anything differently except stopping at a second store.
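If you want to check my math, here's the back-of-the-envelope calculation as a quick Python sketch, using the prices from this review and my one-gallon-a-week habit:

```python
# Back-of-the-envelope annual savings from buying milk at Aldi instead of
# the store brand at my local supermarket (prices from this review).
aldi_price = 2.49         # Aldi Friendly Farms 2% gallon
store_brand_price = 3.99  # local supermarket store-brand 2% gallon
gallons_per_week = 1      # my household's consumption

savings_per_gallon = store_brand_price - aldi_price
annual_savings = savings_per_gallon * gallons_per_week * 52

print(f"${savings_per_gallon:.2f} per gallon, ${annual_savings:.2f} per year")
# Output: $1.50 per gallon, $78.00 per year
```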
So What's The Catch With Aldi's Milk?
As soon as I saw the price of Aldi's milk, I felt there had to be some kind of
catch - how else could a company (even a highly efficient German company like
Aldi) sell its milk for so much less than everywhere else? I wondered if this was
some kind of "B" grade milk, the milk that was about to expire, or
milk that had tons of hormones in it. And who exactly was this “Friendly Farms,”
anyway?
Is Aldi Milk “B” Grade Milk Or Something?
I learned that there is such a thing as Grade B milk - its lower grade
reflects the bacteria count allowed in it - but Grade B milk is generally
used in cheese and other manufactured dairy products, not sold for drinking. I
felt confident that there would have been an uproar online if Aldi was
selling Grade B milk in its stores as regular Grade A milk. I
didn’t find any complaints about Aldi milk online, except on one forum where a
lady was convinced that her gallon of Aldi milk held less product than the
competition. I examined my gallon of Aldi milk and it sure looked like it held
the exact same amount of milk as the store brand gallon of milk. The milk jugs
were the same size, but the circles on the sides that expand to prevent the
gallon from exploding under pressure or impact were shaped a little
differently. I decided that the woman’s complaint probably wasn’t valid and
there was no way Aldi was selling Grade B milk under its label. Point to
Aldi.
Is Aldi Milk About To Expire?
My gallon of Aldi milk had an expiration date over a week from the date I
purchased it. I felt that Aldi wouldn't risk the bad press and regulatory
fallout that would come from consistently selling milk closer to its
expiration date than the competition, but at that point I wasn't entirely
convinced there wasn't some scheme running behind the scenes, with Aldi
asking the plant for its oldest milk in exchange for a lower price. No
points given here.
How About the Hormones in Aldi Milk?
The Aldi brand label, Friendly Farms, says that their milk is produced from
cows not treated with rBGH or rBST. Aldi milk also carries the REAL seal,
which certifies that the milk is made in the USA from real cow's milk and
contains no casein, caseinate, vegetable oil, or other substitutes. Not
that I would expect any of these products to be present in Aldi’s milk, but
it’s reassuring to see the seal. The ingredients listing and nutrition facts on
the Aldi milk label are also exactly the same as the ingredients and nutrition
facts on my gallon of store branded milk, which was also a point in Aldi’s
favor.
What Really Put My Mind At Ease…
After tasting the milk and not being able to tell the difference, I knew I
had to dig a little deeper to discover how Aldi could sell its milk so cheaply.
Finally I noticed that the plant codes stamped on both my store-brand milk and
the gallon of Aldi milk had the same format: two digits, a dash, and three digits.
This convinced me that the two brands either came from the same company,
which used the same format across all its plants, or that there was some
standard plant notation that might be documented on the web. After a little
more research, I discovered the mind-blowing web site whereismymilkfrom.com,
which lets users enter the code from their dairy products and uses it to
look up the dairy plant of origin in a public database published periodically
by the government, the Interstate Milk Shippers List. It then returns the
result in a Google Maps pane with icons detailing what products are made at
that plant.
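For the curious, here's a rough sketch in Python of what that kind of lookup boils down to. The parsing follows the two-digits-dash-three-digits format I observed on my jugs; the state table is a tiny made-up sample for illustration, not the actual Interstate Milk Shippers List:

```python
import re

# Hypothetical sketch of the kind of lookup whereismymilkfrom.com performs.
# A plant code stamped on a jug looks like "27-123": the digits before the
# dash identify the state and the digits after identify the plant itself.
# SAMPLE_STATES is an assumed, made-up excerpt, not the real government
# Interstate Milk Shippers List.
SAMPLE_STATES = {"27": "Minnesota", "55": "Wisconsin"}  # assumed values

def parse_plant_code(code: str) -> dict:
    """Split a dairy plant code into its state and plant components."""
    match = re.fullmatch(r"(\d{2})-(\d{3})", code.strip())
    if match is None:
        raise ValueError(f"unrecognized plant code: {code!r}")
    state_digits, plant_digits = match.groups()
    return {
        "state": SAMPLE_STATES.get(state_digits, f"state code {state_digits}"),
        "plant": plant_digits,
    }

print(parse_plant_code("27-123"))
# Output: {'state': 'Minnesota', 'plant': '123'}
```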
Looking up the Aldi milk by its code, I saw it came from a plant operated by
Kemps! The same went for my store-brand milk: it also came from a Kemps
plant, although not the exact same one. This discovery led me to conclude
that much of the milk on store shelves is the same product regardless of the
container it arrives in, and that consumers who choose Aldi milk can save
money without worrying about getting a subpar product. Why would anyone pay
more for milk made at the same plant as the cheap stuff? In my opinion, Aldi
milk is just a good way to cut the grocery budget down to size, and Aldi
wins this one!
Wednesday, September 14, 2011
Why Consumer Reports Sucks
When was the last time you picked up a Consumer Reports magazine? I don't know anyone who still has a subscription to this popular "reviewed by consumers, for consumers" magazine except my local library and maybe a doctor's office or two. Plenty of conspiracy theories claim Consumer Reports isn't the gold standard of unbiased reviews it boldly says it is, citing industry payouts, manufacturer kickbacks, and pharmaceutical company sponsorship. But when I recently glanced through a copy, I was dismayed with the publication for completely different reasons, and I felt compelled to review the reviewers at Consumer Reports. Here's what I noticed.
No Quantitative Data!
I saw plenty of comparisons that caught my eye in the issue I was looking at, but what struck me was that none of them mentioned any numbers. For example, the June 2011 issue has a half-page article about electric razors. Thirteen razors were tested; six were recommended and one was a Best Buy. That's all lovely, except the table compares things like "Noise," "Battery Life" and "Features" without ever explaining exactly how these variables were measured. I'm sure Consumer Reports has its readers' best interests in mind, but to make a truly informed decision about which products are best, wouldn't it be nice to have a few extra details?
For example, how did they test the battery life? Did they just turn each electric razor on and run it down until it quit? Did each tester use their assigned razor for a few minutes each day to shave normally, introducing variables like beard toughness and shaving technique? Or did they simulate actual shaving in a lab setting, cycling each razor on and off for ten minutes at a time over several days to see how well the battery held its charge? Maybe I'm splitting hairs here, but I'd like to see some concrete statistics from Consumer Reports, like "Razor B1 had a tested battery life of three hours in our rundown test, but razor B2 had a tested battery life of 2 hours and 45 minutes, so we gave B1 an Excellent rating and B2 a Very Good rating." That way the average Joe consumer could look at the data and make a better-informed buying decision, especially if there's a significant price difference. I don't know about you, but I can live with a few minutes less shaving time for a few dollars less. Yet Consumer Reports insists on hiding the reality behind its results with those annoying circle icons. Are they trying to appease the manufacturers by never publishing actual performance results that would show just how far ahead of the pack a product is - or how little variation there actually is among products?
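To illustrate the kind of disclosure I'm asking for, here's a hypothetical sketch in Python. The cutoff values are invented purely for illustration - Consumer Reports doesn't publish its actual thresholds - and the razor data is the made-up example from the paragraph above:

```python
# Hypothetical example of publishing raw rundown-test numbers alongside the
# familiar ratings. The cutoff minutes are invented for illustration;
# Consumer Reports does not disclose its real thresholds.
def rating_from_minutes(minutes: float) -> str:
    if minutes >= 180:
        return "Excellent"
    if minutes >= 150:
        return "Very Good"
    if minutes >= 120:
        return "Good"
    if minutes >= 90:
        return "Fair"
    return "Poor"

# Made-up measurements matching the hypothetical razors B1 and B2 above.
rundown_minutes = {"Razor B1": 180, "Razor B2": 165}
for razor, minutes in rundown_minutes.items():
    print(f"{razor}: {minutes} min -> {rating_from_minutes(minutes)}")
# Output:
# Razor B1: 180 min -> Excellent
# Razor B2: 165 min -> Very Good
```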
Another example that concerned me was an in-depth article covering washing machine testing. The lab rated things like "gentleness," "noise" and "vibration," which all seem like good things to consider when deciding which washing machine to purchase. But if you want to know exactly how Consumer Reports tested each machine for gentleness, you'll have to look elsewhere, because there's no explanation in the article. And it's not just the more subjective measurements that are missing - a metric like noise could easily be measured with a decibel meter and real figures printed in the article to show consumers the tested differences. Of course your mileage may vary, since nobody washes a standardized test load at home and noise levels differ depending on what you're washing - but why not do a little more science and a little less touting the ratings as gospel?
Re-Using Data in "Summary" Articles
I'm sure Consumer Reports isn't the only magazine that does this, but I noticed that the June 2011 issue has an in-depth (well, okay, as in-depth as you can get without actually mentioning any data) report on gas grills. They test and rate something like 65 gas grills in a three-page article with their pleasantly mysterious circles. But then the September 2011 issue devotes a whole page to "Great Grills" that shows much of the same information. The top performers from the first grilling test reappear with their results re-printed for readers to digest again, along with strikingly similar headlines: "Think safety" becomes "Give it a safety check," and "Focus on features" turns into "Choose features you'll use." And judging by last year's "Home and Yard Products roundup" edition, readers can plan on seeing the same tests again in the future.
I'm not sure I'd be comfortable paying for a magazine that claims to be the ultimate buying guide for consumers but really just feeds you the same content over and over again, even if it's only 3% recycled material. Just because their reader base is aging does not mean readers are also forgetting what they read three months ago! Maybe it's time to do a little comparison of buying guides and hand out some pretty circles to the ratings magazines and websites themselves. And doesn't it scare you just a tiny bit when a magazine puts a little burst on its next-to-last page saying "Please remember CU in your will"? (It sits at the bottom of a page-long list of names of people involved in the organization. Why not run another review there instead of all those names?)
All Reliability Data Comes From Readers
I'm sure Consumer Reports readers are nice people. But judging from the heavily targeted advertising in the magazine - life insurance, medical supplements, Medicare plans - and the promotions reminding readers that the Consumer Reports website is available 24/7 (just like the rest of the internet!), I get the impression that the readership falls into a specific demographic that could cast a shadow on some of the reliability reports. I'm talking about the mature demographic, age 50 and up. A quick glance at Alexa.com's audience report confirms that most visitors to consumerreports.com are over 50.
The point I'm getting at is that when you ask your average 55-year-old how their dishwasher has been working, for example, to try to determine the most reliable brands, you're going to get an answer skewed by their age group's habits. I can hear them saying things like "Well, we only run it twice a week." So how useful is that data to your average family of four? Is it three loads a week vs. two loads a day? As another example, think about where this generation of readers drives. They're driving to work, to the store, back home, and to church on weekends. They're not driving to work, to school, to soccer practice to pick up Johnny, then to the music clinic to pick up Jenny, then to Walmart, then the grocery store, then home, then the park and back. I'm not saying the data is faulty - I'm sure Consumer Reports would say the real-world mileage of its reliability reports varies immensely - but wouldn't you like to know a few more details?
In keeping with the "just take our word for it" theme of the entire publication, Consumer Reports doesn't bother to tell you how many readers it surveyed to get its data, or what the survey's margin of error is, so who's to say they aren't just calling a few extra readers until they get the results they're looking (or getting paid) for?
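For reference, the margin-of-error figure I'm asking for is simple enough to compute. Here's a short Python sketch of the standard formula for a proportion at 95% confidence - note it assumes a simple random sample, which reader surveys rarely are:

```python
import math

# Margin of error for a proportion at 95% confidence (z = 1.96), assuming
# a simple random sample - an assumption reader surveys rarely satisfy.
def margin_of_error(n: int, p: float = 0.5, z: float = 1.96) -> float:
    return z * math.sqrt(p * (1 - p) / n)

for n in (100, 1_000, 10_000):
    print(f"n = {n:>6}: +/- {margin_of_error(n) * 100:.1f} points")
# Output:
# n =    100: +/- 9.8 points
# n =   1000: +/- 3.1 points
# n =  10000: +/- 1.0 points
```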
The Bottom Line
Maybe I'm just spoiled by this generation's fascination with user reviews and the full range of review websites out there that take great pride in exposing all the gory technical details of the products they work so hard to review accurately and completely. But the classic Excellent, Very Good, Good, Fair and Poor circles of Consumer Reports are starting to feel outdated without any real substance to back them up, and the attitude of "trust us, we're Consumer Reports" doesn't inspire confidence, especially when they're feeding readers the same data multiple times and mining an increasingly narrow reader base for reliability info that may not reflect real-world results. What's your take on Consumer Reports? As always, I'm willing to consider any viewpoint in your comments, provided you can back it up.
Monday, September 12, 2011
SIGG bottles vs. Nalgene Bottles
You've probably heard of both companies in this debate, but I thought an in-depth review of each would be useful. Here's a comparison table of some key points.
| Feature | SIGG Bottle | Nalgene Bottle |
| --- | --- | --- |
| Model | SIGG Heritage Collection Smoked Pearl 1.0L Bottle | Nalgene 32oz Wide Mouth Bottle |
| Suggested Retail | $24.99 | $10.20 |
| Country of Origin | Switzerland | USA |
| Material | Coated aluminum with inert EcoCare lining; plastic/aluminum cap with Santoprene gasket | Copolyester |
| Capacity | Stated: 1 liter (33.8 oz); measured: 33.8 oz | Stated: 32 oz; measured: 40 oz |
| BPA-Free? | Yes | Yes |
| BPA Note | SIGG's website points out: "Substances like BPA are prevalent in the environment and are in a very wide variety of consumer products found in the home, including food and beverage product containers and most plastic products. As a result, it is literally impossible to certify that something is 100% BPA free and to scientifically validate such a guarantee." They seem to be saying that whatever you're getting your liquids out of is potentially made with BPA anyway, so don't sweat it. I think they're probably right. | See left. |
| Grip & Stability | Ribbed design makes it slightly easier to grip and keeps it from rolling on a flat surface, but distorts stickers and decals and makes them challenging to apply. | Smooth surface makes applying stickers and decals easy, but can be slippery when wet. The lid tether does a fair job of keeping the bottle from rolling on a flat surface. |
| Visibility | No gradations. The opaque exterior means nobody can see what you're drinking, including you, which can be good or bad. You can't see dirt inside the bottle, and the only way to tell how much liquid remains is to slosh or heft it - and even then it's hard to judge. | Printed gradations for approximate volume aid mixing beverages and tracking your hydration, at least until they wear off, which does eventually happen. The transparent material lets you tell at a glance how much liquid is left, what it might be, and whether there are contaminants inside. |
| Durability | Dents easily. If dropped on concrete, expect your bottle to have a new geometry, although mine has taken some abuse with only minor denting to show for it. The exterior coating can scratch, chip, and flake off after rough use, exposing the aluminum underneath. A relatively sharp interior corner makes getting a bottle brush inside difficult. | The exterior can be scratched but not dented, although the markings will eventually wear off. Scratches in the plastic can be a magnet for dirt particles, requiring extra scrubbing to remove. |
| Mouth Diameter | Approx. 1" | Approx. 2" |
| Turns to Remove Cap | About 1-3/4 turns | About 3/4 of a turn |
| Mouth Notes | The relatively narrow mouth lets you drink confidently even while in motion without worrying about sloshing and spills, but makes cleaning challenging. | The wider mouth lets you easily maneuver a cleaning brush - and when you do get it inside, you can actually see what you're cleaning. But if you're in motion - say, riding in a car - drinking requires some caution to avoid spilling from the sides of the opening. It's also easy to add whole ice cubes through the larger opening. |
| Cap Notes | The plastic, aluminum, and Santoprene cap has a hexagonal grip surface with an aluminum ring for carabiners that also gives extra leverage if you've screwed it on too tight or the contents are under negative pressure. Between the Santoprene seal and the plastic threads is a channel that makes an excellent place for mold to grow, but it's fairly easy to clean with a brush. | The tethered plastic cap has a ridged grip, but can still be difficult to remove if you've screwed it on too tight or the contents are under negative pressure. The cap is easy to clean, but the collar where the tether attaches to the bottle can be tough to clean. Luckily, the tether can be removed from the bottle with some effort, or cut off completely if desired. |
| Cleaning | Hand wash recommended. | Dishwasher safe (top rack). |
| Taste | SIGG's EcoCare liner is apparently a baked-on, chemically inert material whose primary component is, according to the SIGG website, "co-polyesters" - which sounds similar to what Nalgene claims to use for its entire bottle. So whether you're worried about lingering taste from acidic beverages, or about abrasion inside the Nalgene trapping tiny particles and bringing them along to the next drink, I don't think there's enough of a difference to make the call either way. | See left. |
| Recyclable After Use? | Yes | Yes |
| Freezer-Safe? | No | Maybe. Nalgene says it's safe to -40 degrees F, but anecdotes suggest that if you don't leave the top cracked slightly, the expansion can cause distortion or breakage. |
| Safe for Boiling Liquids? | Not recommended. The aluminum shell will transmit all 212 degrees right to your skin. | Not exactly. Nalgene says it's safe up to exactly 212 degrees F, but I'd hesitate to try. The plastic sidewalls should offer better insulation from the heat than the SIGG's, though. |
| Accessories | An insulated bag and a few sport tops are about all you'll find for this bottle. | A variety of insulated bags, different tops, and devices that narrow the wide mouth to prevent spilling are available. Also compatible with several water filtration and purification systems. |
So what's the verdict? Even though I've been using a SIGG bottle for the last few years, I can see why someone would prefer the Nalgene. Whether you're a mom who needs something easy to clean or an outdoor adventurer who can't risk a dented bottle when it falls out of a pack onto a rock, the Nalgene's durability and dishwasher-safe cleanup are tough to beat. But for the college student who likes to bring an adult beverage along on a late-night study trip without everyone seeing what they've packed, or the working professional who prefers a slightly more subtle style that fits today's office environments, the SIGG could also be an excellent choice. Overall, both bottles transport liquids effectively, but when I can buy two Nalgenes for just over 80% of the cost of one SIGG... the odds tip in Nalgene's favor for me. What do you think? Let me know in the comments.
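If you'd rather compare on capacity than sticker price, here's the quick math as a Python sketch, using the suggested retail prices and measured capacities from the table above:

```python
# Price per ounce of measured capacity, using the retail prices and
# measured volumes from the comparison table above.
bottles = {
    "SIGG Heritage 1.0L": (24.99, 33.8),       # (price $, measured oz)
    "Nalgene 32oz Wide Mouth": (10.20, 40.0),
}
for name, (price, ounces) in bottles.items():
    print(f"{name}: ${price / ounces:.3f} per ounce")
# Output:
# SIGG Heritage 1.0L: $0.739 per ounce
# Nalgene 32oz Wide Mouth: $0.255 per ounce
```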
Here's another shot of the Nalgene's top just for reference. The coarse threads are easy to screw on and off and clean, but the tethered top can get in the way sometimes.
And here's the SIGG's top unscrewed. The yellowish color is the interior coating, and the white ring on the cap is the Santoprene gasket seal. You can see some of the scratches from everyday use on the exterior coating also.