Tech companies love to talk about how automation, the internet of things, and the connected home are going to make the machines we use every day more convenient. But does it work? Anyone who’s used Apple’s Siri or a “talk to text” feature knows that the promises of technology sometimes fall short.
I tested the promise of automation in a Xiaomi air purifier, the Mi2, to be precise.
Here’s the air quality readout coming from a small particle counter embedded in the side of the machine.
Here’s what it promises to do—detect how bad air is in your home, turn the purifier on when air is bad and turn it off when air is good. If it can do that, it means we can breathe clean air without the fan on high all the time. That’s awesome because it means less noise and less wasted electricity.
We previously tested how good the Mi2 purifier was at cleaning the air, and found shocking results: it left air unsafe 86% of the time. This time around, we wanted to do a more methodical test of the particle counters inside the Mi purifiers.
I tested the Mi1, Mi2, and the more expensive Mi2 Pro version against three particle counters. None of the Mis was brand new, but the Mi2 was relatively new, having been used for only a few weeks of testing.
I set up the two Laser Eggs and the Air Visual Node on a chair next to the built-in particle counter on the Mi2.
I burned a cigarette in a closed 12m² room and then turned on the purifier on high until the air got clean again (about 30 minutes from start to finish). That way we can test for accuracy from clean levels to truly toxic levels. I set my phone to take pictures of all the readings every 30 seconds.
How Accurate SHOULD It Be?
But wait, before I get to the results, I want to set expectations. I don’t expect the Xiaomi particle counter to be really accurate. It’s a cheap particle counter inside a machine that costs less than some of the particle counters I’m about to compare it to. We need to have realistic expectations.
So what are realistic expectations? I think a reasonable expectation is that it works well enough to do what it’s designed to do—run the auto mode.
Even against modest expectations, the Xiaomi was off by a lot. When the air was bad, the Xiaomi was off by an astounding 218 micrograms.
To give a sense of how large that discrepancy is, the WHO 24-hour limit is 25 micrograms. The Xiaomi’s error alone was over 8 times the WHO limit.
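The arithmetic behind that comparison is simple enough to spell out, using the two numbers above:

```python
# How many WHO daily limits does the Xiaomi's error alone amount to?
WHO_24H_LIMIT = 25   # µg/m³, WHO 24-hour PM2.5 guideline
XIAOMI_ERROR = 218   # µg/m³, average error at high pollution in our test

multiple = XIAOMI_ERROR / WHO_24H_LIMIT
print(f"Error = {multiple:.1f}x the WHO 24-hour limit")  # Error = 8.7x the WHO 24-hour limit
```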
Here’s what that looked like live.
The Xiaomi seemed like it basically stopped counting past 50 micrograms. At that rate, the Xiaomi was saying the air inside was at the orange AQI level (“unhealthy for sensitive groups”) when it was really in the purple (“very unhealthy”) range.
OK, so the Xiaomi undercounts—severely at times. It turns out that’s not the only problem. If we zoom into the low range, the Xiaomi was overcounting there too.
A 9-microgram discrepancy might not sound like a big deal, but the Xiaomi was overestimating the real number by a factor of 10.
The Xiaomi 1 Is Inaccurate Too
Maybe the Mi2 I got was just broken. Who knows? Maybe the shipping guy dropped the machine on the way to my home and damaged the particle counter.
To test that possibility, I tested an older Mi1 against the Dylos Pro (which also scored well against the official PM2.5 numbers). The results showed the same pattern as the Mi2.
I also tested the Mi2 Pro, and it showed the same pattern. Thus, this seems to be a consistent problem with Xiaomi purifiers.
How Do We Know Those Other Numbers Are Correct?
Hang on, aren’t we assuming that the Laser Egg and the Node have the right numbers? How do we know those are the right ones, and the Xiaomi is the wrong one?
Smart Air ran comparison tests of the Node and Laser Egg with official PM2.5 numbers for six days. The Node and the Egg correlated with the official PM2.5 at a very respectable r = .98, with an average error of 4.8 micrograms for the Node and 6.5 micrograms for the Egg. That makes me confident their numbers are a good approximation of the true concentration.
This Could Explain Why the Xiaomi Left Air at Dangerous Levels in Separate Tests
Smart Air tested the Xiaomi Mi2 air purifier in a real Beijing apartment for 12 nights, and the results shocked me. I honestly thought it’d do a fine job. After all, purifiers are just fans and filters. But the Xiaomi left the air at unhealthy levels for 86% of the time.
The fact that the Xiaomi so severely underestimates pollution levels could explain why it so often leaves the air at those unsafe levels. I found similarly atrocious results when I tested the Philips auto mode, which convinces me that the technology behind air purifier auto modes just isn’t good enough yet. I would not use an auto mode in my home.
Why This Problem Is More Than Just an Accuracy Problem
The Mi2 is a fine purifier when it’s on high. Our open-source tests show that it does a great job on high (check out the first three hours in the test graph above). But the problem is the Mi2 forces users to use auto mode. No matter what you do to the machine, it will switch to auto mode after three hours. Sounds weird, right? We asked customer service three times just to be sure.
That means unless you wake up every three hours during the night and switch the machine back onto high, you have to use auto mode and the particle counter that controls it. I hope Xiaomi fixes this simple design flaw, but until they do, I would not use a Xiaomi in my home.
Read on for extra data and methods. I also test the possibility that the particle counter is inaccurate because it sits inside the machine and is therefore sampling air that differs from the air outside the machine.
Extra Data and Methods
Mi1 Test Method
I tested the Mi2 in the Smart Air office and the Mi1 at my home, so the room and methods were slightly different. In the office, I burned a cigarette to make the particle counts go up. At home, I don’t have any cigarettes, so I burned a piece of paper.
The office was 12m². My room at home was larger, probably closer to 15m².
Is the Xiaomi inaccurate because the particle counter is inside the machine?
I wondered if the particle counter is inaccurate because it’s on the inside of the machine and therefore not getting a good sample of air. One way to test this is to take the particle counter out of the machine, which isn’t very hard. Even when I did that, the numbers still consistently undercounted when pollution was high and overcounted when pollution was low. Thus, I don’t think the problem is the placement of the particle counter.
What are the Xiaomi numbers exactly?
One frustrating part of the Xiaomi is that it doesn’t label the air quality numbers. Are they micrograms, China AQI, US AQI, or something else? I can’t understand why they wouldn’t label the numbers.
This isn’t just a nerd concern. It could really affect the results because the relationship between micrograms and AQI isn’t linear.
If you dig around deeply enough, Xiaomi does say that the numbers are micrograms. Thus, I compare micrograms to micrograms in the analysis.
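To see why the units matter, here’s a sketch of the US EPA’s conversion from PM2.5 micrograms to US AQI (using the pre-2024 breakpoints). It’s piecewise linear, not proportional: 50 AQI corresponds to 12 µg/m³, but 200 AQI corresponds to about 150 µg/m³.

```python
# US EPA PM2.5 -> US AQI conversion (pre-2024 breakpoints).
# The mapping is piecewise linear, so comparing raw micrograms
# against AQI values directly would skew the results.
BREAKPOINTS = [
    # (C_low, C_high, I_low, I_high)
    (0.0,    12.0,    0,   50),
    (12.1,   35.4,   51,  100),
    (35.5,   55.4,  101,  150),
    (55.5,  150.4,  151,  200),
    (150.5, 250.4,  201,  300),
    (250.5, 350.4,  301,  400),
    (350.5, 500.4,  401,  500),
]

def pm25_to_aqi(conc):
    c = round(conc, 1)
    for c_lo, c_hi, i_lo, i_hi in BREAKPOINTS:
        if c_lo <= c <= c_hi:
            return round((i_hi - i_lo) / (c_hi - c_lo) * (c - c_lo) + i_lo)
    raise ValueError("concentration out of range")

print(pm25_to_aqi(12.0))   # 50  (top of the "good" range)
print(pm25_to_aqi(150.4))  # 200 (top of the "unhealthy" range)
```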
Can the Xiaomi get below 10 micrograms?
The lowest number the Xiaomi registered was 9 micrograms, while the Node was registering 0.2 micrograms and the Eggs 1 microgram. That made me wonder, is it even possible for the Xiaomi to display numbers below 9? Is it programmed not to go below that number?
To get to the bottom of it, I turned on the DIY 1.1 and pointed it directly at the Xiaomi particle counter. When I do the same test with the Dylos particle counter, the numbers go down to zero. But with the Xiaomi, the numbers stayed around 10 micrograms. Therefore, I think the Xiaomi is either registering phantom particles or is programmed not to go below 9.
I’m making the original data available as an Excel file download here.
Why Were the Node Numbers Low?
In the main graph in the article, you’ll see that the Node numbers were lower than both of the Laser Eggs. The Node I used in the test was about a year old (although the Laser Eggs weren’t new either). One problem with older particle counters is that dust accumulates inside the machine and restricts the airflow. The guy behind AQIcn.org tested an old Dylos and found it was undercounting when concentrations were bad. Then he cleaned out the dust inside with compressed air and found it registered higher numbers again.
I suspect the same thing was happening with the Node in our tests. That can be a particular problem when the particle counter is subjected to really high levels of particulate, like in our cigarette tests.