Photoacoustic CO2 Sensor Accuracy (SCD40 & SCD41)

A few weeks ago I was testing a new carbon dioxide monitor, the AIRVALENT CO2 Monitor, and I was interested to find a few ‘quirks’ with the measurements provided by it in comparison to my trusty Aranet4 Home monitors. Being someone who loves learning and experimenting, I decided to have a bit of a deeper look into how the AIRVALENT monitor’s data differed from my other monitors.

If you’ve ever looked to purchase or have purchased a CO2 monitor, you will likely have heard the advice ‘only buy a monitor if it has an NDIR sensor’. NDIR stands for non-dispersive infrared, and these sensors are currently the gold standard for accuracy in consumer-grade monitors. The most popular NDIR sensors come from Sensirion and SenseAir, and you may have even heard of popular models such as the SCD30, S8, or Sunrise.

However, another type of sensor is picking up in popularity: photoacoustic sensors. These sensors are often chosen over NDIR sensors for several reasons, but they also have some downsides. Here is an explanation from my article:

Most traditional NDIR sensors in devices such as the Aranet4 Home and INKBIRD IAM-T1 are transmissive NDIR sensors. These sensors work by firing infrared light across a chamber towards an optical detector. Since carbon dioxide absorbs this light, the detector measures the amount of light energy that reaches the other end of the chamber. As the carbon dioxide concentration increases, the light reaching the detector decreases, and the sensor can then calculate the carbon dioxide concentration.
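The intensity-to-concentration relationship here is the Beer-Lambert law. Here is a minimal sketch of the idea in Python - all constants are made up for illustration and are not real sensor values:

```python
import math

I0 = 1.0      # emitted IR intensity (arbitrary units)
K = 1.2e-5    # effective absorption coefficient per ppm per cm (made up)
L_CM = 3.0    # optical path length between emitter and detector, in cm

def detector_signal(co2_ppm: float) -> float:
    """Beer-Lambert law: intensity reaching the detector decays
    exponentially with concentration and path length."""
    return I0 * math.exp(-K * co2_ppm * L_CM)

def co2_from_signal(intensity: float) -> float:
    """Invert the relationship to recover the CO2 concentration."""
    return -math.log(intensity / I0) / (K * L_CM)

print(co2_from_signal(detector_signal(800.0)))  # -> 800.0
```

The longer the optical path, the larger the signal change per ppm, which is exactly why transmissive sensors can’t shrink below a certain size.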

While this method is accurate, it also means the sensors have to be large, as there is a minimum required distance between the emitter and detector in order to get accurate results. For example, one common NDIR sensor, the SenseAir S8, has the following dimensions: 8.5 x 33.5 x 20 mm. While this might seem relatively small, it means devices housing this sensor have to be able to house the sensor alongside all the other required components. When coupled with a battery, screen, mainboard, and other components, carbon dioxide monitors quickly turn from small products to thick and relatively large devices.

However, there is a way to decrease the size of the sensor – in this case, photoacoustic NDIR. The AIRVALENT CO2 Monitor uses a Sensirion SCD41, one of two models in Sensirion’s current photoacoustic range. Most interestingly, this sensor is tiny compared to typical transmissive NDIR sensors.

As you can see in the image above, the SCD41 (the little silver box with a white cover) is incredibly small! This allows carbon dioxide monitors to be made much smaller than if they used an NDIR sensor.

Here is how photoacoustic sensors work, again from my article:

Photoacoustic NDIR sensors operate using a slightly more complex concept. First, they fire pulses of infrared light into the chamber, which are absorbed by the carbon dioxide molecules. This absorption causes the molecules to vibrate, which leads to a periodic pressure change in the chamber. This pressure change produces a sound, which can be picked up by a microphone inside the chamber. Since the placement of the microphone isn’t as important in a photoacoustic sensor (sound is omnidirectional), and since there isn’t a minimum distance needed for measurements to be taken, these sensors can be much smaller than transmissive NDIR sensors.
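To make the contrast concrete, here is an equally rough toy model of the photoacoustic reading - again with purely illustrative constants:

```python
import math

K = 1.2e-5    # absorption coefficient per ppm per cm (made up)
L_CM = 0.4    # a very short chamber is enough here

def mic_amplitude(co2_ppm: float, pulse_energy: float = 1.0) -> float:
    """The microphone signal is proportional to the light energy absorbed
    per pulse, i.e. to the fraction (1 - exp(-K*c*L)) of each IR pulse."""
    return pulse_energy * (1.0 - math.exp(-K * co2_ppm * L_CM))

# For small K*c*L the response is almost perfectly linear in concentration,
# so even a tiny chamber yields a usable signal:
print(mic_amplitude(1000.0) / mic_amplitude(500.0))  # ~2.0
```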

As you can see, the main reason for choosing a photoacoustic sensor is size - these sensors are tiny! In addition, there are other benefits, such as price (photoacoustic sensors are often cheaper) and power consumption.

However, while I was testing the monitor’s accuracy against some Aranet4 Homes, I had some interesting results that I shared in the article. Having more time on my hands now, I wanted to dig further into the issue, so I made this thread. Firstly, a few notes:

  1. This is far from scientific. I am using my Aranet4 Homes as a baseline, as they use the SenseAir Sunrise, the most accurate sensor I have access to.

  2. This isn’t an issue with the AIRVALENT monitor - it’s common across all photoacoustic sensors I’ve tried.

  3. All monitors were calibrated in precisely the same conditions for every test below.

  4. I am not an expert on this matter, so if you have any thoughts to add, please go ahead and do so. It would be great to hear if others have similar findings!

So, what are the ‘quirks’ I alluded to in the introduction? Well, firstly, I made the graph below, which shows the carbon dioxide concentrations as recorded by two Aranet4 Home monitors and my AIRVALENT monitor over about three days. When I first saw this graph, I wondered why the AIRVALENT with its SCD41 consistently read higher or lower than the Aranet4s. I guessed there might be another variable at play and went on to compare the temperature, relative humidity, and air pressure.

Above: CO2 readings without temperature.

Below: the same data with temperature overlaid.

As you can see, the SCD41 readings seem to be significantly impacted by temperature. The sensor consistently reads lower than the NDIR-equipped Aranet4 Homes at lower temperatures. At higher temperatures (above around 20 degrees Celsius), this swaps, and the SCD41 consistently reads higher - interesting!
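If you’d like to run the same comparison on your own data, the analysis boils down to binning the SCD41-minus-Aranet difference by temperature. A rough sketch - the file and column names are placeholders, since every app exports data differently:

```python
import pandas as pd

# Hypothetical merged export: one row per timestamp with each monitor's
# CO2 reading plus temperature. Adjust names to your actual CSV.
df = pd.read_csv("co2_log.csv", parse_dates=["time"])
df["diff_ppm"] = df["scd41_co2"] - df[["aranet_a_co2", "aranet_b_co2"]].mean(axis=1)

# Mean SCD41-vs-Aranet offset in 2 degree temperature bins
bins = pd.cut(df["temp_c"], range(0, 42, 2))
print(df.groupby(bins, observed=True)["diff_ppm"].agg(["mean", "std", "count"]))
```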

These were room tests, meaning the monitors were all located right next to one another, but the CO2 concentration also changed frequently. To rule out CO2 concentration as a variable, I placed the three monitors in an airtight container and moved the container between cold, moderate, and warm environments. Below are the results.

Unfortunately, I don’t have a lab or anywhere to control the temperatures, so this test was quite rudimentary. For the room-temperature part of the test, I put the container in my room, which was around 20 degrees. I then took the box outside (into the Bangkok sun), which caused the box’s interior to reach almost 40 degrees. Finally, I placed the box in my fridge, which was set to 6 degrees. As you can see, while the readings from the Sunrise sensors in the Aranet4 Homes gradually decreased (the box must not have been fully sealed), the SCD41 monitor jumped around a lot, as expected - presumably because of how temperature impacts photoacoustic sensors.

I did some research, and this appears to be a common issue with photoacoustic sensors, as they suffer from signal loss at higher temperatures. While compensation algorithms can be used (and likely are with the SCD41, since it also houses a temperature sensor), they aren’t fully accurate - at least not if the Sunrise/Aranet4 is to be believed.
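To illustrate the idea of such compensation (this is not Sensirion’s actual algorithm, which is proprietary and runs inside the sensor; the slope here is entirely made up):

```python
REF_TEMP_C = 20.0       # temperature at which the sensor was calibrated
SLOPE_PPM_PER_C = 6.0   # hypothetical correction factor from an empirical fit

def compensate(raw_ppm: float, temp_c: float) -> float:
    """Nudge the raw reading back towards its calibration-temperature value
    using the sensor's own temperature measurement."""
    return raw_ppm - SLOPE_PPM_PER_C * (temp_c - REF_TEMP_C)

print(compensate(2150.0, 38.0))  # at 38 degrees, pull the reading down ~108 ppm
```

Any such linear (or even polynomial) correction is only as good as the fit behind it, which would explain residual errors at temperature extremes.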

These graphs were all included in my full review of the AIRVALENT CO2 Monitor, but I wanted to dive deeper since I now have some more time to look into the matter. As such, I wanted to repeat the temperature tests over longer periods to see if the results were consistent. Therefore, I took the three monitors out again the other day to see how the SCD41 would perform. Again, I recalibrated all monitors under the same conditions and placed them back into the plastic container. Below are the results over 24 hours at room temperature (I’m in Vietnam, so 25-30 degrees Celsius is typical indoors).

As you can see, the SCD41 sensor again records higher than the two Aranet4 Homes. Since I’ve found that 19-22 degrees is roughly where the SCD41 begins to read higher than the Sunrise sensors, I was not surprised to see this result. You will notice that some data is missing; that is an issue with the AIRVALENT monitor/app rather than the SCD41 itself (I’m not sure why it happens). While I was very frustrated to see this data missing, there are still around 18 hours of data, which I believe is enough for a second room-temperature test.

Of course, I also wanted to test the SCD41 at lower temperatures over 24 hours, which meant all three monitors had to go back to the fridge. Here are the results from that test:

As you can see, the temperature in this test stayed between 0 and 4 degrees Celsius (likely due to the fridge’s cooling activating at a certain threshold). Again, the SCD41 performed exactly as expected, reporting lower values than the Aranet4 Homes and their Sunrise sensors at cooler temperatures. In this graph, there is a larger difference between the two Sunrise sensors, but the SCD41 still reports significantly lower than both.

I would love to repeat this test again with the monitors in a hot environment, but unfortunately, I don’t know how I can maintain a high temperature for 12-24 hours. In Thailand, I placed the monitors on the balcony, and they nearly reached 40 degrees. I don’t have a balcony here in Vietnam, so I will need to find another way to maintain such a temperature for a decent length of time. Once I get a chance to perform this test, I expect to see the SCD41 report significantly higher once again!

Learning more about how these sensors perform and how environmental factors and other variables impact them has been fascinating. While this might seem like I’m saying you should avoid any monitor with the SCD41, this isn’t at all the case. Again, here are my own words from the full review:

So, what does this mean? Well, it might not mean much for you if you intend to use the device indoors. The device’s accuracy becomes relatively poor above around 30 degrees Celsius, and these are temperatures you’ll usually only experience outdoors or during summer. Indoors and in controlled environments, you likely won’t notice this issue much as the device is accurate. However, it’s essential to know about this issue so you can compensate for it. Unfortunately, nothing in technology seems purely beneficial, and the SCD41 makes a few compromises to be incredibly small.

In conclusion, the sensor has some fantastic advantages, such as its tiny size and cost. However, you should be aware of the caveats with the SCD41 and other photoacoustic sensors so you know when they might not be entirely accurate. In my own testing, I found the SCD41 could read as much as 150 ppm higher than the Sunrise (at a concentration of only 2000 ppm) at 35-40 degrees Celsius. While this is roughly in line with the SCD41’s stated accuracy (at 2000+ ppm, it is ±(40 ppm + 5.0% of the measured value)), I’ve found that real sensors tend to perform better than their stated accuracy in typical conditions. A great example is the Aranets in these tests, which were extremely consistent throughout.
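For reference, the spec arithmetic at that concentration works out as follows:

```python
# SCD41 datasheet accuracy: +/-(40 ppm + 5% of the measured value)
reading = 2000
scd41_tolerance = 40 + 0.05 * reading
print(scd41_tolerance)  # 140.0 ppm
# A ~150 ppm disagreement sits right at that band - and the Sunrise carries
# its own (smaller) tolerance, so the combined specs comfortably cover it.
```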

Of course, you must also ask yourself if this difference even matters to you. In many of these tests, the difference between the SCD41 and Sunrise sensors was only 50 - 100 ppm. Would that difference actually matter to you in real life, or would you act the same if you saw your monitor reporting 1850 ppm compared to 1950 ppm? At the end of the day, I would take the same actions, trying to improve ventilation or masking up if needed. Even in the worst case, at around 35 - 40 degrees, the difference was only around 150 ppm.

There are a couple more SCD41 devices that I want to test.

Since I’ve already jumped into this rabbit hole, I will likely update this post soon. I would love to get a larger sample of sensors to test, and I am currently trying to find a way to gather data from two more SCD41 devices that I have. I would also like to test the SCD41 against the SenseAir S8, as I have four of these sensors sitting around (as opposed to only two Sunrise sensors). Right now, I feel like a lack of a larger sample size is causing me the most doubt about these results, and I will prioritise gathering more information. In the meantime, if you know anything about this issue, I would love it (and appreciate it!) if you could chime in with your experiences.

I was planning to make a quick attempt at reproducing your findings. I packed my Aranet4 and one of my self-built sensors (which use the Sensirion SCD41) into an airtight lunch box, put it into the fridge, took it back out again, then put it on the heated 3D printer bed.

The results did not come out very meaningful: the time in the fridge and on the heated bed was too short to make a proper distinction between the different states.

Still, there was one important observation. The “airtight” container may be less airtight than thought, since the ambient pressure inside stayed constant until the box was taken off the printer bed, at which point - perhaps due to the temperature drop - the pressure inside dropped drastically.

Since ambient pressure matters for correct CO₂ readings, it may be necessary to find a container that allows ambient pressure exchange while still being as airtight as possible.
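For a feel for the magnitude: to first order these sensors count CO₂ molecules per volume, so the reported ppm scales with pressure. A simple ideal-gas sketch (not a replacement for the sensor’s built-in compensation; the SCD4x also accepts an ambient pressure value so it can do this properly on-chip):

```python
P_CAL_HPA = 1013.25  # pressure at which the sensor was calibrated

def pressure_corrected(raw_ppm: float, pressure_hpa: float) -> float:
    """First-order ideal-gas correction: at lower pressure there are fewer
    CO2 molecules in the chamber, so the raw reading under-reports ppm."""
    return raw_ppm * (P_CAL_HPA / pressure_hpa)

print(pressure_corrected(1000.0, 970.0))  # ~1045 ppm
```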

I will read up on how this can be done, then continue.

Another factor may be the operation mode of the SCD41. It can be configured to consume very little energy, but I found that it then produces significantly more noise and becomes much harder to calibrate reliably.

These findings are very interesting; thank you for sharing them.

I had considered air pressure but couldn’t find much of a relationship between it and CO2 concentration in my first test. However, I didn’t expose the monitors to extremes in those tests.

I currently have them all in the fridge again, and I will check the air pressure this time, too. From what I’ve read, air pressure can make a big difference in the CO2 concentration and measurements of these devices.

I can reproduce part of what you saw. I put the sensors back in the box, put them into the fridge overnight, let them return to ambient temperature, then put them into the oven at 40° (I remembered that it can be set as low as 30°).

After being put into the fridge, the SCD41 readings clearly drop compared to the Aranet4 readings, then return to their pre-fridge values. However, in the oven the opposite effect does not appear, at least not as clearly as in the fridge.

I think there are two explanations for this:

  • Something changed in the SCD41 calibration (which is what I suspect).
  • The temperature-to-error relationship is not linear.

Why I think the calibration changed: I tried to plot a diagram showing the ratio between the Aranet4 and SCD41 readings on the y-axis and temperature on the x-axis …

There is a gap at around 21°C, so I think there must be some error in the data.
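For anyone wanting to reproduce it, the diagram amounts to something like this sketch (file and column names are placeholders):

```python
import pandas as pd
import matplotlib.pyplot as plt

df = pd.read_csv("merged_log.csv")
df["ratio"] = df["aranet_co2"] / df["scd41_co2"]

plt.scatter(df["temp_c"], df["ratio"], s=4)
plt.axhline(1.0, linestyle="--")  # perfect agreement between the sensors
plt.xlabel("Temperature (°C)")
plt.ylabel("Aranet4 / SCD41 CO2 ratio")
plt.show()
```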

I will keep looking at it. This is an interesting topic that I would like to explore further, especially since I am trying to improve the precision of my own sensor.

The drop in readings is quite apparent (albeit minor) at cooler temperatures, but as you said, it’s much less obvious at higher temperatures on your monitor.

I am currently testing the monitors in the fridge again, but at a higher concentration, to see if I can show the difference more obviously. I should have enough data to come back with some more graphs tomorrow.

One thing that worries me is that the device manufacturers (not Sensirion) can change the calibration settings and offsets, so there could also be variations between monitors using the SCD40/SCD41. I wonder if it might be worth getting some SCD41s individually for further testing. It’s something I will look into!

Here are my results from a second fridge experiment at a higher CO2 concentration to attempt to show the differences more clearly. When removed from the fridge, the Aranets and SCD41 devices showed a jump in concentration, but the increase was much more pronounced with the SCD41. Below is the same graph, but with air pressure shown instead.

While air pressure can certainly impact both NDIR and photoacoustic sensors, the change in temperature seems to track more closely with the variation in concentrations, so I’m still leaning towards that as the cause.

Hi Ethan, these are very interesting diagrams. The offset in CO2 readings at warmer temperatures appears very significant. As you wrote in your review, in terms of derived action it may not make a big difference whether you see 3,700 ppm or 3,900 ppm. Still, I think it is important to be aware of this, and especially in situations where the device is calibrated at colder temperatures, it may also make a large difference at lower CO2 levels.

I think the SCD41 operation mode could be relevant as well, and I will try to test and confirm this in the next few days.

Maybe you have seen the SCD4x Low Power Operation Manual at https://sensirion.com/media/documents/077BC86F/62BF01B9/CD_AN_SCD4x_Low_Power_Operation_D1.pdf.

I have been trying some of the operation modes described in that document in the past.


“Low power mode” (30 second interval) provides clean values, which have apparently undergone some kind of noise reduction filtering. “Idle single shot mode” (free choice of measurement interval) has much more noise in its measurements. The SCD4x low power manual explains: “This occurs because higher sampling periods generally decrease the feasibility of filtering or averaging over several data points to alleviate signal noise.”
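For reference, switching between these modes comes down to a single command word on the I2C bus. A sketch using Python and smbus2 - the command codes are taken from the SCD4x datasheet, but please verify them against the current revision before relying on them:

```python
from smbus2 import SMBus

# Command codes from the SCD4x datasheet. The sensor sits at I2C address 0x62.
ADDR = 0x62
START_LOW_POWER_PERIODIC = 0x21AC  # one filtered reading every 30 s
MEASURE_SINGLE_SHOT = 0x219D       # SCD41 only: one on-demand reading (~5 s)

def send_cmd(bus: SMBus, cmd: int) -> None:
    """Send a 16-bit command word to the sensor."""
    bus.write_i2c_block_data(ADDR, cmd >> 8, [cmd & 0xFF])

with SMBus(1) as bus:
    send_cmd(bus, START_LOW_POWER_PERIODIC)  # low noise, fixed 30 s cadence
    # For a battery-friendly custom interval you would instead issue
    # MEASURE_SINGLE_SHOT once per wake-up and accept the extra noise.
```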

The AIRVALENT user manual specifies a battery life of 5 days at a 1 minute measurement interval. If “low power mode” were used in this monitor, it would require a battery capacity of at least ~500 mAh, given the values provided in the low power manual. Especially since you can choose much longer measurement intervals on the AIRVALENT monitor, I think they may be using single shot measurements or have come up with a different protocol altogether.
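Rough arithmetic behind that estimate, using the approximate average supply current the low power documentation gives for low power periodic mode (a figure from memory, worth re-checking):

```python
sensor_ma = 3.2           # approx. average current in low power periodic mode
hours = 5 * 24            # the 5 day battery life quoted by AIRVALENT
print(sensor_ma * hours)  # ~384 mAh for the sensor alone, before the MCU,
                          # display and Bluetooth are even counted
```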

I have two spare SCD41 sensors lying around and will set up tests in different operation modes when time allows.

Thanks for that write-up, and all the other very useful reviews.

I have designed and installed CO2 monitors in a number of classrooms, gyms, a pub, and other places. I have mainly used SCD4x and SCD30 sensors in my designs, but also a number of Aranet4s (GitHub - oseiler2/CO2Monitor).

Firstly, in my view and for my use cases, the two main purposes of a CO2 monitor are:

  • providing some immediate feedback on air quality, to act on, e.g. improve ventilation, leave the room, etc
  • (for fixed monitors) to collect data to assess a setting, identify recurring patterns, or provide targeted feedback/education

From experience, CO2 levels can change over a wide range quite quickly. Extremely high precision is not required (nor achievable with affordable devices), and often what’s relevant is observing a trend of 50-100 ppm or more, as opposed to a +/-50 ppm variation. But of course we need accurate and reliable data within usable tolerances.

Please also note that all CO2 sensors need regular calibration; even the usually quite good and stable Aranets drift over time and aren’t, in themselves, a gold standard to compare other devices against.
Aranet devices as well as the SCD4x can be configured to ‘auto-calibrate’, but this only produces usable results when the instructions are followed carefully; otherwise it can put the sensor in a worse state. So it’s important to check that setting too, besides doing a one-off calibration.
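On a bare SCD4x, checking or changing the automatic self-calibration (ASC) setting is a single I2C command. A sketch with smbus2 - the command code and CRC scheme are from the Sensirion datasheet, but verify them before use:

```python
from smbus2 import SMBus

ADDR = 0x62               # SCD4x I2C address
SET_ASC_ENABLED = 0x2416  # set_automatic_self_calibration_enabled

def crc8(data: list[int]) -> int:
    """Sensirion's CRC-8 (polynomial 0x31, init 0xFF) over a 2-byte word."""
    crc = 0xFF
    for byte in data:
        crc ^= byte
        for _ in range(8):
            crc = ((crc << 1) ^ 0x31) & 0xFF if crc & 0x80 else (crc << 1) & 0xFF
    return crc

def set_asc(bus: SMBus, enabled: bool) -> None:
    word = [0x00, 0x01 if enabled else 0x00]
    frame = [SET_ASC_ENABLED & 0xFF] + word + [crc8(word)]
    bus.write_i2c_block_data(ADDR, SET_ASC_ENABLED >> 8, frame)

with SMBus(1) as bus:
    set_asc(bus, False)  # disable ASC if the sensor never sees fresh air
```

ASC assumes the sensor regularly sees fresh-air levels (~400 ppm), e.g. weekly; if it never does, it will slowly drag the baseline off, which is why it should be disabled in permanently occupied spaces.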

With that out of the way, a few observations:

Please have a look at the Design-in Guide SCD4x CO2 Sensor, since it describes some design considerations that are relevant for the correct functioning of the sensor within the specified tolerances.

Sensirion’s miniaturized CO2 sensor combines smallest package with highest performance. In order to take full advantage of the SCD4x performance and the integrated features a number of housing and PCB design rules need to be considered. This guide describes an easy-to-implement and affordable design-in of the sensor. Please note that unbeneficial housing and/or PCB designs may cause significant CO2 and temperature deviations, increased noise levels as well as highly increased response times.

When I look at the photo of the AIRVALENT internals, I’d argue that the sensor is not decoupled from internal heat sources (though the CPU might not get very hot, since it runs on batteries), and neither is there good airflow around the sensor (the guide asks for good coupling to ambient air, but without turbulence), since all the openings seem to be at the bottom of the case. Vibration can be a problem too (as with probably most portable devices).

As pointed out by Hannes (and documented in the specification), running the sensor in single shot mode requires discarding the first measurement and doing another sample, and the device might not do that. Without access to a device that’s hard to check (same holds for supply voltage stability).

When I look at the charts you posted, I see a lot of noise (and that includes the Aranet data), which is not something I’ve ever seen on any of my monitors (which run in either continuous or low power operation, not single shot). To me that suggests that something is quite wrong with the data. See a day from one of mine (an SCD40) in comparison:

Something else to keep in mind is that a lot of electronic components get cloned and put on the market (sometimes hard to identify). I’m not saying that it is an issue in this case, but I have seen cheap SCD4x sensors offered that I would not use or trust. You could read the serial number from the sensor and get in touch with Sensirion to see if there’s a way to check it’s genuine or not.

Overall I am getting the impression that the AIRVALENT might not be following the SCD4x design guidelines correctly and that that’s contributing to the noise in the data you are observing.
I’ve run SCD40 sensors next to Aranet monitors over extended periods and compared the data, and have not observed the behaviour you’re describing here. But it’s probably also worth pointing out that I believe there is limited value in putting the devices through scenarios outside their specification, and I’d carefully check the fridge/heating results against the specified operating conditions.

Hope that helps.


Forgot to mention that rapid changes in ambient pressure impact the measurements, and that might well be the driver behind the deviation you see from your fridge experiment. In addition, the relative humidity inside the fridge will be high, if not above the specified 95% maximum, and the fridge is also likely a very noisy environment when it comes to vibrations.


Thanks a lot for sharing your many insights! There’s a lot to consider here, and I appreciate you taking the time to share more information on the matter.

I’m most interested in seeing your results with the sensors themselves, as they definitely indicate something is going on with the AIRVALENT monitor (perhaps due to internal heat sources, as you mentioned), as opposed to the sensor itself. I will send this thread to the AIRVALENT team so they can also look into the issue and perhaps provide some answers (or even data from more monitors, as my sample size is only one AIRVALENT monitor). This would also explain why @hannes only partially sees the same trends I do with the SCD41.

It would be great to have a chamber so humidity, pressure, vibrations, and temperature could be controlled, but it’s very difficult to manage all of these variables to get anything near conclusive results (and more samples would be needed for that, too). Hence, I appreciate you sharing your results after a lot more experience with the sensors!

If you come across any further findings, please feel free to share them. This has been very informative!

Hi Ethan, Hi oseiler. One more post from me on that topic.

I used the single Aranet that I have and two of my own sensors, which have SCD41s running in idle single shot measurement mode, measuring once per minute. In that mode the SCD41s produce significantly more noise than in low power periodic mode, but also use considerably less power.

Rather than a hard container, I used two zip bags in an attempt to isolate the sensors. The idea was that the zip bags would reduce the pressure variations caused by heating and cooling. Oven first at 40°, then fridge at 7°, normal temperature last. The bags were less airtight than expected, even with extra clear tape: CO2 values first went up (probably a higher concentration in the outer bag), then continuously went down.

My Aranet also produces noisy data, and it appears that there is more noise at higher temperatures.

Aranet only:

For the further diagrams, I added filtering to the data for better readability.
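A simple way to do this kind of filtering in pandas is a rolling median (to remove single-sample spikes) followed by a rolling mean (to smooth what remains); window sizes are a matter of taste for 1 minute data, and the file/column names below are placeholders:

```python
import pandas as pd

s = pd.read_csv("scd41_log.csv", index_col="time", parse_dates=True)["co2"]
filtered = s.rolling("5min").median().rolling("15min").mean()
```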

Aranet and the first SCD41 device:

Aranet and the second SCD41 device:

What I see: the first device tends to read higher at lower temperatures, while the second device tends to read lower at lower temperatures.

My conclusion:

  • there may not be a general temperature/accuracy issue at all
  • the degree of accuracy I can achieve in this experiment may simply not be good enough to give reliable results
  • affordable sensors like the ones we are using have remarkably good performance given the price, but still come with some tolerance

and on a more general level:

  • any CO2 device that people pay attention to is better than none
  • the precision provided by available sensors is likely more than sufficient to make appropriate decisions in most situations

This has been very interesting. Thank you Ethan for providing this platform.


Thanks for the follow-up! It’s interesting how all devices seem to display higher noise at higher temperatures.

After seeing @oseiler’s and your own results, I agree with your conclusions. I think there is no issue with the SCD41 itself, but perhaps just with the AIRVALENT monitor. With that said, I’ve only tested one monitor, so there isn’t enough information to draw firm conclusions. Even if the SCD41 is less accurate than standard NDIR sensors, it has the benefit of being incredibly small and is still an excellent sensor.

On a more general level, I also agree with your conclusions. Any sensor (as long as it’s reasonably accurate) is better than none - as long as it’s used to inform actions.

Thank you both so much for helping me investigate this issue in more detail. It’s been a very interesting conversation, and I’ve learned a lot. I’m glad to be able to have such discussions somewhere - I just need to find a way to encourage more discussion and interest, which is proving difficult.