UK Businesses Roll Out Biosurveillance That Will Deny You Access to Work if You Show a Temperature


The MHRA has warned against their use because of inaccuracy, but businesses are going ahead with the programme regardless.


The coronavirus pandemic has been a gateway to even more unjustified surveillance of our movements and our health. Schools, workplaces and hospitality businesses have been investing in thermal screening technology to prove that they’re ‘safe’ to open.


But infrared scanners, unlike thermometers, do not accurately measure core body temperature, and the MHRA has warned against their use. There is no evidence that processing this sensitive health data is making anyone safer, says Big Brother Watch.


Thermal surveillance represents yet another form of inaccurate, intrusive and unnecessary monitoring. Worse, these systems are often combined with other forms of surveillance, such as behavioural analysis and facial recognition, posing a real risk to rights. There is a range of reasons a thermal scanner might detect a high temperature, and most devices aren’t precise enough to detect the subtle shifts in temperature that would indicate a fever.


Weather, certain medications, alcohol consumption, pregnancy, menstruation, high blood pressure and other conditions can all impact a person’s temperature reading. Many people will have to choose between disclosing sensitive personal information or being barred from venues. No one should be denied access to their workplace, school or travel due to unreliable technology.


Thermal screening has serious accuracy problems. Claims that thermal cameras can detect fever are misleading – a camera cannot diagnose coronavirus, and suggesting that these devices can keep students, workers and travellers safe only promotes a false sense of security.


One UK study found that temperature screening was just 0.78% effective at detecting coronavirus, and the medical regulator has made it clear: thermal screening doesn’t work and shouldn’t be used.



