This is particularly difficult when verbal communication is impaired or impossible (due to impaired consciousness, ventilatory support, or disease-related language disorders). The aim of this thesis is to design robust automatic facial expression analysis methods, based on transfer learning and multi-task learning, to identify, characterize, and monitor respiratory discomfort without requiring direct human interaction. In the long term, this work opens prospects for innovative applications such as the automatic monitoring of intubated patients or the design of intelligent ventilators that adapt to the patient's perceived discomfort.
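To make the intended approach concrete, the sketch below shows one way a transfer learning and multi-task learning setup for facial analysis could be structured, assuming a PyTorch implementation with an ImageNet-pretrained ResNet-18 backbone and two hypothetical task heads (categorical facial expression recognition and a continuous respiratory-discomfort score). The architecture, class name, and task definitions are illustrative assumptions, not the thesis's actual design.

```python
import torch
import torch.nn as nn
from torchvision import models


class MultiTaskFacialNet(nn.Module):
    """Hypothetical multi-task model on a pretrained backbone (illustrative only)."""

    def __init__(self, num_expressions: int = 7):
        super().__init__()
        # Transfer learning: reuse ImageNet-pretrained convolutional features.
        backbone = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
        self.features = nn.Sequential(*list(backbone.children())[:-1])
        feat_dim = backbone.fc.in_features

        # Task 1 (assumed): categorical facial expression recognition.
        self.expression_head = nn.Linear(feat_dim, num_expressions)
        # Task 2 (assumed): continuous respiratory-discomfort score.
        self.discomfort_head = nn.Linear(feat_dim, 1)

    def forward(self, x: torch.Tensor):
        z = self.features(x).flatten(1)        # shared representation
        return self.expression_head(z), self.discomfort_head(z)


if __name__ == "__main__":
    model = MultiTaskFacialNet()
    frames = torch.randn(4, 3, 224, 224)       # batch of face crops
    expr_logits, discomfort = model(frames)
    print(expr_logits.shape, discomfort.shape)  # torch.Size([4, 7]) torch.Size([4, 1])
```

Sharing the backbone across both tasks is what lets annotations from generic expression datasets transfer to the clinical discomfort task, which is the usual motivation for combining transfer learning with multi-task learning.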