Conditional entropy is the amount of uncertainty that remains about a random variable once some other knowledge or information has already been obtained. Several synonyms exist for the term, including conditional information entropy, conditional uncertainty, and conditional information; these are used interchangeably and all describe the same concept: the uncertainty of a random variable given some additional knowledge. Related but distinct quantities include conditional probability, the probability of an event given that another event has occurred, and joint entropy, which measures the combined uncertainty of two or more random variables taken together. Together, these terms provide a vocabulary for describing the uncertainty in a system or set of data points.
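To make the definition concrete, the sketch below is a minimal illustration (the joint distribution values are chosen arbitrarily for the example) of computing the conditional entropy H(Y|X) for two binary random variables via the chain rule H(Y|X) = H(X,Y) - H(X).

```python
import numpy as np

# Hypothetical joint distribution p(x, y) over two binary variables,
# given as a 2x2 table whose entries sum to 1 (values chosen for illustration).
p_xy = np.array([[0.30, 0.20],
                 [0.10, 0.40]])

def entropy(p):
    """Shannon entropy in bits, ignoring zero-probability entries."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Marginal distribution of X (sum over the rows of the joint table).
p_x = p_xy.sum(axis=1)

# Conditional entropy via the chain rule: H(Y|X) = H(X, Y) - H(X).
h_joint = entropy(p_xy.flatten())
h_x = entropy(p_x)
h_y_given_x = h_joint - h_x

print(f"H(X,Y) = {h_joint:.3f} bits")
print(f"H(X)   = {h_x:.3f} bits")
print(f"H(Y|X) = {h_y_given_x:.3f} bits")
```

With the example table above, H(X,Y) is about 1.846 bits and H(X) is exactly 1 bit, so roughly 0.846 bits of uncertainty about Y remain after X is known.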