There is no consensus on the definition of “consciousness”. The situation with “information” is not as bad, but it remains a vague concept despite great advances in its quantification by scientists and engineers. Philosophers are getting into this game as well, and among physicists there is growing awareness of information-theoretic approaches to physics. We were all pleasantly surprised when Edward Witten wrote a paper titled “A Mini-Introduction to Information Theory” in 2018.
The first places to look for various definitions of information are Wikipedia and the Stanford Encyclopedia of Philosophy (SEP).
“Any natural process that is not completely random, and any observable pattern in any medium can be said to convey some amount of information. Whereas digital signals and other data use discrete signs to convey information, other phenomena and artifacts such as analog signals, poems, pictures, music or other sounds, and currents convey information in a more continuous form. Information is not knowledge itself, but the meaning that may be derived from a representation through interpretation.” – Wikipedia
“The derivation of information from a signal or message may be thought of as the resolution of ambiguity or uncertainty that arises during the interpretation of patterns within the signal or message.” – Wikipedia
“Information can be encoded into various forms for transmission and interpretation (for example, information may be encoded into a sequence of signs, or transmitted via a signal). It can also be encrypted for safe storage and communication.” – Wikipedia
“The term “information” in colloquial speech is currently predominantly used as an abstract mass-noun used to denote any amount of data, code or text that is stored, sent, received or manipulated in any medium.” – SEP
“The detailed history of both the term “information” and the various concepts that come with it is complex and for the larger part still has to be written.” – SEP
“…Central is the concept of additivity: the combination of two independent datasets with the same amount of information contains twice as much information as the separate individual datasets.” – SEP
“The amount of information we get grows linearly with the amount by which it reduces our uncertainty until the moment that we have received all possible information and the amount of uncertainty is zero. The relation between uncertainty and information was probably first formulated by the empiricists (Locke 1689; Hume 1748). Hume explicitly observes that a choice from a larger selection of possibilities gives more information. This observation reached its canonical mathematical formulation in the function proposed by Hartley (1928) that defines the amount of information we get when we select an element from a finite set. The only mathematical function that unifies these two intuitions about extensiveness and probability is the one that defines the information in terms of the negative log of the probability (Shannon 1948; Shannon & Weaver 1949, Rényi 1961).” – SEP
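The two intuitions in the SEP passage above, Hartley’s measure for a choice from a finite set and Shannon’s negative log of the probability, can be sketched in a few lines of Python. This is only an illustration; the function names are mine, and the uniform-choice example (a card drawn from a 52-card deck, an independent coin flip) is hypothetical.

```python
import math

def hartley_information(n: int) -> float:
    """Hartley (1928): information gained by selecting one element
    from a finite set of n equally likely elements, in bits."""
    return math.log2(n)

def self_information(p: float) -> float:
    """Shannon (1948): information content of an event with
    probability p, defined as the negative log of the probability."""
    return -math.log2(p)

# For a uniform choice the two measures agree:
# drawing one card from a 52-card deck.
print(hartley_information(52))    # bits needed to single out one card
print(self_information(1 / 52))   # same value, via -log2(p)

# Additivity: two independent choices (a card draw and a coin flip)
# carry the sum of their individual information contents, because
# the probabilities multiply and the logs therefore add.
combined = self_information((1 / 52) * (1 / 2))
print(math.isclose(combined,
                   self_information(1 / 52) + self_information(1 / 2)))
```

The negative log is the only functional form that makes independent events additive in this way, which is why it is singled out in the quotation.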