Abstract
The word “information” carries a heavy burden of theory and a wide range of associated facts. It invokes related notions of messages, communication, computation, complexity, knowledge, semantics (significant meanings), intelligence, memory, and language, among many others. A tour through its domain requires many major stops, often taking the form of a question. They include the following: What are the chief kinds of information? Does nature necessarily consist of matter, energy, and information? Is Shannon (selective) information theory the best primitive (syntactic) form? Is the universe a computer, and if so, analog, digital, or quantum? Are information (message, communication) entropy and thermodynamic entropy closely related concepts? How do our perceptions relate to information? How do we reconcile the apparent causal decoupling between brains and minds? Are some mental processes in principle not computationally simulable by a Turing machine? This article addresses each question in turn and offers some suggestions toward their resolution. It provides a basis for evaluating, from first principles, conceptions of information in discussions of intelligence.