The final chapter picks up where the previous one left off: “philosophy with a deadline” (281). In this chapter, Tegmark investigates the nature of consciousness, one of the book’s central themes. For Tegmark, it matters enormously whether we can determine that AIs are conscious, and how we could do so. If humans were ever replaced by AIs, it would be a tragedy if these AIs were merely zombies, i.e., utterly unconscious.
Tegmark gives a broad definition of consciousness as “subjective experience” (283). Basically, consciousness is what it is like to be a thing, and it is composed of qualia, or instances of experience. The “problems” with understanding consciousness are layered. Figure 8.1 articulates a pyramidal structure for these problems, ranging from the scientifically tractable problems of brain processing to the philosophical problem of why anything is conscious at all. The latter has often been dubbed the “hard problem of consciousness.” Tegmark relies on the work of Australian philosopher David Chalmers when discussing these issues. He contrasts his own view, “the physics perspective,” with substance dualism, the view that mind and body are utterly dissimilar substances (286). The “physics perspective,” often known as physicalism, assumes that consciousness is connected to particular physical arrangements.