Here are 100 books that Programming the Universe fans have personally recommended if you like the book.
Shepherd is a community of 12,000+ authors and super readers sharing their favorite books with the world.
I have taught undergraduate and PhD students physics and biophysics for 36 years, and I never get tired of it. I always look for hot new topics and everyday things that we all see but rarely notice as interesting. I also look for “how could anything like that possibly happen at all?”-type questions and the eureka moment when some idea from physics or math pries off the lid, making a seemingly insoluble problem easy. Finally, I look for the skills and frameworks that will open the most doors to students in their future work.
Without a single formula, Feynman takes you to the heart of quantum theory. The foundations of everything you thought you knew about light get ripped out and replaced by new foundations that cover every success of the 19th-century theory yet offer new vistas. I will probably read this tiny book every year for the rest of my life; each time I get new insights into physics (and the presentation of physics).
After reading it, you will understand the stationary-phase principle better than most physics PhD students do, again without a single formula.
Celebrated for his brilliantly quirky insights into the physical world, Nobel laureate Richard Feynman also possessed an extraordinary talent for explaining difficult concepts to the general public. Here Feynman provides a classic and definitive introduction to QED (namely, quantum electrodynamics), that part of quantum field theory describing the interactions of light with charged particles. Using everyday language, spatial concepts, visualizations, and his renowned "Feynman diagrams" instead of advanced mathematics, Feynman clearly and humorously communicates both the substance and spirit of QED to the layperson. A. Zee's introduction places Feynman's book and his seminal contribution to QED in historical context and…
I am a professor of physics, passionate about researching physics and inspiring non-scientists to enjoy learning about physics. My research addresses how to use quantum physics to accelerate the development of quantum information science including quantum computing, quantum communications, and quantum measurement. My current projects are in developing quantum satellite communications, increasing the precision of telescopes, and constructing a quantum version of the Internet—the Quantum Internet. These topics revolve around quantum optics—the study of how light interacts with matter. I originated the idea of a National Quantum Initiative and lobbied the U.S. Congress to pass it into law, resulting in large investments in the new, exciting field of quantum technology.
This masterful book goes one step further and presents a game-based analogy that goes a long way toward explaining how a quantum computer actually works. Working through the book, one gains an understanding of how qubits can be quantum entangled and how entanglement leads to computing tasks that could not be performed on an ordinary computer. Deceptively simple in appearance, the method leads you deep into the inner workings of quantum logic operations without realizing you are digesting some pretty advanced concepts. The author knows of what he writes, as his theoretical discoveries led to one of the world’s most ambitious quantum computing efforts.
COMPUTING. ENTANGLEMENT. REALITY. Books containing these three words are typically fluff or incomprehensible; this one is not. "Q is for Quantum" teaches a theory at the forefront of modern physics to an audience presumed to already know only basic arithmetic. Topics covered range from the practical (new technologies we can expect soon) to the foundational (old ideas that attempt to make sense of the theory). The theory is built up precisely and quantitatively. Deceptively vague jargon and analogies are avoided, and mysterious features of the theory are made explicit and not skirted. The tenacious reader will emerge with a better…
Since my first college course in quantum physics, I have been fascinated with this enigmatic, infinitely interesting theory. It's our most fundamental description of the universe, it's been found to be unerringly accurate, yet it's quite subtle to interpret. Even more intriguingly, "nobody really understands quantum physics" (as Richard Feynman put it). For example, the theory's central concept, the wave function, is interpreted radically differently by different physicists. I have always yearned to grasp, at least to my own satisfaction, a comprehensive understanding of this theory. Since retirement 23 years ago, I have pursued this passion nearly full-time and found some answers, leading to several technical papers and a popular book.
Baggott's book is a rich, readable account of quantum physics as viewed at 40 key "moments" in its history. These moments range from the trouble with classical physics in 1900, leading to the notion of discrete "quanta" of energy, to the hunt for the Higgs particle at the CERN accelerator laboratory. Other moments include the invention of Schrödinger's equation, the Uncertainty Principle, and the Standard Model of particle physics. The author is an experienced science writer and former academic scientist.
The twentieth century was defined by physics. From the minds of the world's leading physicists there flowed a river of ideas that would transport mankind to the pinnacle of wonderment and to the very depths of human despair. This was a century that began with the certainties of absolute knowledge and ended with the knowledge of absolute uncertainty. It was a century in which physicists developed weapons with the capacity to destroy our reality, whilst at the same time denying us the possibility that we can ever properly comprehend it.
Almost everything we think we know about the nature of…
I am a professor of physics, passionate about researching physics and inspiring non-scientists to enjoy learning about physics. My research addresses how to use quantum physics to accelerate the development of quantum information science including quantum computing, quantum communications, and quantum measurement. My current projects are in developing quantum satellite communications, increasing the precision of telescopes, and constructing a quantum version of the Internet—the Quantum Internet. These topics revolve around quantum optics—the study of how light interacts with matter. I originated the idea of a National Quantum Initiative and lobbied the U.S. Congress to pass it into law, resulting in large investments in the new, exciting field of quantum technology.
The subtitle of this book is A Serious Comic on Entanglement. Normally I am not fond of comic-style presentations of physics (although I do love comics, as my Conan the Barbarian collection can attest). But I am happy to make an exception for this excellent book, written by a daughter-father team, the father being one of the leading philosophers of physics and the daughter being an artist and web designer. All the deep physics is there, presented in a fun, reader-friendly style. The acknowledgments section credits six ‘reviewers,’ ages 12 to 15, for reviewing and helping edit the book – now that’s inter-generational!
An eccentric comic about the central mystery of quantum mechanics
Totally Random is a comic for the serious reader who wants to really understand the central mystery of quantum mechanics – entanglement: what it is, what it means, and what you can do with it.
Measure two entangled particles separately, and the outcomes are totally random. But compare the outcomes, and the particles seem as if they are instantaneously influencing each other at a distance – even if they are light-years apart. This, in a nutshell, is entanglement, and if it seems weird, then this book is for you. Totally Random is a graphic…
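For readers who like to tinker, here is a minimal sketch (my own illustration, not material from the book) of the "random individually, correlated jointly" behaviour described above, for the simplest case where both particles are measured in the same basis. The caveat in the comments is the book's whole point: a pre-shared classical bit can mimic this particular pattern, but no such scheme reproduces the correlations seen when the two sides measure along different axes.

```python
import random

# Toy model of measuring a maximally entangled pair in the SAME basis on both
# sides: each observer sees a 50/50 random bit, yet the two bits always agree.
# NOTE: this classical "shared coin" reproduces only the same-basis case; it
# cannot reproduce the different-basis correlations that make entanglement
# genuinely non-classical (Bell's theorem), which is what the comic explores.
def measure_entangled_pair():
    shared = random.randint(0, 1)   # outcome carried by both particles
    return shared, shared           # (Alice's result, Bob's result)

alice, bob = zip(*(measure_entangled_pair() for _ in range(10_000)))

print("Alice's fraction of 1s:", sum(alice) / len(alice))                  # about 0.5
print("Bob's fraction of 1s:  ", sum(bob) / len(bob))                      # about 0.5
print("Runs where they agree: ", sum(a == b for a, b in zip(alice, bob)))  # 10000
```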
I am an academic researcher and an avid non-fiction reader. There are many popular books on science or music, but it’s much harder to find texts that manage to occupy the space between popular and professional writing. I’ve always been looking for this kind of book, whether on physics, music, AI, or math – even when I knew that as a non-pro, I wouldn’t be able to understand everything. In my new book I’ve been trying to accomplish something similar: a book that can intrigue readers who are not professional economic theorists, one they will find interesting even if they can’t follow everything.
A simple (not perfect) test of whether you’re going to love this book: Just check out the author’s blog, called “shtetl-optimized”. The style is similar: sharp, funny, mixing professional theoretical Computer Science with broader takes.
I am still in the middle of the book, and nevertheless, I’m happy to recommend it. As an amateur with superficial CS knowledge, I am enjoying this introduction to classical complexity theory and the basic theory of quantum computation.
Aaronson’s distinctive style makes the ride all the more enjoyable. It’s neither a “real” textbook nor a pop-science book. It’s in a weird space somewhere in between, and I love it!
Written by noted quantum computing theorist Scott Aaronson, this book takes readers on a tour through some of the deepest ideas of maths, computer science and physics. Full of insights, arguments and philosophical perspectives, the book covers an amazing array of topics. Beginning in antiquity with Democritus, it progresses through logic and set theory, computability and complexity theory, quantum computing, cryptography, the information content of quantum states and the interpretation of quantum mechanics. There are also extended discussions about time travel, Newcomb's Paradox, the anthropic principle and the views of Roger Penrose. Aaronson's informal style makes this fascinating book accessible…
I am a professor of philosophy at New York University, but my interests have always fallen at the intersection of physics and philosophy. Unable to commit to just one side or the other, I got a joint degree in Physics and Philosophy from Yale and a PhD in History and Philosophy of Science at the University of Pittsburgh. My fascination with Bell’s Theorem began when I read an article in Scientific American in 1979, and I have been trying to get to the bottom of things ever since. My most recent large project is serving as Founder and Director of the John Bell Institute for the Foundations of Physics.
John Bell’s theorem about the unavoidability of what Einstein called “spooky action-at-a-distance” in quantum mechanics set off the second quantum revolution, leading to quantum computation, quantum cryptography, and quantum teleportation, among other advances. This book collects Bell’s most important papers, which range in style from professionally mathematical to popular and intuitive, so there is something for everyone. Beginners can start with “Quantum Mechanics for Cosmologists” or “Six Possible Worlds of Quantum Mechanics” or “Bertlmann’s Socks and the Nature of Reality” or “La Nouvelle Cuisine”. Experts can learn from “Against ‘Measurement’”. People interested in the mathematical details can find them, and people scared by math can largely avoid them.
John Bell, FRS was one of the leading expositors and interpreters of modern quantum theory. He is particularly famous for his discovery of the crucial difference between the predictions of conventional quantum mechanics and the implications of local causality, a concept insisted on by Einstein. John Bell's work played a major role in the development of our current understanding of the profound nature of quantum concepts and of the fundamental limitations they impose on the applicability of the classical ideas of space, time and locality. This book includes all of John Bell's published and unpublished papers on the conceptual and…
I've been teaching math and physics for more than 20 years as a private tutor. During this time, I experimented with different ways to explain concepts to make them easy to understand. I'm a big fan of using concept maps to show the connections between concepts and teaching topics in an integrated manner, including prerequisites and applications. While researching the material for my book, I read dozens of linear algebra textbooks and watched hundreds of videos, looking for the best ways to explain complicated concepts intuitively. I've tried to distill the essential ideas of linear algebra in my book and prepared this list to highlight the books I learned from.
This is a good example of a book that makes a complicated topic accessible and easy to understand. Strictly speaking, this is not a linear algebra book, but quantum computing is so closely linked to linear algebra that I'm including this gem.
Prof. Wong covers all quantum computing topics in a straightforward and intuitive manner. He goes out of his way to prepare hundreds of examples of quantum circuits that made my life easy as a reader. What I like particularly about this book is that it explains all the derivations and all the details without skipping any steps.
I can recognize the work of a true master teacher: whenever I ran into a confusing concept, it was explained a few lines later, as if reading my mind.
I played semi-professional baseball in France in 1986. If your baseball career has brought you to France, you should be rethinking your professional aspirations. No problem, I thought. I will write. I like to write. To my dismay, publishers were not fans of novels about French baseball players. The world of espionage I became acquainted with in Europe, however….
Ignatius’s most recent novel is in many respects a mashup of books no. 1 and 2 on this list: terrific storytelling plus the latest spy tech. You’ll conclude that it’s just a matter of time until “bad actors” (spy speak for “bad guys”) can hack your brain. At the same time, you’ll enjoy the story.
A hyper-fast quantum computer is the digital equivalent of a nuclear bomb; whoever possesses one will be able to shred any encryption and break any code in existence. The question is: who will build one first, the U.S. or China?
In this gripping thriller, U.S. quantum research labs are compromised by a suspected Chinese informant, inciting a mole hunt of history-altering proportions. CIA officer Harris Chang leads the charge, pursuing his target from Singapore to Mexico and beyond. Do the leaks expose real secrets, or are they false trails meant to deceive the Chinese? The answer forces Chang to question…
Saying just the right words in just the right way can cause a box of electronics to behave however you want it to behave… that’s an idea that has captivated me ever since I first played around with a computer at Radio Shack back in 1979. I’m always on the lookout for compelling ways to convey the topic to people who are open-minded, but maybe turned off by things that are overly technical. I teach computer science and study artificial intelligence as a way of expanding what we can get computers to do on our behalf.
The fields of Psychology, Economics, and Biology are well-known for offering interesting and informative introductory courses that provide a doorway into the area for budding scientists but also essential background knowledge appropriate for any educated person.
In Computer Science, we don't really do things that way. I wanted to offer a new kind of Computer Science introductory course that laid out the coolest ideas we have to offer along with compelling descriptions of why they matter.
I ended up using this book as the required reading in the class I built because it tells a personal, moving story while taking the reader from the nuts and bolts of bits and bytes all the way up to cutting-edge ideas surrounding artificial intelligence. It's a great read! Plus, it's short so I thought I could get my students to actually finish it.
Most people are baffled by how computers work and assume that they will never understand them. What they don't realize, and what Daniel Hillis's short book brilliantly demonstrates, is that computers' seemingly complex operations can be broken down into a few simple parts that perform the same simple procedures over and over again. Computer wizard Hillis offers an easy-to-follow explanation of how data is processed that makes the operations of a computer seem as straightforward as those of a bicycle. Avoiding technobabble or discussions of advanced hardware, the lucid explanations and colourful anecdotes in The Pattern on the Stone go straight to the heart…
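In the same spirit, here is a hedged little sketch (my own, not code from the book) of the blurb's central claim: one trivial part, reused over and over, is enough to build up real computation. A single NAND function is combined with itself to make NOT, AND, OR, XOR, and finally a one-bit adder.

```python
# One simple part: the NAND gate. Everything below is built only from it.
def nand(a, b):
    return 0 if (a and b) else 1

def not_(a):    return nand(a, a)
def and_(a, b): return not_(nand(a, b))
def or_(a, b):  return nand(not_(a), not_(b))
def xor_(a, b): return and_(or_(a, b), nand(a, b))

def half_adder(a, b):
    """Add two bits; returns (sum, carry). NAND all the way down."""
    return xor_(a, b), and_(a, b)

for a in (0, 1):
    for b in (0, 1):
        s, c = half_adder(a, b)
        print(f"{a} + {b} -> sum {s}, carry {c}")
```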
My background is in computer science, specifically artificial intelligence. As a student, I was most interested in how our knowledge of the human brain could inform AI and vice versa. As such, I read as much neuroscience and psychology as I could and spent a lot of time thinking about how our minds create reality out of our senses. I always appreciate a novel that explores the fluidity of reality.
Would you like to live forever—or barring that, for a really long time? If the answer is yes, then who are you? Is the person you were last month you? If your consciousness from last month could be transferred to a clone of your body, would that clone be you?
Matthew FitzSimmons explores the reality of who we are and more in his fast-paced mystery sci-fi novel Constance.
If you’re like me, and you feel a hole in your reading life when you finish this book, the good news is that the sequel is just a click away. Enjoy!
A breakthrough in human cloning becomes one woman's waking nightmare in a mind-bending thriller by the Wall Street Journal bestselling author of the Gibson Vaughn series.
In the near future, advances in medicine and quantum computing make human cloning a reality. For the wealthy, cheating death is the ultimate luxury. To anticloning militants, it's an abomination against nature. For young Constance "Con" D'Arcy, who was gifted her own clone by her late aunt, it's terrifying.
After a routine monthly upload of her consciousness – stored for that inevitable transition – something goes wrong. When Con wakes up in the clinic, it's eighteen months later…