CMIS 310 · Course Textbook

Introduction to
Digital Computation

A First-Principles Approach
By Wade Berner · Southern Illinois University Edwardsville · 2026

Everything is a computer. Okay, not literally everything — but far more than you might suspect. You're probably aware that desktops, laptops, and the phones in our pockets are all general-purpose computers with essentially the same hardware, running essentially the same operating systems and software. You may be less aware that the same is true of many other devices in our personal and professional lives. Home appliances like "smart" refrigerators, the digital X-ray machine at your dentist's office, gaming consoles, digital business phone systems, IP video surveillance cameras, fast-food ordering kiosks — computers, all of them. "The Cloud," which backs up your iPhone photos and powers your apps? That's just someone else's computer. Those AI datacenters that everyone hates but we keep building? Those are just warehouses with super-expensive air conditioners, full of computers.

The digital computer is ubiquitous and becoming ever more so. And while the world of technology changes rapidly, the foundational architecture of the digital computer has remained largely unchanged for decades. If this book had been written in 1985, its contents would have been essentially the same; if it is read in 2065, I suspect learners will still find it useful. (At least as useful as learners in 2025 do, which is yet to be determined.)

This book is for anyone who wants to understand how digital computers actually work — not just how to use them, but what's happening underneath. Whether you're a student exploring computation for the first time, a professional looking to change careers, or simply a curious learner, no prior experience in computer science or engineering is required. All you need is curiosity and a willingness to think logically about how the pieces fit together.

This is an interactive, first-principles guide to the fundamentals of modern digital computing. You'll engage with concepts through real-world examples, hands-on widgets, and visual diagrams that break down complex systems into understandable parts. We'll build your understanding one layer at a time — from the physical meaning of a 1 and a 0, through processors, operating systems, and networks, all the way up to cloud infrastructure and information security. Every concept connects to the one before it.

If you're aiming for a career in cybersecurity, AI, datacenter infrastructure, or any of the booming technical fields of today and tomorrow, this knowledge is essential. So, as Lewis Carroll's King of Hearts would say: let's begin at the beginning — in our case, defining the very concept of digital computation itself.