Please note: If you see any glaring errors, feel free to post a comment.
The Basic Parts
There are a number of parts that are common to every home computer, whether it’s a Windows machine or a Mac, desktop or laptop.
The Hard Disk
This is a “short stack” of disks with a magnetic coating on both sides. Tiny arms with magnets on the end scan back and forth across the surface of the disks as they spin at around 7,000 rpm. The arms float so close to the actual disks that a smoke particle wouldn’t fit in the space in between. They’re that fast, that precise, and that fragile.
Hard disks hold huge amounts of information. In pure text form, your hard drive could hold most of, if not all of, the Library of Congress.
This is where all of your programs and files are stored.
The Memory, or RAM
RAM stands for Random Access Memory. When you open a file, the program that opens it (for example, Word opens Word files) is read into memory, along with the file itself. This is because a program in memory can be accessed much faster than a program on your hard disk. Remember how a hard disk is spinning thousands of times a minute? Sure, that’s fast, but it’s nothing compared to the memory chips that make up your RAM. Data moves around on those chips as electrical signals traveling at a significant fraction of the speed of light.
And Now, Some Math Explaining How a Computer Sees the World
The tiniest piece of data there is is called a “bit.” A bit can have only two values: off and on. Well, that’s not very useful by itself, so we collect 8 bits together and call it a byte. What’s so special about a byte? It’s just 8 bits, 8 little on/off switches. Oh, but it’s amazing what you can do with just 8 on/off switches.
Let’s say that each switch represents a number that is twice as big as the previous one, like this: 1 2 4 8 16 32 64 128. Now, say you wanted your byte to have a value of 3. That’s easy: turn on the first two bits (1+2). Or how about something bigger, like 31: 1+2+4+8+16, or the first five bits. If you add up all the previous bits, they’ll always equal one less than the next bit (1+2+4=7, or one less than the next number, 8). So we can represent any number from 0 up to 255 with just one byte.
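To make the switch arithmetic concrete, here’s a tiny sketch in Python (the function name is just for illustration):

```python
# A byte as 8 on/off switches. Each switch is worth twice the previous one:
# 1, 2, 4, 8, 16, 32, 64, 128.
def byte_value(switches):
    """Add up the values of the switches that are 'on' (True)."""
    values = [1, 2, 4, 8, 16, 32, 64, 128]
    return sum(v for v, on in zip(values, switches) if on)

# Turn on the first two switches to get 3:
print(byte_value([True, True, False, False, False, False, False, False]))  # 3
# Turn on the first five to get 31:
print(byte_value([True, True, True, True, True, False, False, False]))     # 31
# All eight switches on gives the maximum:
print(byte_value([True] * 8))                                              # 255
```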
Here’s one way that computers actually use bytes to store information: There are 26 letters in the alphabet, plus 26 more for capitals, plus 10 digits. That’s 62 different characters, not counting punctuation. So one thing you could do with a byte is store one character in it. Let’s say “a” is represented by the number 0 (all bits off), “b” is 1, “c” is 2, etc. You could put five bytes in a row with these values: 7 4 11 11 14… and that would translate into “hello.”
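A minimal sketch of that toy letter scheme (real computers use a standard table called ASCII, but the idea is the same):

```python
# Toy encoding from the text: "a" is 0, "b" is 1, ... "z" is 25.
def encode(word):
    return [ord(ch) - ord("a") for ch in word]

def decode(numbers):
    return "".join(chr(n + ord("a")) for n in numbers)

print(encode("hello"))             # [7, 4, 11, 11, 14]
print(decode([7, 4, 11, 11, 14]))  # hello

# In the real ASCII table, "h" happens to be 104, not 7 — but either way,
# one byte holds one character.
print(ord("h"))  # 104
```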
Starting with the bit (a single on/off switch,) here are some common memory sizes:
8 bits = 1 byte
1,000 bytes = 1 kilobyte (1K)
1,000 kilobytes = 1 megabyte (1M or 1MEG)
1,000 megabytes = 1 gigabyte (1G or 1GIG)
1,000 gigabytes = 1 terabyte (1T)
(Strictly speaking, each step up is 1,024 rather than 1,000 — computers think in powers of two — but the round numbers are close enough for our purposes.)
The document you’re reading is around 58,000 bytes long (58K.)
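As a sketch, here’s how you might turn a raw byte count into those units in Python (using the round numbers from the list above; the function name is made up):

```python
def human_size(n_bytes):
    """Express a byte count in the units from the list above."""
    for unit in ("bytes", "K", "M", "G"):
        if n_bytes < 1000:
            return f"{n_bytes:g} {unit}" if unit == "bytes" else f"{n_bytes:g}{unit}"
        n_bytes /= 1000
    return f"{n_bytes:g}T"

print(human_size(58_000))         # 58K — about the size of this document
print(human_size(1_000_000_000))  # 1G
```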
Okay, so what do we do with all these bits and bytes?
The CPU
CPU stands for Central Processing Unit. This is the heart of any computer, large or small. It is a tiny wafer of silicon into which millions of microscopic transistors have been etched with special acids. Each transistor can be switched “on” or “off” (like a bit, remember?) Modern CPUs have around 125,000,000 transistors, connected by lines a tiny fraction of the thickness of a human hair. These tiny chips – not even big enough to cover Roosevelt’s face on a dime – can consume as much electricity as a bright incandescent light bulb, and nearly all of that power comes back out as heat.
The CPU is hooked directly or indirectly to every part of your computer, and it has the ability to pass commands to those parts, and receive information in return.
A computer program is a set of instructions for the CPU. Well, not just instructions, but also “conditional” instructions, like a big “what to do if…” book. IF the user chooses “Open” from a menu, do this. IF the user types an “X,” do that. Every feature of a program, and every variation of that feature, and every error that could arise from that feature… it all has to be written out in the computer program.
I hope everyone’s read or at least knows about those “Choose Your Own Adventure” books. Well, that’s what a program is like. The CPU starts on page 1, goes until it reaches a decision, or branch point, then decides, based on what’s loaded into memory, which page to flip to next: The user has pressed a key. Is it a letter? Yes. Go to the page that handles pressing a letter key. Is the Command key being held down? No. Go to the page that handles actual typing, instead of keyboard commands. Is the shift key being held down? No. Go to the page that handles typing a lower-case letter. What letter is it? “x.” Go to the page that handles “x.”
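In code, that chain of branch points might be sketched like this (a toy Python version; the function and its outcome strings are made up for illustration):

```python
def handle_key(char, command_down, shift_down):
    """Walk the same decision 'pages' as the adventure-book example."""
    if not char.isalpha():
        return "handle non-letter key"
    if command_down:
        return "handle keyboard command"
    if shift_down:
        return "type upper-case " + char.upper()
    return "type lower-case " + char.lower()

print(handle_key("x", command_down=False, shift_down=False))  # type lower-case x
print(handle_key("x", command_down=True, shift_down=False))   # handle keyboard command
```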
In reality, it’s much more complex and much more granular than the example above. That’s why programmers get the big bucks.
Oh, and there’s one other huge little thing the CPU is doing: It’s running the operating system (e.g., Windows or Mac OS X), which is handling all sorts of low-level stuff like drawing windows, moving the mouse cursor, keeping your RAM organized, deciding which programs get how much time with the CPU, and a million other things.
Fun Fact: I actually worked in a factory that made CPUs for a little while. The environment had to be kept super clean, as even a dust particle could ruin a whole batch of chips. We workers wore “bunny suits,” white overalls that covered you from head to toe. We also wore masks, so only our eyes were showing. More than once, I didn’t recognize someone when I saw them with their whole face exposed.
When entering the clean part of the factory, we’d step into a chamber which blew air at us from below at fifty miles per hour, as a final insurance against stray dust. In the clean area itself, more giant fans blew air from below and out the ceiling. It never happened while I was there, but if there had been a power outage, we would have had ten minutes to evacuate the clean area before the buildup of acidic fumes from the etching chemicals would have rendered us unconscious.
So the CPU is reading the operating system book when it’s told the user has double-clicked on Word. The CPU sends a message to the hard drive to jump the tiny floating magnetic heads to the part of the disk where Word is written and read it into RAM. Once it’s in RAM, the CPU will start reading the Word book and reacting to the user’s input.
Above, I focused on the parts of your computer that could slow it down, the parts you want to make sure you get right when buying a computer. I’ll wrap that up below, but first let’s fill in some gaps…
Keyboard and Mouse
These are known as “input devices,” since you use them to input information to the computer.
I won’t go into the details of how an electrical signal gets from the mouse to the computer. Suffice it to say that it won’t affect the speed at which your computer runs.
The CD/DVD Drive
A CD is like a really long message in Morse code. The surface has a series of microscopic pits and flat spots – millions of them, all in one continuous line. The laser shines on the surface and reads the pits and flats by measuring how much light is reflected back. And like everything else in the computer world, the pits and flats are converted to… bits.
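As a toy sketch, here’s how a run of reflectivity readings could become bits and then a byte (real CDs actually encode bits as transitions between pit and flat, plus heavy error correction, so treat this purely as an illustration):

```python
# 1 = flat spot (reflects the laser), 0 = pit (scatters it).
# In this toy model, each reading is simply one bit.
readings = [1, 1, 0, 0, 0, 1, 0, 1]
bits = "".join(str(r) for r in readings)
print(bits)          # 11000101
print(int(bits, 2))  # 197 — eight readings make one byte
```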
Fun Fact: Unlike records, where the first track is on the outside of the disk, CDs and DVDs are recorded starting at the innermost part of the disk and spiraling outward.
The Graphics Processor
This is only important if you like playing games. The graphics processor is really a whole ‘nother CPU. But this one is only in charge of drawing things on your screen. When you’re playing a game, having a fast graphics processor means you’ll be able to see better 3-D characters and landscapes, and at a faster frame rate, because the processor is doing the huge amount of math involved to bring such things to life.
The Monitor
Almost all new monitors are flat LCD monitors. Older, “TV-like” monitors are called CRTs.
A CRT monitor (and this is how your TV works, too) is a big vacuum chamber. At the back of this chamber are three heated filaments that give off electrons. A large magnet concentrates the electrons into three tight beams. Then a large magnetic field guides the beams to a single point on the front of the screen (also the front of the vacuum chamber.) The screen is coated with a tiny grid of red, green, and blue phosphors (materials that emit light when struck by radiation.)
The magnetic field guides the three beams to one point on the screen, where they each hit a different phosphor dot, either red, green, or blue. By varying the intensity of the beams, you can make almost any color from the combination of red, green, and blue.
Then the magnetic field is altered ever so slightly, and the beams now point at the next dot on the screen (an individual dot is called a “pixel” in computerese.) At the same time, the filaments are altering their intensity to create the right combination of colors for this new dot.
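That red/green/blue mixing is exactly how colors are still written today. A quick sketch, where each beam intensity is one byte, 0–255:

```python
def rgb_to_hex(red, green, blue):
    """Pack three 0-255 intensities into the familiar #rrggbb web-color form."""
    return f"#{red:02x}{green:02x}{blue:02x}"

print(rgb_to_hex(255, 0, 0))      # #ff0000 — pure red
print(rgb_to_hex(255, 255, 0))    # #ffff00 — red + green make yellow
print(rgb_to_hex(255, 255, 255))  # #ffffff — all three at full power: white
```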
A typical monitor has around 1,300,000 of these dots. And they’re all hit with the beams at least 60 times per second! (Televisions have a much less demanding 252,000 dots, and they’re only hit 30 times a second.)
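Here’s the arithmetic behind those numbers, assuming a common resolution of that era (1280 × 1024 is my assumption, not a figure from the text):

```python
width, height = 1280, 1024  # assumed monitor resolution
pixels = width * height
print(pixels)        # 1310720 — about 1.3 million dots
print(pixels * 60)   # 78643200 — individual dot refreshes every second
```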
Fun Fact: While working at Apple, I sat next to someone who was testing their first huge, 21” monitor. It was a prototype with no shielding on the back. Whenever he turned it on, it produced an electrical pulse so big, his hair would frizz out for a second.
LCDs, the flat, thin monitors that are dominant today (and present in every single laptop), are a completely different beast. No electron guns, no magnets, no vacuum chamber, and no radiation.
LCD stands for Liquid Crystal Display. For every tiny dot on the screen, there’s a little transistor and capacitor hooked up to some liquid crystal material, and it’s all sandwiched between two pieces of glass. Behind the glass is a steady light source. Tiny wires send electricity to each dot. Different amounts of electricity cause more or less of the liquid crystal to line up in order, and allow different amounts and colors of light to get through from the light source behind the glass.
Consider that a typical LCD monitor will have at least 1,300,000 pixels (dots.) LCDs are becoming less expensive because there are now a greater number of factories around the world that are capable of creating large, perfect batches of such a huge, complex, microscopically-detailed item.
The Software
Whether it’s the operating system (e.g., Windows or Mac OS X) or a web browser or a game or a word processor, all the software you have on your computer was written by the world’s best supercomputer: humans. Still, humans have been known to make mistakes, or to succumb to the pressure of a deadline, or to be greedy, or to be simply uninterested in the quality of their work.
Most of the programs you use today are still written by large corporations who ultimately answer only to their shareholders. As a particularly vile manager once told his employees (me among them,) “If we found out that we could make a higher profit by just firing all of you and instead selling all the art on the office walls, we’d do it in a second.” Most programmers and testers do take pride in their work. However, projects inside corporations all have deadlines, and sometimes there’s only so much you can do in a given amount of time.
There’s another way to make software that you might have heard of: “Open Source.” An open source project is created by a group of volunteer programmers and testers, all of whom can access a common set of code. Each person in the group makes a contribution to the code from within their area of expertise. It’s the ultimate in collaboration, and almost always it’s done for free. Why would highly skilled programmers donate their time to make a program that a corporation would charge hundreds of dollars for? One common reason is to keep the software “open,” that is, not under the control of a single company. If software is “open,” it’s more likely to be changed and updated based solely on the needs of its users, as opposed to the needs of a company’s bottom line. Or in a worst-case scenario, dropped altogether by a company that sees the software as “unprofitable.”
At this point in history, open source software is making major gains on corporate software in terms of stability and usability. However, for the foreseeable future, corporations will be able to radically outspend and thus out-develop most open source projects.
Fun Fact: When you write a piece of software that’s going to be used by the general public, you have to take every piece of text that the user will see (e.g., “Save before closing?”, “File not found”, etc.) and put it into a special file. Then translators can translate that part of your program into Japanese, French, Spanish… the local language of any country where it will be used. I’m proud to say that two of my programs for Apple were translated into well over a dozen languages. Neat… I mean… bueno.
Buying Your Computer
And now, to wrap it all up. What should you look for when buying a computer? Well, most of the parts I mentioned above have become like commodities. RAM and hard disks especially are available from dozens of companies, each trying to undercut the other’s price and performance.
How long will your computer “last?” That is, how long will it be useful? The co-founder of Intel, Gordon E. Moore, observed back in 1965 what became known as “Moore’s Law”: the number of transistors on a chip doubles roughly every two years (the popular version says every 18 months). In other words, a computer twice as fast as your kick-ass new toy will exist within a couple of years. And as people with older computers have discovered, whenever the hardware guys provide a more capable chip, the software guys will eat up that new capability with more complex software – software that will be slow on older, less capable chips. For example, a “fast” computer from 1989 would not be able to play an MP3, because it wouldn’t be able to “decode” the file into music fast enough.
Meanwhile, RAM keeps getting cheaper, and hard drives keep getting cheaper. Your current home computer has more hard disk space than most banks did in the 1970’s. Your iPod is more powerful than the computer aboard the Apollo spacecraft.
If you want your computer to last, you should buy the best machine available for the money you’re willing to spend. Fastest CPU (most important,) most RAM (almost as important,) and a hard disk that will meet your needs (unless your needs involve editing video, the hard disk specs are less important.) Using this strategy, your computer’s “useful life,” that is, the amount of time before it becomes notably “slow,” or unable to run some newer programs, will be around four years. This is probably the way to go for the average person, whose primary uses for a computer include e-mail, web surfing, writing, home finance, music listening, video watching, and maybe a game or two.
If you want to update your computer on a regular basis, say, every two years, you can afford to take a step or two down from “the best.” By the time the “next generation” comes out, you’ll be ready to make another purchase. As someone who makes his living on a computer, this is the strategy I use. Since I can’t afford to use a computer that has become “slow,” I upgrade part or all of my computer on a much faster basis than most people.
Fun Fact: The first computer I ever owned was a Commodore PET. It had a whopping 8K of RAM. Not 8Megs, 8K. 8,000 bytes. The case was solid steel. There was no hard disk, only a cassette tape player that you could record your programs onto. It took three minutes to read in a full 8K program. There were no graphics on the screen, only text, and there was no mouse. It cost $800 in 1978.
I’m a Mac person. I bought one of the original Macs in 1984, I’ve worked for Apple, and I use a program that’s only available on the Mac to do much of my work.
I’ve also worked for Microsoft, and my job there required me to use a Windows machine for five years. So, I’ve been in both “camps,” seen both sides, and I’ve chosen the Mac.
To me, the Mac OS (OS X) seems much more robust, much more “finished.” It doesn’t feel like it was rushed out the door. It feels like it was made for real people, not computer geeks. If I ask it to do something, it does it. And it doesn’t give me grief, or mysterious errors I can’t possibly understand. OS X is not perfect, it’s just way, way, way closer to perfect than Windows.
Over the past couple of years, my friends have come to depend on me for help when something goes wrong with their computer.
When a Mac person calls me with a question, it’s mundane: How do I duplicate this CD? How do I get Real Player?
When a Windows user calls me, it’s: Why is my computer so slow all of a sudden? Why won’t it boot anymore? Why are there 25 alerts when I start my computer? Why can’t I play music? Why can’t I sync up my Palm Pilot? Where did my network go? I’m a good person… why is this happening to me?
I think the longest I’ve spent trying to resuscitate a dying Windows machine is ten hours. Not a month goes by without a one to two hour phone call where I try to guide someone through a debilitating Windows problem, a problem that has rendered their $1,000 – $2,000 machine useless. Now I know how the mission control guys felt during Apollo 13.
Sometimes I have to make housecalls. The most viruses I’ve ever removed from a Windows machine: 168.
You might have heard recently that Apple decided to start using Intel CPUs – the same CPUs used in most Windows machines – in their new Macs. This doesn’t really matter to most users. However, a new program (called “Parallels”) has been created which allows a Mac to also run Windows programs. Yes, a window pops up on your Mac’s screen, and inside is… Windows. So, if you want the ease and stability of a Mac, but still want to be able to occasionally run a Windows program should the need arise, now you can do it.
Apple also has a program that lets you boot your Mac into either a Windows partition or OS X (the Mac operating system.) It’s called “Boot Camp.” I prefer Parallels because I don’t have to reboot to get to a Windows program that I might only need for a second. On the other hand, Boot Camp is free.
And don’t worry about Macs using the same CPUs as Windows machines. The main problem with Windows machines is Windows, not the CPU.
To me, this gives the “what kind to buy” question a very simple answer: Get a new Intel-based Mac. By the way, just to be clear, it doesn’t work the other way around. You cannot buy a Dell or HP computer and run Mac software on it.
There’s also the matter of “style” and engineering prowess. Macs are constantly winning awards for the best computer design, both inside and outside. When you walk into a Starbucks with an HP laptop, nobody ever goes, “Ooooh.” I’m not implying that a Mac will get you laid, but I am suggesting it.
Fun Fact: A friend of mine said, “Ken, the way you talk about Macs… You’re in a Mac cult.” I told her that cults are collections of people who engage in behaviors harmful to themselves and others, justified by strange, made-up rules that only make sense to other cult members. Mac users are not in a cult. Windows users are. I’m a deprogrammer.
If you do decide to buy a PC, go to a computer store and actually try out the model you’re thinking of buying to make sure it fits your needs. Chances are, the staff will be completely unable to answer any of your questions, or help you in any way. See if you can find the model you want to buy on Amazon, or some other place that has owner feedback.
If you decide to get a Mac, go to an Apple Store and try out the model you’re thinking of buying. On average, the staff should be able to answer your questions. Before you buy, you might want to check out the Apple Refurb Store (on the Apple web site, go to the “Store” section, then look for the “Save” tag near the bottom right.) There are computers there that are basically new, but because the box was opened or the machine was used for a few days, they can’t be sold at full price. They come with a one year warranty, like regular Macs, and they clean and test the machines to make sure they’re “like new.” I have such a machine, and I’ve helped several other people buy Macs from the Refurb Store. None of us has ever had a problem with our purchase.
Okay, so, that’s all I can think of. A little history, a few things to watch out for, how stuff works. I hope you’ve gained something from the read.