Introduction

College students have long played an integral role in the development and adoption of new technology. Students, along with businesspeople, comprised the bulk of the portable electric typewriter market in the 1960s and 1970s. In the mid-1970s, two students—Bill Gates and Steve Ballmer—met while living in the same hall at Harvard, and went on to play critical roles in the development of the personal computer in the 1980s and 1990s. Universities were among the first institutions to support the growth of the internet, and for a time provided high-speed internet access to more people than did corporations. In the late 1990s, a Northeastern University student named Shawn Fanning and his uncle developed Napster, one of the first popular peer-to-peer file sharing programs. Again at Harvard, Mark Zuckerberg and fellow computer science majors developed Facebook, which was initially only available to college students, but now is the second most-trafficked website, after Google. Google itself was born through the collaboration of two Stanford University graduate students, Larry Page and Sergey Brin.

Icons of file sharing, social media, and internet search: all hatched on college campuses

Today’s college students are universally expected to be computer-literate. Virtually every college campus in America has computing centers with anywhere from a handful to hundreds of networked systems available for student use, and most campuses provide extensive wireless internet access. Technophile professors like my own graduate adviser at the University of Wisconsin-Madison, John Hawks, often communicate with students via blogs, Twitter, and even Facebook. Many assignments must be submitted electronically, and professors increasingly incorporate novel forms of coursework and evaluation, such as videos uploaded to YouTube and wikis produced by students. In short, it is nearly impossible for today’s college student to succeed without extensive use of computing technology. And the millions of Americans who take online distance-learning courses are, of course, entirely dependent upon access to a personal computer and the internet.

What kind of technology does a college student need to buy?

To be blunt, the answer is not much. Most colleges and universities provide more than enough access to technology that some students never buy a personal computer, let alone a printer, scanner, or other gadgets. I wouldn’t recommend going without: it’s inconvenient, it ties your schedule to lab hours, school-provided hardware can be aggravatingly outdated, and campus networks do not always work. Still, college is already incredibly expensive, and you can’t cut your technology budget below zero dollars. If you do attempt to earn your degree without your own PC, become thoroughly familiar with your school’s technology resources first. For most students, though, this is a less-than-ideal arrangement, and spending some money on personal technology can make life much, much easier.

College is not just about learning Latin declensions, radioisotope decay chains, and great works of fiction. It’s also about learning how to live more or less independently. Our lives are steeped in technology, and just as with any job, there is no one correct technology setup for every student. The most basic computing solution for a college student is a single personal computer, be it a desktop or a laptop.

A desktop or a laptop?

In the context of college, desktops and laptops each have advantages and disadvantages. Desktops almost always offer more power for the money, are easier to repair and to upgrade as your needs change, and are harder to steal or lose. On the other hand, they take up more space and aren’t portable. A laptop’s most notable advantage is portability: you can take it anywhere to get work done. Laptops also occupy less volume, a major consideration in a cramped dorm room. But they’re a prime target for theft on campus, and they cost more for equivalent specifications.

Since the rise of netbooks and the ever-decreasing cost of desktops, I’ve come to think that “desktop or laptop?” is the wrong question. Netbooks frequently sell for less than $300, with some as inexpensive as $200 (or even less on sale or clearance), and a basic desktop can be built or bought for $500 or less, monitor included. Rather than deciding between a laptop and a desktop, it’s wiser to ask yourself what your computing needs actually are. Most college students need to browse the web and use office applications to type papers and make presentations, and these tasks do not require the latest and greatest (and therefore most expensive) tech. If you don’t need more than basic computing capabilities, I’ve found that pairing an inexpensive netbook or budget laptop with a standard office desktop is a far better solution than owning one powerful laptop or potent desktop.

Another important consideration is how long you expect your computer(s) to last. It is perfectly reasonable to expect today’s budget gear to handle web browsing and paper writing passably for the next four years. It is not reasonable to expect it to play 2015’s games or run Adobe Creative Suite 6 or 7 very well. Predicting what you’ll need over four years is difficult, but speaking with older students in your program and with your professors can give you a good idea of what you’ll be doing as a senior. For those looking to buy a new PC, laptop or desktop, the next few pages cover DIY and off-the-shelf (retail) desktop computers and monitors as well as netbooks and laptops.