In 1984, I got my first computer after having worked all summer at a local pig farm to save enough money (seriously!). It was a Commodore 64. I was out of my mind with excitement about it until I realized that the computers in my high school's new lab were Apple IIs. My ability to transfer my skills (and...errr...acquire new software) from school to home would be severely limited. I sold the C64 and bought an Apple II+. I haven't left the Apple family since.
I loved that computer, tricked out with a green screen monitor, dual 5.25" floppies, and an Apple joystick. I used to spend hours in my room pounding away at the keyboard, poring over the manuals (the II+ came with a technical manual that included a circuit diagram and a printout of the OS code, believe it or not). Even though I loved all things electronic, I never had much of an aptitude for math, though I always had a deep love of science. I was interested in stories, getting my computer to do cool stuff, and (of course...I was a teenage boy!) playing games. To me, my trusty Apple II+ wasn't "technology" that required a deep knowledge of math and electronics to use - it was a box of possibilities that allowed me to express my own odd sense of humor by making it do stuff it wasn't supposed to do. It was a sandbox where I could play and, if I made a mess, I could always just delete it.
I had no idea what I could do (or, probably more importantly, what I couldn't do), so I just went for it. I'd always loved language, so a lot of my early experiments in coding involved writing programs to translate sentences in odd ways, writing very basic interactive fiction (kind of like "choose your own adventure" stories), or prank programs that would fill the screen with blinking characters, someone's name rendered in big letters made up of smaller letters, or say rude things when someone was foolish enough to type something at the command prompt. Yeah, they were pretty amateurish and a far cry from what real programmers could do, but they were a heck of a lot of fun to make.
Naturally I signed up for programming classes in high school. But that's when things began to change. All of a sudden I was being forced to write programs to solve boring math problems when instead I really wanted to write games. I retaliated in my FORTRAN class by writing "Mole Hunt," an ASCII-based game where the player used the arrow keys to move a "mole" (rendered using a capital "M") around an island (stunningly depicted using keyboard characters) to avoid getting eaten by foxes (again, believe it or not, rendered using a capital "F") and escape over a bridge. Not having any real way to fight back against the foxes, the mole used his only advantage: access to four "mole holes" located in the corners of the rectangular island to randomly "travel" to one of the other "holes" to avoid the foxes. It was surprisingly fun, though hardly Super Mario World.
I stopped taking programming classes after my second one because they just didn't seem relevant. I wanted to play. I wanted to warp words. I wanted to play jokes on my friends. Unfortunately, my teachers thought I needed a Stephen Hawking-level understanding of complex mathematics. They wanted me to solve boring "so a train leaves the station traveling 50kph" problems to learn programming. Not my thing. I turned my pursuits to music and girls and getting into college (that list is in actual 17-year-old priority order, by the way) and moved on. But I always kept my Apple II+ by my side, even if the educators had convinced me that I wasn't a "real" computer person since I wasn't all that enthused about math.
In college I became an English and psychology major, one advantage of which was that I only had to complete up through pre-calculus. I still loved tinkering with my computer, though, so when I had the chance I decided to take the hardest programming class I could get myself into: VAX assembly language. I should have known it was a bad idea when, on the first day, the professor (having seen my major listed on his roster) called me aside after class and asked me point blank "Why the f**k are you in my class?" I told him I thought it was interesting. He told me he wouldn't help me because he had "real" computer science students who needed to pass the class. I refused to drop out. Game on.
Unfortunately while I was able to understand the incredibly arcane "language" that was VAX assembler (programs were written by literally moving bytes around in memory by using three-letter codes), the assignments were math problems that I couldn't do on paper, much less instruct a computer how to solve. The best I could do was write programs that compiled without crashing and would accept an input and produce an output. I didn't even have the math skills to check if the output was right. But I was stubborn: if it compiled and ran, I turned it in. I got a "D" in the class and was never prouder of a grade that wrecked my GPA.
In grad school (English literature), I focused on critical theory and played with my computer in my spare time. I bought a Mac Classic with my student discount (still over $2,000 at the time) and discovered a program called HyperCard, which let you write programs by creating "cards" for input and output. Better yet, you could link cards together with "hyperlinks" to create nifty stuff like annotated stories. You could even include pictures! To my 1991 self, this was magic. I could finally use my computer to do the kinds of things that interested me.
At the same time I started to read a lot of French philosophers who wrote about how meaning was constructed in language through "difference": the way words take their meaning from their links to other words (scholars out there, please forgive this gross oversimplification of Jacques Derrida, OK?). That's when I had my epiphany: computers (especially HyperCard) worked the same way! I had finally found a way to merge my interests. I was completely hooked!
Coincidentally, it was also around the same time that the web was being born: Tim Berners-Lee had recently invented the World Wide Web, and Mosaic, the first web browser to really catch on, soon followed. I stumbled across it after sneaking into one of the computer science labs (the ones with the UNIX machines), and all of a sudden I knew I'd discovered the thing that was going to change the world. Not to pat myself on the back too much, but it turns out that I was right.
Why did the first clunky, ugly web browser have such a profound effect on me? Simply put, it finally gave me the ability to do what I'd always wanted to do with computers: communicate, entertain, and share my ideas with other people. As lame as HTML 1.0 may seem by today's standards, it provided an easy-to-use and powerful way to connect knowledge and, more importantly, publish new things to a global audience by completely bypassing all the old gatekeepers and intermediaries. It was a blank canvas for creative ideas. It was a global sandbox waiting to be shaped into whatever we could imagine. And it really did change the world - without requiring that those who wanted to use it be able to factor fourth-order polynomial equations.
I'm writing this right after the death of Steve Jobs because I wanted to pay tribute to how his creativity changed my world for the better and opened up the technological revolution that has improved lives all over the world. After thinking about what he'd accomplished, it occurred to me that his true visionary spirit and creativity came out of the fact that he wasn't a "techie" in the classic sense. In his obituary for Steve Jobs in "The New York Times," John Markoff said it best when he wrote:
"Mr. Jobs was neither a hardware engineer nor a software programmer, nor did he think of himself as a manager. He considered himself a technology leader, choosing the best people possible, encouraging and prodding them, and making the final call on product design."
When I read that I realized "Hey! That's me!" and, more importantly, that's all of us who work to create new things on the web. Sure, some of you reading this may be coders or engineers, or (shudder!) managers, but the one thing that I'd bet we all have in common (based on the hundreds of folks in the biz I've met over the years) is that at heart we're generalists who happen to have realized that we could use technology to express ourselves in ways that would never have been possible otherwise. At our best we're inventors, innovators, writers, strategists, and creators who know that we succeed not by our individual actions but by using our talents to inspire teams of people to bring (to paraphrase Mr. Jobs) "insanely cool" new stuff into the world online through our collective efforts.
That's the legacy of Steve Jobs. He's shown that it's possible to be insanely successful by taking risks, being true to your vision, and by being smart about collecting and guiding the talents of those we work with. He's shown that you don't have to be a "techie" to use technology to change the world; you just have to understand the possibilities and have the courage and vision to push the limits of those possibilities. Steve Jobs may have left us, but his work has inspired countless others who, like him and like us at our best, continue to see the possibilities.
Thanks, Steve. We'll miss you.
Sean Carton has recently been appointed to develop the Center for Digital Communication, Commerce, and Culture at the University of Baltimore and is chief creative officer at idfive in Baltimore. He was formerly the dean of Philadelphia University's School of Design + Media and chief experience officer at Carton Donofrio Partners, Inc.