Shocking, but true: At one time, creatures called “typists” roamed corporate halls.
Mind you, I thought I’d never actually seen one of these “typists,” but recently, after undergoing intense regression hypnosis, my wife revealed she got her big break in advertising as a so-called “copy typist.” Yes, unbelievable as it sounds (I’m still seeking independent confirmation of the claim), her job consisted of taking handwritten ad copy and typing it into her computer so it could be used for layouts or scripts. Wow.
Silly as this job sounds now, it made sense at the time. Computers (and things called “word processors”) were fairly new in the workplace. Creative types couldn’t be called upon to learn the intricacies of new technology. They continued to produce copy in longhand. Companies knew they needed copy in digital form and actually paid human beings (not well, I gather) to move data from ink to bits.
Of course, over the years computers became easier to use. Foot-dragging types were finally forced to learn how to use them. This change, one in which technology became an easy-to-use commodity, radically transformed the workplace. Typists, secretaries, mechanical paste-up artists, and others whose skills were in difficult-to-acquire specialties became irrelevant. Now most are gone, except the few who remain as the servants of highly paid, dinosaur executives who bellow out for someone to “take an email.”
The same thing happened in the online world. In 1996, I paid princely sums to HTML jockeys. Most sites had to be hand-coded. Today, HTML coders are a dime a dozen and rapidly diminishing even further in value as Web development technology progresses.
Sure, the skills are somewhat important for cutting-edge work, but I foresee a day in the not-so-distant future when I’ll be able to drive my pickup truck down to a local university that’s still cranking out HTML people and pick up the new media day laborers milling on the corner, looking for jobs.
Step back and examine these trends over time. You’ll discover a pattern. New technologies develop, then a select few learn them and score highly paid jobs because their skills are scarce. Eventually, the technology becomes easier to use and those skills are less important. The market is glutted with people who jumped on the bandwagon late, and, finally, the technology develops to a point where it’s a commodity purchased by the lowest bidder.
This pattern continues in Web development today. Unfortunately, many clients and companies haven’t caught on. They’re making big mistakes… mistakes that are often reflected in the way they bid out work. Mistakes that can have a major impact on a project’s success or failure.
Let’s step back in time again and trace where these trends lead.
In the mid-’90s, everything in Web development was made to measure. Everything from content management systems (CMSs) down to Java news tickers was constructed from scratch. Web development companies hired experts in everything. Developing a Web site was more about technology than anything else.
Over time, that’s changed drastically. Today, almost anything can be bought shrink-wrapped (if not downloaded). It’s been through enough versions to actually work. Some big players create amazing software that can be installed in days rather than built over months of custom coding.
Accompanying the move to tech standardization is another trend. The installed base of products is pushing out custom solutions. Very few entities other than new companies create new Web sites. Though we’re still in the midst of moving nearly everyone from static to dynamic sites, that trend will soon crest. In the short term, any site of consequence will be built on dynamic, database-driven platforms.
What happens to companies that invested in major CMSs? First, they’ll be pretty resistant to changing those CMSs. Once databases are developed, business rules defined, workflows specified, and staff retrained, few companies want to invest the time (or expense) in changing systems, unless those systems really stink (many don’t). Standardized technology established a beachhead. It’s not going away any time soon.
What’s Web development’s future? Technology will be a smaller issue when companies revamp their Web presences. Sure, they may integrate new technologies and swap out solutions that haven’t scaled sufficiently, but you can be sure once someone has invested $500,000 in a Vignette implementation, he’s not going to be too eager to switch when it’s time to upgrade the brand or launch a new initiative.
Technology drove Web development for years. Now, it’s a commodity. Even if people want custom work, competition and pressure from offshore coding shops drive prices ever lower. New tools make it ever easier to do the job.
As technology becomes more accessible and established, branding pressures will become more intense. The Web has a leveling effect on corporate brands. On the Internet, everyone (to a large extent) is in danger of looking the same to their potential customers. Quality and a certain level of design are “table stakes.” It’s expected any company worth its salt will have a decent site. The trick is to rise above the competition by providing a superior experience.
Tom Peters once said, “Design is the principal difference between love and hate.” It’s a profound statement and gets to the heart of what we should do online. Consumers don’t care about technology behind a site as long as it doesn’t stand in their way. They do care about the brand experience, the emotional connection, usability, and a site’s ability to solve the problems or answer the questions that drove them there. These are currently addressed with design and information architecture.
When Napster exploded, IT people got all giddy about “peer-to-peer technology.” It wasn’t tech that drove people to Napster. Napster let them download free music. This is true of most technology: It’s not what it does that matters, it’s what people can do with it.
It’s important we learn the past’s lessons. Recognize the most important part of any Web development project isn’t the technology behind the site, but the creativity that pulls it together. Those of us in the business (and those who hire people to do Web development) must understand differentiation can only be achieved through a site’s brand experience, not its technology. Which technology to use is moot. Many technologies are as good as any other, and lots work pretty darn well. Once they do, it rarely makes sense to change. It’s the brand that matters.
A Web development firm needs a handle on the tech but also needs to recognize it’s easier than ever to integrate something off the shelf or hire programmers when required. Clients or companies putting out requests for proposal must understand that having superior technology and having a superior brand experience are two separate issues. Hiring a firm that specializes in technology and treats design and branding as afterthoughts makes about as much sense as hiring copywriters because they’re adept with Microsoft Word. It makes more sense to separate superior technology and superior creative in budgets and procurement.
Meet Sean at ClickZ E-Mail Strategies in New York City on May 19 and 20.