Since the early 1980s, I’ve been privileged to witness several “revolutions” in computing.
Looking back, now that we’ve seen how they played out, each seems inevitable, although the real lesson of each is that how – and even whether – a seeming technology revolution will play out is impossible to see at the time.
The microcomputer revolution of the late 1970s and early ’80s saw computing personalised. It wasn’t initially driven by large companies, either as providers or users of the technology; it was enthusiasts and hobbyists who created and bought the early Apple and other microcomputers that long predated the IBM PC and MS-DOS, which became the model for personal computers for the next 25 years or more.
Desktop publishing, enabled in part by the GUI and WYSIWYG revolution, and by the laser printer (along with PostScript, the foundation of Adobe’s success) – all technologies that are essentially the result of research at Xerox’s fabled PARC, even if largely commercialised elsewhere – is, as I’ve written before, an under-appreciated stepping stone toward the Web.
It created new marketplaces for digital content (ultimately printed on paper), an explosion in magazines, and increased the number of people capable of creating and designing content – writing, illustration, photography, and page layout – that set the scene for the Web, a few years later.
Without this cohort of digital creators, would the Web have had the content, and the creators of it, so vital for those early adopters to find value in?
The Web itself seems inevitable now, but it was far from it in 1990, when many commercial hypertext systems, and commercial internet or internet-like providers like AOL and CompuServe, were already successful and – compared to the Web – relatively sophisticated.
The smartphone revolution, driven initially by the iPhone and then increasingly by Android, also seems inevitable now. How could it have turned out any other way?
But, as a recent in-depth history of the iPhone made plain, within Apple there was a strong push for a phone that was essentially an extension of the iPod – not at all the powerful Web-enabled computer in your pocket that the iPhone became – and it was a very real contender for what Apple launched in 2007.
If the University of Minnesota hadn’t started charging licensing fees for Gopher servers in 1993 – at the time, Gopher was at least as well established and advanced as the Web, perhaps even more so – would Gopher have become the global interconnected network that the Web became?
Things are never inevitable. Not in our individual lives, and not in broader cultural and societal trends.
We have to make bets, based on hunches, because the alternative – doing nothing until the outcome is entirely clear – is, at least for many of us, no alternative at all.
Yesterday, we held our first AI conference. In my opening remarks, I observed that I studied AI at university in the 1980s, and my enormous enthusiasm was somewhat diminished when it turned out it wasn’t about Turing tests and Asimovian intelligent robots, but hill-climbing algorithms.
AI has, in many ways, been the “next big thing” for many years, perhaps even since the 1950s. But, as you might be able to tell from my recent writing and presentations, from the AI conference, and from a good deal of the content at our Summit in November, my hunch is that now is the time to really start thinking hard about the impact of these technologies on our work, our businesses, and our products.
As someone observed yesterday at the AI conference, “No-one knows anything about this stuff”. But I remember when that was true of the Web, of the mobile Web, and even of personal computers.
As I wrote a little cheekily last week, and repeated yesterday: if you do it today, you’ll look like a genius. If you’re not doing it in two years’ time, you’ll look like an idiot.
A rare chance for me to come speak to your team
If you are based in Australia, and keen for me to come and speak to your team – indeed, to anyone in your company you think might find value in these thoughts (and who might help you move your organisation toward adopting or further investing in these technologies and approaches) – then early next week we’re announcing something that you’ll be the first to hear about here.
For teams of eight or more who attend Web Directions Summit, I’ll come to your company and deliver my new presentation, The Web in a post app world, which looks at the place the Web has in a world increasingly driven by AI, AR, and non-screen-based interfaces.
Not only that, your whole team will get access not just to the videos from the Summit, but also to those from our two other major conferences in 2017, Respond and Code. And all that for the price of a Classic ticket.
To get all this, and have me come speak (and do a Q&A session, as well), just register eight or more folks from your company, and we’ll organise it all.
These places are strictly limited, so if you’re keen, start organising your team now. And if you have any questions, just drop me a line.