A broad concept, technology can refer both to tangible tools, such as utensils and machines, and to intangible systems like computer programs. It can also encompass anything that enhances learning or makes certain tasks easier to perform, such as videoconferencing tools for collaboration in the classroom.
The word ‘technology’ may bring to mind a lot of consumer gadgets – mobile phones, the Internet, computers, big TVs and hi-fis, cars, drones or robotic grass cutters. But in the enterprise world, talk tends to gravitate towards what’s known as IT, Information Technology – the hardware and software that underpins these other technologies.
In the 1999 book Visions of Technology: A Century of Vital Debate About Machines, Systems, and the Human World, Pulitzer Prize-winning historian Richard Rhodes assembled a fantastic collection of essays about technology from a wide range of scholars and thinkers across the 20th century. It’s a great book to have on your bookshelf, and it reveals some interesting differences in how people understood technology at various times.
One common idea is that, at a fundamental level, every technological device necessarily prioritises some routes and ends while neglecting others. For example, when digital cameras became commonplace, they shifted the route to photographs away from film and darkrooms. It’s not that analogue photography was ‘worse’ than digital, but that the new technology simply made the old way seem less attractive or necessary.
Another important idea is that, for any technology to be effective, it must ‘seize the means as well as the ends of the activity’. It must efficiently route people’s finite energy and attention, and that necessarily involves a hierarchy of priorities.