In his 1964 book, Understanding Media, Marshall McLuhan stated
that "…by means of electric media, we set up a dynamic by which all
previous technologies - including cities - will be translated into
information systems" (McLuhan, 1964). In 1966, Karl Steinbuch, a German
computer science pioneer, predicted that "In a few decades time,
computers will be interwoven into almost every industrial product"
(Mattern & Floerkemeier, 2010, p. 242).
The World Wide Web (Web 1.0), a network of linked HTML documents that
resided on top of the Internet architecture, characterized the early
days of the Classical Internet, the Internet as we know it today. This
network of static HTML pages progressively evolved into Web 2.0, a term
describing the use of World Wide Web technology and web design to
enhance creativity, secure information sharing, collaboration, and the
functionality of the web. With Web 2.0, two-way communication became
ubiquitous and allowed user participation, collaboration, and
interaction (Whitmore, Agarwal, & Da Xu, 2015). Web 2.0 technologies
include social networking services, electronic messaging services,
blogs, and wikis—technologies that have become indispensable to modern
social interaction and to global business.
While Web 2.0 currently dominates the Internet, the Semantic Web, or
Web 3.0, has emerged: a technology that makes marked-up web content
understandable by machines, allowing machines and search engines to
behave more intelligently (Whitmore et al., 2015). Marking up web
content in standardized formats would allow machines to process and
share data on their own, without the need for human mediation (Whitmore
et al., 2015). Alongside developments in the Internet technologies,
technologies in Sensor Networks, Near Field Communication using RFID
tags, synthetic biology, biotechnology, cognitive sciences, and
nanotechnology have also been evolving. The convergence of Web 2.0, Web
3.0, and these technologies has led to a paradigm referred to as the
Internet of Things (IoT).
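To make the Semantic Web idea concrete, the following sketch shows machine-readable markup in JSON-LD, one of the W3C-standardized formats used for this purpose. The schema.org vocabulary is real, but the device name, identifier, and sensor values here are hypothetical illustrations only.

```python
import json

# A minimal, machine-readable description of a "thing" in JSON-LD form.
# schema.org terms are used; the device and its values are hypothetical.
thing = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Smart Thermostat",
    "identifier": "urn:example:device:42",  # hypothetical device ID
    "additionalProperty": {
        "@type": "PropertyValue",
        "name": "temperature",
        "value": 21.5,
        "unitCode": "CEL",  # UN/CEFACT code for degrees Celsius
    },
}

# Serializing to JSON lets any machine on the network parse the
# description and act on it without human mediation.
payload = json.dumps(thing)
parsed = json.loads(payload)
print(parsed["additionalProperty"]["value"])
```

Because both the structure and the vocabulary are standardized, a search engine or another machine can extract the temperature property from this payload without any human explaining what the fields mean.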
IoT is maturing and remains one of the most hyped concepts in the IT
world. It was added to the 2011 annual Gartner Hype Cycle, which tracks
technology life cycles from "technology trigger" to "plateau of
productivity," and it hit the Hype Cycle's "Peak of Inflated
Expectations" in 2014. As of August 2017, the term IoT was still at the
"Peak of Inflated Expectations". Gartner's Information Technology Hype
Cycle (Gubbi et al., 2013) is popularly known for representing the
emergence, adoption, maturity, and impact on applications of specific
technologies (Ferguson, 2002). In 2012, IoT was forecast to take
between 5 and 10 years to reach market adoption, and every indication
now suggests that prediction was right.
Riggins and Wamba (2015) grouped the levels of IoT adoption through Big
Data analytics usage into the following categories:
- Society level, where IoT mainly influences and improves
government services by reducing cost and increasing government
transparency and accountability;
- Industry level, in which manufacturing, emergency services,
retailing, and education have been studied as examples;
- Organizational level, in which IoT can bring the same types of
benefits as those mentioned at the society level;
- Individual level, where daily-life improvements and individual
efficiency and productivity growth are marked as IoT benefits.
The IoT has been referred to with different terminologies, but the
objective of IoT is the same in the broad sense (Madakam, Ramaswamy, &
Tripathi, 2015). The taxonomical labels of IoT include Internet of
Everything, Web of Things, Internet of Objects, Embedded Intelligence,
Connected Devices and Technology Omnipotent, Omniscient and Omnipresent.
In addition to these taxonomical labels, IoT has also been variously
described as follows (Madakam et al., 2015):
- Cyber-Physical Systems: integrations of computation and
physical processes that bring the real and virtual worlds
together.
- Pervasive Computing: a computing environment in which virtually
every object has processing power and a wireless or wired connection
to a global network.
- Ubiquitous Computing, or Calm Technology: where technology
becomes virtually invisible in our lives.
- Machine-to-Machine Interaction: end-to-end communication
between devices with no human intervention.
- Human-Computer Interaction: the study, planning, and
design of the interaction between people and computers.
- Ambient Intelligence: a developing technology that will
increasingly make our everyday environments sensitive and responsive.
There are varying definitions of IoT, and no standard one is agreed to
by all. However, there is a common understanding of what it is and of
its prospects. "What all of the definitions have in common is the
idea that the first version of the Internet was about data created by
people, while the next version is about data created by things”
(Madakam, Ramaswamy, & Tripathi, 2015, p. 165). A thing in the IoT
can be a person with a heart monitor implant, a farm animal with a
biochip transponder, a crop with a nanochip for precision agriculture,
an automobile that has built-in sensors to alert the driver when the
tire pressure is low—or any other natural or man-made object that can
be assigned an IP address, and provided with the ability to transfer
data over a network (Shin, 2014).
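The essence of a "thing" described above, an addressable object that transfers data over a network, can be sketched in a few lines of Python. The sensor name and reading are hypothetical, and loopback UDP stands in for whatever transport a real IoT device would use.

```python
import json
import socket

# Hypothetical reading from a tire-pressure sensor "thing".
reading = {"device": "tire-pressure-sensor", "psi": 28.4, "low": True}

# A listener standing in for the network service the thing reports to.
receiver = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
receiver.bind(("127.0.0.1", 0))  # OS assigns a free port
addr = receiver.getsockname()

# The "thing" transmits its reading over the network.
sender = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sender.sendto(json.dumps(reading).encode(), addr)

# The service receives and decodes the data, ready to alert the driver.
data, _ = receiver.recvfrom(4096)
received = json.loads(data.decode())
sender.close()
receiver.close()
print(received)
```

The point of the sketch is only that once an object has a network address and a data format, any other machine can consume what it reports, which is the minimal property that qualifies an object as an IoT "thing".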
Fundamentally, the IoT can be described as a global network that
facilitates human-to-human, human-to-things, and things-to-things
communication, encompassing anything in the world, by providing a
unique identity to every object (Aggarwal & Das, 2012). Madakam et al.
(2015) define IoT as "An open and comprehensive network of intelligent
objects that can auto-organize, share information, data, and resources,
react and act in the face of situations and changes in the environment"
(p. 165).
Figure 2. The Quantum of Internet of Things