Tony Afejuku
The Internet of Things
Talk about the Internet of Things (IoT) and you would not be
mistaken either. This is for its ability to abridge things, reduce exertion
and eliminate the drives of yesterday. We may argue that IoT is suitable for
learning, bringing about a seemly or conducive environment, solving intricate
problems and making for a greater experience. Outside learning, it is superb for
remote working, for automation and for good time management, amongst other
things.
Bruce Schneier describes it as “a web that we connect to. It
is a computerised, networked, and interconnected world.” He stresses that this is
the future of living, which we refer to as the Internet of Things (IoT). It is
truly the future, where the things we could know by other means are harvested by
devices and presented to us in quicker, smarter paces and places.
It is a future in which humans may not be able to compete with
these devices. Common to all of them is that they are still evolving. Whilst we might find them
awesome, we are assured of greater surprises because the competition and the
innovations are continuing. Important in this matrix is the confusion of
learning, or the changing narrative of pedagogy. It tends to minimise research,
as we find what we research already available in AI.
In simplifying processes and procedures, the technologies
are fast diminishing scholarship as we know it, and challenging the sequence of
imbibing or internalising. That sequence was once owned by humanity, through
a systematic process. It involves stages of listening, reading, experiencing,
trying and failing, and trying again. It encompasses structures and
institutions and a painstaking governance process, emplaced by the state in
agreement with the citizenry.
While these bodies remain, the intervention of technology
has become disruptive, likely forcing us to review things many times over.
This is because of the merging of truth and falsehood and the linking of hate
with love.
It is so because of the obfuscation of the real and the
fake, or of what is and what is not. While humanity is at once somewhat
celebratory, it is also jittery about declining values and virtues. The good
and the bad are now being muddled.
Learning is dignifying and confidence-building; it exposes
what needs to be positively exposed. It grows the individual as much as it
benefits society. It contributes to human resource development for the
enhancement of society and the future of mankind. But this is the case only when
learning is original, natural, and scientifically and technologically
procedural. It is the case when learning is easily
identifiable, reflected in the change it brings to the individual.
With the new technologies, however, we are in a world
of assisted learning, a world of machine learning, a kind of scholarship in which
originality is threatened, obliterated and undefinable. Learners are in a new
world where the temptation to outsource to the technologies is high. This
temptation is almost innate and unconscious, given the fast pace at which the
machines cover huge spaces and the manner in which perfection seems likely.
Resulting therefrom is a reduction of the dynamism that would
have helped internalisation, of the processes that would have assisted memory and
of the institutions that would have been useful for monitoring and governance. ChatGPT
is a quick case to reference. Its capacities are thrilling, for it is able to
process things probably as closely as a human would have done, or even better in
some cases.
Upon instruction, it responds with appreciable precision,
leaving the learner to do little or nothing, depending on the quality of the
question. Here then lies the question of what more the learner should do. How
do we then contextualise the losses in the important process of engaging texts,
pictures and images, which could be didactic? Deepfake is another case, for its
capacity to mimic and parrot the real.
Despite the thrill around it, the uninitiated could get
swayed, deceived or confused. He might be carried away by a confusing reality,
leading to a wrong action or, at the least, the formation of wrong notions.
Don’t get us wrong.
The inventions are groundbreaking, pointing to a new world
of growth in thinking capacities and creations. In these learning transitions,
however, machines and robots must never replace human activity, reasoning,
rationality, values and feelings.
After all, Mark Zuckerberg is quoted as having said, “Whenever
I hear people saying AI is going to hurt people in the future I think, yeah,
technology can generally always be used for good and bad and you need to be
careful about how you build it. If you are arguing against AI then you’re
arguing against safer cars that aren’t going to have accidents, and you are
arguing against being able to better diagnose people when they are sick.”
Regardless, the attendant downsides, especially for learning,
are an invitation to more critical evaluation. We are surely at another
crossroads between a past and a present, of the kind that has always marked inventions. The
aeroplane came and eased long-distance travel, but crashes also came with it.
The motorcar similarly came with accidents. Electricity came with
sparks and shocks that could lead to deaths (and actually did).
So did the invention of dynamite, leading to misuse,
terrorism, wars and deaths. With particular reference to dynamite, Alfred
Nobel, its chief inventor and commercialiser, had to establish the Nobel Prizes
in Physics, Chemistry, Medicine, Literature and Peace to rescue his name and
legacy from being tagged (and being remembered) as that of a merchant of death.
Despite the age and the flipsides of these inventions, humanity
has never ceased to ask questions about them. It is still trying, year in
and year out, to perfect them or to encourage their appropriate use. The coming of the
Internet, the rise of Artificial Intelligence, the Internet of Things (IoT) and
hypertext documents may not be different in revealing their downsides, one
of which is the redefinition of learning.
Elon Musk, one of the key actors amongst the big techs,
recently warned at MIT’s AeroAstro Centennial Symposium that he is
“increasingly inclined to think that there should be some regulatory oversight,
maybe at the national and international level, just to make sure that we do not
do something very foolish. I mean with artificial intelligence, we are
summoning the demon.”
We are in the age of technological advancement without a
human face or humane attributes, an age which beckons the end of humanity,
the end of the world, at the irrational, distempered moment of pressing the
lethal button of the hydrogen bomb.
It may not be demonic, but the outsourcing of reasoning
(surrendering the power of reasoning to machines, and robots) is some kind of
wizardry requiring our continuous examination, just so that reading or learning
will not die. This is the point that Klaus Schwab made when he opined that “We
must address, individually and collectively, moral and ethical issues raised by
cutting-edge research in artificial intelligence and biotechnology, which will
enable significant life extension, designer babies, and memory extraction.” We
cannot agree more.
Again, do we need an educational system patterned after
foreign cultures and experiences or a system that evolved from our original
needs? Do we have a true understanding of our peculiar histories and
environments such that our curriculum, theories and notions are instructed by
these shared values?
How much effort do we invest in reviews, just so that we
are not parroting models that are foreign to us and are likely to be irrelevant
to our practical realities?
Are we also sure we are not carried away by fancies and
fantasies that are distant from our industries and production lines, which could
otherwise have boosted our economy for the common good? Abigail Adams told us that
“Learning is not attained by chance; it must be sought for with ardour and
attended to with diligence.” Interesting in this quotation is the element of
diligence, which envisages the need to be painstaking.
Doing this will open our minds to our common desires,
regarding whether or not we need more agriculturists, more technologists and
technicians, and more statisticians, humanists and storytellers than space
scientists, astronauts, or similar others.
Being diligent will permit us to locate our needs for a
system in transition, just so we do not have to look elsewhere for the
real manpower we need because we have either over-trained or under-trained
ourselves.
Let us remember also that global education funds are
available in billions of dollars. An estimate puts it at over $100,000 billion,
accessible through grants and aid, if we get our arts and research concepts right.
Corporations and philanthropists are also ever ready to support ideas that can
lead to innovations, change, and new paths of profits.
We cannot overstate the fact that research is a driver of
development. Development is the centrepiece of social progress. The university
should be the leader in this narrative, especially in linking the peculiarities
of communities with knowledge production, partnerships, and collaborations, in
addition to the execution of the policy outlines of government. These are
triggers of competition, necessarily leading to quality and efficiency.
“A man’s mind, stretched by new ideas, may never return to
its original dimension” and state, Oliver Wendell Holmes Jr. once told us,
speaking of the alteration of the mind through education. In making alterations, however,
we are asking that they should have a root, or relevance, for substance and for
the proper benefit of a society in transition.
To be continued next Friday
Afejuku can be reached via 08055213059.