• With the exception of MYCIN, a system that identified bacteria and recommended antibiotics for patient treatment, and work published in the Proceedings of the IEEE that furthered natural language processing, the late 1970s marked the beginning of a slowdown in AI investment, interest, and research that culminated in the bursting of the AI bubble in the 1980s.
• The bursting of the AI bubble continued into the 1990s, driven by a lack of adequate basic theories, a myriad of accident-prone autonomous vehicle attempts, and the failure to deliver the major impact that AI had promised society. There were still some advancements, such as the development of the chatbot A.L.I.C.E. (Artificial Linguistic Internet Computer Entity) in 1995, enabled by the Web; the emergence of the term Internet of Things (IoT) in 1999; LG's announcement of the first Internet refrigerator; the introduction of real-time locating systems (RTLS) in 1998; and the ongoing research at IBM's Watson Research Center.
AI demonstrated its merits anew in 2011, when IBM's Watson competed on Jeopardy! and defeated two of the game's best champions. Furthermore, AI reascended as the enabling technology for the 4th Industrial Revolution (Industry 4.0), catapulted in 2011 by IoT, digitalization, and the IP-enabled smart applications being implemented today and envisioned for tomorrow in response to 5G and other yet-to-be-defined emerging technologies.
ICT’S SUBTLE
CONTRIBUTIONS TO
AI ADVANCEMENTS
Most historic advancements in AI are attributed to university researchers working in conjunction with leading software giants. However, one of the most successful AI applications was ICT specific: in the late 1970s, AT&T used artificial intelligence to develop its Automated Cable Expert (ACE) system, which was later implemented within the Bell System in 1982. This telephone cable maintenance system provided timely troubleshooting reports and management analyses for the repair of high-capital-cost equipment, work once performed by expert maintenance personnel.10
When considering innovation in machine automation and AI, ICT field technicians and installers often do not realize that many optical fiber fusion splicers have evolved into amazing computers. Evolving from large, clunky machines into IP-enabled, hand-held splicing devices, today's fusion splicers have eliminated many of the manual processes and much of the human error in optical fiber termination.
Working behind the scenes in some of the higher-end core alignment and V-groove mass (ribbon) splicers are sensors, lab-quality lenses, and a great deal of intelligent back-end software that combines geometry and the laws of physics (e.g., Marcuse's equation for core alignment) with imaging processes and AI algorithms to account for the multitude of factors that can cause a bad splice.
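To illustrate the kind of physics such software can draw on, here is a minimal Python sketch of the widely cited Marcuse approximation for splice loss caused by lateral core offset. It is not any vendor's actual firmware; it assumes two identical single-mode fibers, and the function name and example values are hypothetical.

import math

def marcuse_offset_loss_db(offset_um, mode_field_radius_um):
    # Marcuse's Gaussian-mode approximation for loss caused by lateral core
    # offset between two identical single-mode fibers:
    #   loss(dB) = -10 * log10(exp(-(d/w)**2))  ~=  4.343 * (d/w)**2
    # where d is the lateral offset and w is the mode field radius.
    d_over_w = offset_um / mode_field_radius_um
    return -10.0 * math.log10(math.exp(-(d_over_w ** 2)))

# Hypothetical example: a 0.5 um core offset on fiber with a 5.2 um mode
# field radius (roughly standard single-mode fiber at 1550 nm) -> ~0.04 dB.
print(round(marcuse_offset_loss_db(0.5, 5.2), 3))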
To obtain precise splice loss estimations, the fusion splicer integrates lab-grade lenses with sensors to successfully extract the vital information (e.g., curves) from the fiber image. Once the curves are derived, an algorithm compares the result to hundreds of images stored in the splicer's memory. The stored loss estimates are then compared to the calculated ones to obtain the most accurate splice loss possible before OTDR testing, helping to avoid network failures and unnecessary, costly downtime.
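As a rough illustration of that compare-and-refine step, the Python sketch below matches measured curve features against stored references and blends the recorded loss with the calculated one. The data structures, feature values, and simple averaging rule are assumptions made for illustration, not a vendor's actual algorithm.

from dataclasses import dataclass

@dataclass
class ReferenceSplice:
    features: list      # curve metrics extracted from a stored fiber image
    loss_db: float      # splice loss recorded for that reference

def refine_loss_estimate(measured_features, calculated_loss_db, references):
    # Find the stored image whose curve features most closely match the
    # measured ones (squared-difference distance), then blend its recorded
    # loss with the physics-based calculation to settle on a final estimate.
    def distance(ref):
        return sum((m - r) ** 2 for m, r in zip(measured_features, ref.features))
    best_match = min(references, key=distance)
    return (calculated_loss_db + best_match.loss_db) / 2.0

# Hypothetical usage with made-up feature vectors and losses.
refs = [ReferenceSplice([0.12, 0.40, 0.08], 0.03),
        ReferenceSplice([0.55, 0.10, 0.31], 0.12)]
print(refine_loss_estimate([0.14, 0.38, 0.09], 0.05, refs))  # -> 0.04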
Clearly, many of today's fusion splicers have, through their evolution, become AI-enabled computers in their own right.
Research forecasts predominantly agree that global AI use will grow at a CAGR of between 50 and 63 percent over the next three years.
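For a sense of scale, the short Python sketch below simply compounds that forecast range over three years; the figures are purely illustrative of the cited 50 to 63 percent CAGR.

# What the forecast range implies when compounded over three years.
for cagr in (0.50, 0.63):
    multiplier = (1 + cagr) ** 3
    print(f"{cagr:.0%} CAGR over 3 years -> about {multiplier:.1f}x growth")
# Prints roughly 3.4x for 50 percent and 4.3x for 63 percent.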