The development of information technology is
well documented. Processing power has grown, and cost fallen, exponentially
in line with Moore's law for several decades, leading some experts to claim
that technological development has reached an inflection point beyond which
computers will become capable of solving problems not previously considered
solvable. Smartphones, tablets and wearable devices illustrate the computer's
transition from unwieldy equipment to the desktop and, increasingly, to the
person. Touch screens, voice activation and head-up
displays are expanding the ways in which users interface with systems. The
shift to server-side processing, the proliferation of data and constant
connectivity allow access to the world's knowledge and huge processing power
anywhere and at a potentially tiny marginal cost. Modular development, open
source platforms and increasing interoperability allow relatively small organisations
to develop significant pieces of IT infrastructure.
Ethical, legal, regulatory and security
concerns, along with failed top-down projects, local fragmentation, long
procurement cycles and barriers to supplier entry have meant that IT in healthcare
has lagged behind the cutting edge. Even in healthcare, though, the Internet has
proved to be a powerful source of information for clinicians and patients.
Clinicians are increasingly using Electronic Patient Records to keep structured
care notes that are accessible throughout organisations and beyond. The public
are accessing personalised health and lifestyle apps in large numbers, with
some even investing in wearable monitoring technology.
Systems have been developed that can make
diagnoses and suggest management plans based on signs, symptoms and
investigation results. Some of these systems can process natural language
treatment protocols and clinical research papers and can interface with
electronic record systems. Arguably,
however, these innovations have not yet had a fundamental impact on the
healthcare of most patients.