
The idea was to store documents on server computers and to give each document a unique name that a browser program could use to locate and retrieve it. Because these unique names (called uniform resource locators, or URLs) are long, including the DNS name of the host on which the document is stored, URLs would be represented as shorter hypertext links in other documents. When the user of a browser clicks a mouse on a link, the browser retrieves and displays the document named by the URL.
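The retrieval step this paragraph describes, a client resolving a URL and fetching the named document, is easy to see in miniature. Below is a minimal Python sketch using the standard library's urllib; the URL is a placeholder chosen for illustration, not one from the original text.

```python
from urllib.request import urlopen

# A URL names both the host (via its DNS name) and a document on that
# host. Any HTTP client can do what a browser does: resolve the name,
# connect, and retrieve the document. The URL below is a placeholder.
with urlopen("http://example.org/") as response:
    document = response.read().decode("utf-8")

print(document[:200])  # the beginning of the retrieved document
```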

This idea was implemented by Timothy Berners-Lee and Robert Cailliau at CERN, the high-energy physics laboratory in Geneva, Switzerland, funded by the governments of participating European nations. Berners-Lee and Cailliau proposed to develop a system of links between different sources of information. Certain parts of a file would be made into nodes, which, when called up, would link the user to other, related files. The pair devised a document format called Hypertext Markup Language (HTML), a variant of the Standard Generalized Markup Language (SGML) long used in the publishing industry. The system was released at CERN in May 1991. In July 1992, a new Internet protocol, the Hypertext Transfer Protocol (HTTP), was introduced to improve the efficiency of document retrieval. Although the Web was originally intended to improve communications within the physics community at CERN, like e-mail 20 years earlier, it rapidly became the new killer application for the Internet.

The idea of hypertext was not new. One of the first demonstrations of a hypertext system, in which a user could click a mouse on a highlighted word in a document and immediately access a different part of the document (or, in fact, another document entirely), occurred at the 1968 Fall Joint Computer Conference in San Francisco. At this conference, Douglas Engelbart of SRI gave a stunning demonstration of his NLS, the oN-Line System (Engelbart, 1986), which provided many of the capabilities of today's Web browsers, albeit limited to a single computer. Engelbart's Augment project was supported by funding from NASA and ARPA. Engelbart was awarded the Association for Computing Machinery's 1997 A. M. Turing Award for this work. Although NLS never became commercially successful, its mouse-driven user interface inspired researchers at Xerox PARC, who were developing personal computing technology.

Widespread use of the Web, which now accounts for the largest volume of Internet traffic, was accelerated by the development in 1993 of the Mosaic graphical browser. This innovation, by Marc Andreessen at the NSF-funded National Center for Supercomputing Applications (NCSA), enabled the use of hyperlinks to video, audio, and graphics, as well as text. More important, it provided an effective interface that allowed users to point and click on a menu or fill in a blank to search for information.

The development of the Internet and the World Wide Web has had a tremendous impact on the U.S. economy and society more broadly.

In the 1960s, discussions began about how computers might enhance the practice of medicine. Computer technology seemed to hold promise for improved decision making by clinicians. It was believed that doctors might be able to use computers for much faster access both to procedure results and to the literature. Computers might also help caregivers reduce medical errors through the provision of reminders and alerts. Nonetheless, at that time, physicians generally did not adopt computers, on a number of sensible grounds. Equipment then was expensive, slow, cumbersome, and unreliable. Medical administrators shied away from investing in technology that could not yet guarantee sufficient financial benefit. Furthermore, four decades ago physicians preferred to maintain their own autonomy and had scant interest in formal systems to support their medical decisions.

During the 1980s, computers improved dramatically. Graphical user interfaces and networking technologies to connect computers were adopted widely. These developments fostered a new need: the need for a data interchange protocol for health care. This need, in turn, led to the creation of Health Level 7 (HL7). HL7 refers to standards for the electronic exchange of clinical, financial, and administrative information among health care-oriented computer systems. Yet at the same time, the implementation of diagnosis-related groups (DRGs) for hospitals and the rise of managed care significantly lowered funding levels for health care providers. Medical practices had little cash for information systems despite growing evidence that such systems could improve outcomes and reduce costs.

In 1991 the Institute of Medicine (IOM) published The Computer-Based Patient Record: An Essential Technology for Health Care, the first document to comprehensively examine the possibilities inherent in electronic medical records (EMRs). The Institute defined 12 functions for the EMR that focused on the patient, not on technology for its own sake (see Table 1).

Table 1. Original IOM Attributes for the EMR

1.  Support a problem list
2.  Measure health status and functional levels
3.  Document clinical reasoning and rationale
4.  Provide dynamic links to other patient records
5.  Guarantee confidentiality, privacy, and audit trails
6.  Offer continuous access for authorized users
7.  Support simultaneous multiple user views
8.  Support timely access to local and remote information resources
9.  Facilitate clinical problem solving
10. Support direct data entry by users
11. Support practitioners in measuring costs and improving quality
12. Support the existing and evolving needs of clinical specialties


During the 1990s, cost remained the major barrier to widespread adoption of EMRs by physicians. Patient confidentiality became the mantra in health care information technology in America, culminating in the passage of the Health Insurance Portability and Accountability Act of 1996 (HIPAA). That law provided for the establishment of a National Committee on Vital and Health Statistics (NCVHS). This committee was responsible for advising the Department of Health and Human Services (DHHS) on issues related to confidentiality, security, patient and physician identifiers, and standards for computer-based patient records. The committee started to advocate for a national health information infrastructure that could assure the creation of a fully interconnected system of health care networks. By the late 1990s, many other countries, including Canada and Australia, already used EHRs as the hubs of their health care systems.

In the 1990s, too, the Internet captured the attention of the medical establishment and of the entire developed world. The Internet's key attribute is its ability to provide virtually universal access to information that is interactive and up to date. It allows users to personalize the ways that they access, use, and store vast amounts of information. Accordingly, the Internet makes possible, for the first time, a closer and more cooperative relationship among the doctors caring for a given patient, between patients and their doctors, between caregiving physicians and researchers, and among the other parties involved in health care.

As computer systems have improved, they have allowed researchers to make sweeping comparisons about the nature of medical care, the kinds of procedures being used, patient outcomes, and costs, all analyzed by ZIP code, hospital, or provider. Computers have also permitted the generation of unprecedented amounts of data about significant medical errors of omission and commission. These findings have raised serious doubts among patients, payers, government bureaucrats, and politicians about the ability of our health care system, without an EHR, to uniformly provide high-quality, efficient, and cost-effective care grounded in evidence-based medicine.

Oncologists are often surprised to learn that medical care in general, and cancer care in particular, has been criticized frequently for failing to deliver state-of-the-art care. This criticism has been leveled in several high-level reports: one from the President's Information Technology Advisory Committee (PITAC) in 2001, and several others issued by the Institute of Medicine. These reports fault the health care system for failing to use information technology effectively and to establish a regular monitoring system to track the quality of care.

In 2004, President Bush appointed David Brailer, MD, PhD, as National Health Information Technology Coordinator. His office will coordinate initiatives undertaken by the federal government and by the private sector in health care informatics to meet the goal of having an electronic health care record established for every American within 10 years.

Dr. Brailer has identified several key initiatives. One initiative is a certification requirement that will be a public-private effort to set minimal standards of functionality, security, and interoperability to help physicians make informed purchasing decisions. Certification standards would help ensure that an EHR acquired by a practice would meet minimal performance requirements. Another initiative concerns electronic prescriptions. The Centers for Medicare and Medicaid Services (CMS) is to work on standards for electronic prescriptions within the Medicare system. These standards are slated for publication during 2005. A third initiative involves the provision of seed money by the Health Resources and Services Administration for the implementation of community-based health information exchanges (see Dr. Brailer's plan at www.hhs.gov/onchit/framework, and the Journal's interview elsewhere in this issue).

Gathering Momentum for Information Standards

Consistent standards that govern the structure of messages and of medical terminology allow different computer systems to communicate. Messaging and terminology standards allow two different computers to exchange a transaction such as an electronic prescription order. Standards will also have to apply to the authentication of users, to system security, and to the permissible interfaces between software components.

As online file sharing evolves, new methods allow the exchange of material online without the use of a central file server. With certain precautions in place, it will be possible for a patient's most current information (residing on multiple computers linked by the Internet) to come into synchronization with a medical record in a given physician's office, giving that physician a full view of that patient's current medical data. The precautions will include appropriate adherence to HIPAA regulations, permission from the patient, the use of a controlled vocabulary and universal identifiers for patients and physicians.
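What such synchronization might look like can be sketched in a few lines. The following Python is a toy illustration under stated assumptions: a patient's entries live on several systems, each entry carries a timestamp, and the most recently recorded value for each item wins. All names and types here are invented for the example, not part of any real EHR product.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass(frozen=True)
class Entry:
    """One clinical data item, e.g., a lab result or a medication order."""
    patient_id: str        # universal patient identifier (assumed to exist)
    item: str              # controlled-vocabulary code for the data item
    value: str
    recorded_at: datetime  # when the value was recorded

def synchronize(*sources: list[Entry]) -> list[Entry]:
    """Merge entries held on several linked systems, keeping the most
    recently recorded value for each (patient, item) pair."""
    latest: dict[tuple[str, str], Entry] = {}
    for source in sources:
        for entry in source:
            key = (entry.patient_id, entry.item)
            if key not in latest or entry.recorded_at > latest[key].recorded_at:
                latest[key] = entry
    return sorted(latest.values(), key=lambda e: e.recorded_at)
```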

Data will be continually exchanged between sites and systems. For such exchanges to work, each patient, provider and site of service must possess a unique identifier obtained from a central medical database. That way, each application refers in exactly the same way to a given user, patient, encounter, test or event.
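A toy version of such a central identifier service, with all names invented for illustration, might look like the following in Python. The point is simply that every application asking about the same entity receives exactly the same identifier.

```python
import itertools

class IdentifierRegistry:
    """Toy stand-in for a central medical database that issues one
    permanent identifier per patient, provider, or site of service."""

    def __init__(self, prefix: str):
        self._prefix = prefix
        self._counter = itertools.count(1)
        self._by_key: dict[str, str] = {}

    def identifier_for(self, natural_key: str) -> str:
        """Return the existing identifier for this entity, minting one
        on first sight; every caller gets exactly the same answer."""
        if natural_key not in self._by_key:
            self._by_key[natural_key] = f"{self._prefix}-{next(self._counter):08d}"
        return self._by_key[natural_key]

patients = IdentifierRegistry("PT")
a = patients.identifier_for("doe, jane, 1970-01-01")
b = patients.identifier_for("doe, jane, 1970-01-01")
assert a == b  # same patient, same identifier, in every application
```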

A viable National Health Information Infrastructure (NHII) will rest upon a strictly controlled vocabulary, with precise definitions for terms to ensure uniformity. To provide this specialized vocabulary, the National Library of Medicine (NLM) licensed SNOMED-CT (Systematized Nomenclature of Medicine-Clinical Terms). SNOMED-CT is a machine-readable, clinically rich lexicon that uses a controlled vocabulary dataset to standardize clinical communications. The NLM and DHHS have recently made SNOMED-CT available free of charge to all physicians, and many EHR vendors make use of it in their applications.
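The mechanical benefit of a controlled vocabulary is that synonyms collapse to a single concept code. The Python sketch below is hypothetical: the mapping table and concept identifiers are invented placeholders, not actual SNOMED-CT content.

```python
# The concept codes below are invented placeholders for illustration,
# not actual SNOMED-CT identifiers.
LOCAL_TERM_TO_CONCEPT = {
    "heart attack": "C0001",
    "myocardial infarction": "C0001",  # synonyms resolve to one concept
    "mi": "C0001",
    "high blood pressure": "C0002",
    "hypertension": "C0002",
}

def normalize(term: str) -> str:
    """Map a locally entered term to its single concept identifier, so
    that every system records the same code for the same clinical idea."""
    concept = LOCAL_TERM_TO_CONCEPT.get(term.strip().lower())
    if concept is None:
        raise ValueError(f"term not in controlled vocabulary: {term!r}")
    return concept

assert normalize("Heart attack") == normalize("myocardial infarction")
```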

The set of technologies that underlie the Internet will most likely serve as a foundation for creating EHRs for patients, providers and payers to share. Adjustments will be required to ensure the maintenance of appropriate encryption, security, confidentiality and audit trails. The sending of clinical messages is being standardized around the HL7 version 3 standard and XML (Extensible Markup Language). These standards will allow exchanges like this one: An oncologist will be able to pull up a list of medications being taken by a certain patient and e-mail that list to a colleague; the colleague in turn will be able to paste the list into his or her hospital's own database record for this patient, and so on.
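That kind of exchange depends only on both sides agreeing on the message structure. The Python sketch below serializes and re-reads a medication list as XML; the element names are invented for illustration and are far simpler than the formally defined HL7 version 3 schemas.

```python
import xml.etree.ElementTree as ET

def medication_list_message(patient_id: str, meds: list[str]) -> str:
    """Serialize a medication list as XML. The element names here are
    invented for illustration; real HL7 version 3 messages follow a
    much richer, formally defined schema."""
    root = ET.Element("medicationList", patient=patient_id)
    for name in meds:
        ET.SubElement(root, "medication").text = name
    return ET.tostring(root, encoding="unicode")

def read_medication_list(xml_text: str) -> tuple[str, list[str]]:
    """The receiving system parses the same agreed-upon structure."""
    root = ET.fromstring(xml_text)
    return root.get("patient"), [m.text for m in root.findall("medication")]

message = medication_list_message("PT-00000001", ["tamoxifen", "ondansetron"])
print(read_medication_list(message))
```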

The cancer Biomedical Informatics Grid, or caBIG, is a voluntary virtual informatics infrastructure created by the National Cancer Institute (NCI) that connects data, research tools, scientists, and organizations to leverage their combined strengths and expertise in an open environment with common standards and shared tools. Effectively forming a World Wide Web of cancer research, caBIG seeks to link communities and information sources as a component of the NHII. It contains a clinical trials management system; provides tissue bank and pathology tools for integrating cancer research; and uses standardized vocabularies and common data elements within a unifying architecture that should ensure interoperability. Combined with a future oncology EHR, it would be an ideal tool for increasing clinical trial accrual and coordinating oncology translational research. Detailed information regarding caBIG can be obtained at the NCI Web site, http://caBIG.nci.nih.gov. The NCI has also updated the prescribed terminology and grading system for reporting adverse events. The NCI's Common Toxicity Criteria have become a standard for reporting adverse events and could easily be used as part of an oncology EHR dataset.
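As a hint of how the Common Toxicity Criteria could slot into an EHR dataset, here is a minimal Python sketch. The 1-to-5 severity scale (mild through death) follows the general CTC convention; the field names and example values are invented for illustration.

```python
from dataclasses import dataclass

# General CTC severity scale; the field names below are illustrative,
# not a published EHR schema.
GRADE_LABELS = {1: "mild", 2: "moderate", 3: "severe",
                4: "life-threatening", 5: "death"}

@dataclass
class AdverseEvent:
    patient_id: str
    term: str   # standardized adverse-event term, e.g., "nausea"
    grade: int  # 1-5 per the Common Toxicity Criteria

    def __post_init__(self):
        if self.grade not in GRADE_LABELS:
            raise ValueError(f"grade must be 1-5, got {self.grade}")

event = AdverseEvent("PT-00000001", "nausea", 2)
print(f"{event.term}: grade {event.grade} ({GRADE_LABELS[event.grade]})")
```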

In 2003, both the public and private sectors took major steps to ensure that EHRs become a regular feature of medical offices within five to eight years. Initially, the DHHS agreed to adopt standards for transmitting electronic medical information. In its Consolidated Health Informatics (CHI) initiative, the DHHS announced that it would urge Congress to help fund health care providers' information technology investments.

The Department also announced a list of five standards that it had selected and mandated for the sharing of medical information among federal agencies:

  1. Messaging standards from Health Level Seven Inc.

  2. Standards for retail pharmacy orders from the National Council for Prescription Drug Programs (NCPDP).

  3. A series of standards for medical devices from the Institute of Electrical and Electronics Engineers (IEEE 1073).

  4. Standards for images from Digital Imaging and Communications in Medicine (DICOM).

  5. Standards for the reporting of results from clinical laboratories, that is, the Logical Observation Identifiers Names and Codes (LOINC).
