HIPAA Security Regulations: Security Standards for the Protection of Electronic PHI: Technical Safeguards - § 164.312

As Contained in the HHS HIPAA Security Rules

HHS Security Regulations as Amended January 2013
Security Standards for the Protection of Electronic PHI: Technical Safeguards - § 164.312

A covered entity or business associate must, in accordance with §164.306:

(a)(1) Standard: Access control. Implement technical policies and procedures for electronic information systems that maintain electronic protected health information to allow access only to those persons or software programs that have been granted access rights as specified in §164.308(a)(4).

(2) Implementation specifications:

(i) Unique user identification (Required). Assign a unique name and/or number for identifying and tracking user identity.

(ii) Emergency access procedure (Required). Establish (and implement as needed) procedures for obtaining necessary electronic protected health information during an emergency.

(iii) Automatic logoff (Addressable). Implement electronic procedures that terminate an electronic session after a predetermined time of inactivity.

(iv) Encryption and decryption (Addressable). Implement a mechanism to encrypt and decrypt electronic protected health information.

(b) Standard: Audit controls. Implement hardware, software, and/or procedural mechanisms that record and examine activity in information systems that contain or use electronic protected health information.

(c)(1) Standard: Integrity. Implement policies and procedures to protect electronic protected health information from improper alteration or destruction.

(2) Implementation specification: Mechanism to authenticate electronic protected health information (Addressable). Implement electronic mechanisms to corroborate that electronic protected health information has not been altered or destroyed in an unauthorized manner.

(d) Standard: Person or entity authentication. Implement procedures to verify that a person or entity seeking access to electronic protected health information is the one claimed.

(e)(1) Standard: Transmission security. Implement technical security measures to guard against unauthorized access to electronic protected health information that is being transmitted over an electronic communications network.

(2) Implementation specifications:

(i) Integrity controls (Addressable). Implement security measures to ensure that electronically transmitted electronic protected health information is not improperly modified without detection until disposed of.

(ii) Encryption (Addressable). Implement a mechanism to encrypt electronic protected health information whenever deemed appropriate.

HHS Description
Security Standards for the Protection of Electronic PHI: Technical Safeguards

We proposed five technical security services requirements with supporting implementation features: Access control; Audit controls; Authorization control; Data authentication; and Entity authentication. We also proposed specific technical security mechanisms for data transmitted over a communications network: Communications/network controls, with the following supporting implementation features: Integrity controls; Message authentication; Access controls; Encryption; Alarm; Audit trails; Entity authentication; and Event reporting.

In this final rule, we consolidate these provisions into § 164.312. That section now includes standards regarding access controls, audit controls, integrity (previously titled data authentication), person or entity authentication, and transmission security. As discussed below, while certain implementation specifications are required, many of the proposed security implementation features are now addressable implementation specifications. The function of authorization control has been incorporated into the information access management standard under § 164.308, Administrative safeguards.

Access Control (§ 164.312(a)(1))

In the proposed rule, we proposed to require that the access controls requirement include features for emergency access procedures and provisions for context-based, role-based, and/or user-based access; we also proposed the optional use of encryption as a means of providing access control. In this final rule, we require unique user identification and provision for emergency access procedures, and retain encryption as an addressable implementation specification. We also make "Automatic logoff" an addressable implementation specification. "Automatic logoff" and "Unique user identification" were formerly implementation features under the proposed "Entity authentication" (see § 164.312(d)).

Audit Controls (§ 164.312(b))

We proposed that audit control mechanisms be put in place to record and examine system activity. We adopt this requirement in this final rule.

Integrity (§ 164.312(c)(1))

We proposed under the "Data authentication" requirement, that each organization be required to corroborate that data in its possession have not been altered or destroyed in an unauthorized manner and provided examples of mechanisms that could be used to accomplish this task. We adopt the proposed requirement for data authentication in the final rule as an addressable implementation specification "Mechanism to authenticate data," under the "Integrity" standard.

Person or Entity Authentication (§ 164.312(d))

We proposed that an organization implement the requirement for "Entity authentication", the corroboration that an entity is who it claims to be. "Automatic logoff" and "Unique user identification" were specified as mandatory features, and were to be coupled with at least one of the following features: (1) a "biometric" identification system; (2) a "password" system; (3) a "personal identification number"; and (4) "telephone callback," or a "token" system that uses a physical device for user identification.

In this final rule, we provide a general requirement for person or entity authentication without the specifics of the proposed rule.

Transmission Security (§ 164.312(e)(1))

Under "Technical Security Mechanisms to Guard Against Unauthorized Access to Data that is Transmitted Over a Communications Network," we proposed that "Communications/network controls" be required to protect the security of health information when being transmitted electronically from one point to another over open networks, along with a combination of mandatory and optional implementation features. We proposed that some form of encryption must be employed on "open" networks such as the internet or dial-up lines.

In this final rule, we adopt integrity controls and encryption as addressable implementation specifications.

HHS Response to Comments Received
Security Standards for the Protection of Electronic PHI: Technical Safeguards

Access Control (§ 164.312(a)(1))

Comment: Some commenters believed that, by specifying "Context," "Role," and "User" based controls, we would effectively exclude the use of other controls, for example, "Partition rule-based access controls," as well as the development of new access control technology.

Response: We agree with the commenters that other types of access controls should be allowed. There was no intent to limit the implementation features to the named technologies and this final rule has been reworded to make it clear that use of any appropriate access control mechanism is allowed. Proposed implementation features titled "Context-based access," "Role-based access," and "User-based access" have been deleted and the access control standard at § 164.312(a)(1) states the general requirement.
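Purely as an illustration of the flexibility described above (not part of the rule or of HHS guidance), the following Python sketch shows one simple role-based access check; the roles, permissions, and user identifiers are invented for the example, and any other appropriate access control mechanism would equally satisfy the standard.

# Illustrative only: a simple role-based access check. Roles, permissions,
# and user identifiers below are invented for this example.
ROLE_PERMISSIONS = {
    "clinician": {"read_ephi", "write_ephi"},
    "billing": {"read_ephi"},
    "maintenance": set(),
}

USER_ROLES = {
    "clinician-42": "clinician",
    "clerk-7": "billing",
}


def is_allowed(user_id: str, permission: str) -> bool:
    """Return True if the user's assigned role grants the requested permission."""
    role = USER_ROLES.get(user_id)
    return permission in ROLE_PERMISSIONS.get(role, set())


# Example usage:
assert is_allowed("clinician-42", "write_ephi")
assert not is_allowed("clerk-7", "write_ephi")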

Comment: A large number of comments were received objecting to the identification of "Automatic logoff" as a mandatory implementation feature. Generally, the comments asked that we not be so specific, that we allow other forms of inactivity lockout, and that this type of feature be made optional, based more on the particular configuration in use and a risk assessment/analysis.

Response: We agree with the comments that mandating an automatic logoff is too specific. This final rule has been written to clarify that the proposed implementation feature of automatic logoff now appears as an addressable access control implementation specification and also permits the use of an equivalent measure.
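As an illustrative sketch only, the following Python example shows one way an inactivity-based automatic logoff could be implemented; the timeout value, class, and function names are assumptions chosen for the example, and an actual timeout (or equivalent measure) should follow the entity's risk analysis.

import time

# Hypothetical inactivity threshold; a real value should come from the
# entity's own risk analysis.
INACTIVITY_TIMEOUT_SECONDS = 15 * 60


class Session:
    """Minimal session record that tracks the time of the last user action."""

    def __init__(self, user_id: str):
        self.user_id = user_id
        self.last_activity = time.monotonic()

    def touch(self) -> None:
        """Record user activity, resetting the inactivity clock."""
        self.last_activity = time.monotonic()

    def is_expired(self) -> bool:
        """Return True once the predetermined period of inactivity has elapsed."""
        return time.monotonic() - self.last_activity > INACTIVITY_TIMEOUT_SECONDS


def terminate_if_idle(session: Session) -> bool:
    """Log the session off if it has been idle too long."""
    if session.is_expired():
        # A real system would also invalidate tokens, close the user interface, etc.
        print(f"Session for {session.user_id} terminated after inactivity.")
        return True
    return False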

Comment: We received comments asking that encryption be deleted as an implementation feature and stating that encryption is not required for "data at rest."

Response: The use of file encryption is an acceptable method of denying access to information in that file. Encryption provides confidentiality, which is a form of control. The use of encryption, for the purpose of access control of data at rest, should be based upon an entity's risk analysis. Therefore, encryption has been adopted as an addressable implementation specification in this final rule.
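The following Python sketch illustrates one possible mechanism for encrypting a file at rest, using the third-party cryptography package's Fernet interface; it is an assumption-laden example rather than a required or endorsed approach, and the file names and key handling are placeholders.

# Requires the third-party "cryptography" package (pip install cryptography).
from cryptography.fernet import Fernet


def encrypt_file(key: bytes, plaintext_path: str, ciphertext_path: str) -> None:
    """Encrypt the contents of plaintext_path into ciphertext_path."""
    fernet = Fernet(key)
    with open(plaintext_path, "rb") as f:
        token = fernet.encrypt(f.read())
    with open(ciphertext_path, "wb") as f:
        f.write(token)


def decrypt_file(key: bytes, ciphertext_path: str) -> bytes:
    """Return the decrypted contents of ciphertext_path."""
    fernet = Fernet(key)
    with open(ciphertext_path, "rb") as f:
        return fernet.decrypt(f.read())


if __name__ == "__main__":
    key = Fernet.generate_key()  # In practice, generate once and protect the key separately.
    with open("record.txt", "wb") as f:
        f.write(b"sample record contents")
    encrypt_file(key, "record.txt", "record.txt.enc")
    print(decrypt_file(key, "record.txt.enc"))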

Comment: We received one comment stating that the proposed implementation feature "Procedure for emergency access," is not access control and recommending that emergency access be made a separate requirement.

Response: We believe that emergency access is a necessary part of access controls and, therefore, is properly a required implementation specification of the "Access controls" standard. Access controls will still be necessary under emergency conditions, although they may be very different from those used in normal operational circumstances. For example, in a situation when normal environmental systems, including electrical power, have been severely damaged or rendered inoperative due to a natural or man-made disaster, procedures should be established beforehand to provide guidance on possible ways to gain access to needed electronic protected health information.

Audit Controls (§ 164.312(b))

Comment: We received a comment stating that "Audit controls" should be an implementation feature rather than the standard, and suggesting that we change the title of the standard to "Accountability," and provide additional detail to the audit control implementation feature.

Response: We do not adopt the term "Accountability" in this final rule because it is not descriptive of the requirement, which is to have the capability to record and examine system activity. We believe that it is appropriate to specify audit controls as a type of technical safeguard. Entities have flexibility to implement the standard in a manner appropriate to their needs, as deemed necessary by their own risk analyses. For example, see NIST Special Publication 800-14, Generally Accepted Principles and Practices for Securing Information Technology Systems, and NIST Special Publication 800-33, Underlying Technical Models for Information Technology Security.

Comment: One commenter recommended that this final rule state that audit control mechanisms should be implemented based on the findings of an entity's risk assessment and risk analysis. The commenter asserted that audit control mechanisms should be utilized only when appropriate and necessary and should not adversely affect system performance.

Response: We support the use of a risk assessment and risk analysis to determine how intensive any audit control function should be. We believe that the audit control requirement should remain mandatory, however, since it provides a means to assess activities regarding the electronic protected health information in an entity's care.
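As a minimal illustration only (not a statement of what the standard requires in any particular system), the following Python sketch records user activity in an application-level audit log using the standard logging module; the event fields and log file name are invented for the example, and the intensity of real audit controls should reflect the entity's risk analysis.

import logging

# Illustrative audit logger; field names and the log file name are invented.
audit_logger = logging.getLogger("ephi.audit")
handler = logging.FileHandler("ephi_audit.log")
handler.setFormatter(logging.Formatter("%(asctime)s %(levelname)s %(message)s"))
audit_logger.addHandler(handler)
audit_logger.setLevel(logging.INFO)


def record_access(user_id: str, record_id: str, action: str) -> None:
    """Record who did what to which record so that activity can later be examined."""
    audit_logger.info("user=%s action=%s record=%s", user_id, action, record_id)


# Example usage:
record_access("clinician-42", "patient-1001", "VIEW")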

Comment: One commenter was concerned about the interplay of State and Federal requirements for auditing of privacy data and requested additional guidance on the interplay of privacy rights, laws, and the expectation for audits under the rule.

Response: In general, the security standards will supersede any contrary provision of State law; section 1178(a)(2) of the statute provides several exceptions to this general rule. Security standards in this final rule establish a minimum level of security that covered entities must meet. We note that covered entities may be required by other Federal law to adhere to additional, or more stringent, security measures. With regard to protected health information, the preemption of State laws and the relationship of the Privacy Rule to other Federal laws are discussed in the Privacy Rule beginning at 65 FR 82480; the preemption provisions of the rule are set out at 45 CFR part 160, subpart B.

It should be noted that although the Privacy Rule does not incorporate a requirement for an "audit trail" function, it does call for providing an accounting of certain disclosures of protected health information to an individual upon request. There has been a tendency to assume that this Privacy Rule requirement would be satisfied via some sort of process involving audit trails. We caution against assuming that the Security Rule's requirement for an audit capability will satisfy the Privacy Rule's requirement regarding accounting for disclosures of protected health information. The two rules cover overlapping, but not identical, information. Further, audit trails are typically used to record uses within an electronic information system, while the Privacy Rule requirement for accounting applies to certain disclosures outside of the covered entity (for example, to public health authorities).

Integrity (§ 164.312(c)(1))

Comment: We received a large number of comments requesting clarification of the "Data authentication" requirement. Many of these comments suggested that the requirement be called "Data integrity" instead of "Data authentication." Others asked for guidance regarding just what "data" must be authenticated. A significant number of commenters indicated that this requirement would put an extraordinary burden on large segments of the health care industry, particularly when legacy systems are in use. Requests were received to make this an "optional" requirement, based on an entity's risk assessment and analysis.

Response: We adopt the suggested "integrity" terminology because it more clearly describes the intent of the standard. We retain the meaning of the term "Data authentication" under the addressable implementation specification "Mechanism to authenticate data," and provide an example of a potential means to achieve data integrity. Error-correcting memory and magnetic disc storage are examples of the built-in data authentication mechanisms that are ubiquitous in hardware and operating systems today. The risk analysis process will address what data must be authenticated and should provide answers appropriate to the different situations faced by the various health care entities implementing this regulation. Further, we believe that this standard will not prove difficult to implement, since there are numerous techniques available, such as processes that employ digital signature or check sum technology to accomplish the task.
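As one hedged illustration of checksum-style corroboration, the following Python sketch computes and verifies a keyed hash (HMAC-SHA-256) over a record using only the standard library; the key and payload are placeholders, and real key management is outside the scope of the example.

import hashlib
import hmac


def compute_tag(key: bytes, data: bytes) -> bytes:
    """Compute a keyed integrity tag (HMAC-SHA-256) over the data."""
    return hmac.new(key, data, hashlib.sha256).digest()


def verify_tag(key: bytes, data: bytes, tag: bytes) -> bool:
    """Return True only if the data still matches the previously computed tag."""
    return hmac.compare_digest(compute_tag(key, data), tag)


# Example usage with placeholder key and payload:
key = b"example-integrity-key"
record = b"example electronic record payload"
tag = compute_tag(key, record)

assert verify_tag(key, record, tag)              # unmodified data verifies
assert not verify_tag(key, record + b"x", tag)   # any alteration is detected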

Comment: We received numerous comments suggesting that "Double keying" be deleted as a viable "Data authentication" mechanism, since this practice was generally associated with the use of punched cards.

Response: We agree that the process of "Double keying" is outdated. This final rule omits any reference to "Double keying."

Person or Entity Authentication (§ 164.312(d))

Comment: We received comments from a number of organizations requesting that the implementation features for entity authentication be either deleted in their entirety or at least be made optional. On the other hand, comments were received requesting that the use of digital signatures and soft tokens be added to the list of implementation features.

Response: We agree with the commenters that many different mechanisms may be used to authenticate entities, and this final rule now reflects this fact by not incorporating a list of implementation specifications, in order to allow covered entities to use whatever is reasonable and appropriate. "Digital signatures" and "soft tokens" may be used, as well as many other mechanisms, to implement this standard.

The proposed mandatory implementation feature, "Unique user identification," has been moved from this standard and is now a required implementation specification under "Access control" at § 164.312(a)(1). "Automatic logoff" has also been moved from this standard to the "Access control" standard and is now an addressable implementation specification.
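As an illustrative sketch of one of the many acceptable authentication mechanisms discussed above, the following Python example verifies a password against a salted, iterated hash using the standard library's hashlib.pbkdf2_hmac; the iteration count and storage details are assumptions for the example only, not a recommended configuration.

import hashlib
import hmac
import os

ITERATIONS = 600_000  # illustrative work factor, not a recommended setting


def register(password: str) -> tuple[bytes, bytes]:
    """Return (salt, digest) to store in place of the plaintext password."""
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return salt, digest


def authenticate(password: str, salt: bytes, stored_digest: bytes) -> bool:
    """Verify that the person presenting the password is the one claimed."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return hmac.compare_digest(candidate, stored_digest)


# Example usage:
salt, digest = register("correct horse battery staple")
assert authenticate("correct horse battery staple", salt, digest)
assert not authenticate("wrong password", salt, digest)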

Transmission Security (§ 164.312(e)(1))

Comment: We received a number of comments asking for overall clarification as well as a definition of terms used in this section. A definition for the term "open networks" was the most requested action, but there was a general expression of dislike for the manner in which we approached this section, with some comments suggesting that the entire section be rewritten. A significant number of comments were received on the question of encryption requirements when dial-up lines were to be employed as a means of connectivity. The overwhelming majority strongly urged that encryption not be mandatory when using any transmission media other than the Internet, but rather be considered optional based on individual entity risk assessment/analysis. Many comments noted that there are very few known breaches of security over dial-up lines and that nonjudicious use of encryption can adversely affect processing times and become both financially and technically burdensome. Only one commenter suggested that "most" external traffic should be encrypted.

Response: In general, we agree with the commenters who asked for clarification and revision. This final rule has been significantly revised to reflect a much simpler and more direct requirement. The term "Communications/network controls" has been replaced with "Transmission security" to better reflect the requirement that, when electronic protected health information is transmitted from one point to another, it must be protected in a manner commensurate with the associated risk.

We agree with the commenters that switched, point-to-point connections, for example, dial-up lines, have a very small probability of interception.

Thus, we agree that encryption should not be a mandatory requirement for transmission over dial-up lines.

We also agree with commenters who mentioned the financial and technical burdens associated with the employment of encryption tools. Particularly when considering situations faced by small and rural providers, it became clear that there is not yet available a simple and interoperable solution to encrypting e-mail communications with patients. As a result, we decided to make the use of encryption in the transmission process an addressable implementation specification. Covered entities are encouraged, however, to consider use of encryption technology for transmitting electronic protected health information, particularly over the internet.

As business practices and technology change, there may arise situations where electronic protected health information being transmitted from a covered entity would be at significant risk of being accessed by unauthorized entities. Where risk analysis showed such risk to be significant, we would expect covered entities to encrypt those transmissions, if appropriate, under the addressable implementation specification for encryption.

We do not use the term "open network" in this final rule because its meaning is too broad. We include as an addressable implementation specification the requirement that transmissions be encrypted when appropriate based on the entity's risk analysis.
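As a minimal, non-authoritative sketch of encrypting a transmission, the following Python example opens a TLS-protected connection using the standard library's ssl module; the host name is a placeholder, and certificate and protocol policy should follow the entity's own risk analysis.

import socket
import ssl

HOST = "example.org"  # placeholder endpoint, not a real ePHI recipient

# The default context verifies certificates and host names.
context = ssl.create_default_context()

with socket.create_connection((HOST, 443)) as raw_sock:
    with context.wrap_socket(raw_sock, server_hostname=HOST) as tls_sock:
        print("Negotiated protocol:", tls_sock.version())
        request = b"HEAD / HTTP/1.1\r\nHost: " + HOST.encode() + b"\r\nConnection: close\r\n\r\n"
        tls_sock.sendall(request)
        print(tls_sock.recv(256))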

Comment: We received comments requesting that the implementation features be deleted or made optional. Three commenters asked that the requirement for an alarm be deleted.

Response: This final rule has been revised to reflect deletion of the following implementation features: (1) the alarm capability; (2) audit trail; (3) entity authentication; and (4) event reporting. These features were associated with a proposed requirement for "Communications/network controls" and have been deleted since they are normally incorporated by telecommunications providers as part of network management and control functions that are included with the provision of network services. A health care entity would not expect to be responsible for these technical telecommunications features. "Access controls" has also been deleted from the implementation features since the consideration of the use of encryption will satisfy the intent of this feature. We retain two features as addressable implementation specifications: (1) "Integrity controls" and (2) "Encryption." "Message authentication" has been deleted as an implementation feature because the use of data authentication codes (called for in the "integrity controls" implementation specification) satisfies the intent of "Message authentication."

Comment: A number of comments were received asking that this final rule establish a specific (or at least a minimum) cryptographic algorithm strength. Others recommended that the rule not specify an encryption strength since technology is changing so rapidly. Several commenters requested guidelines and minimum encryption standards for the Internet. Another stated that, since an example was included (small or rural providers for example), the government should feel free to name a specific encryption package. One commenter stated that the requirement for encryption on the Internet should reference the "CMS Internet Security Policy."

Response: We remain committed to the principle of technology neutrality and agree with the comment that rapidly changing technology makes it impractical and inappropriate to name a specific technology. Consistent with this principle, specification of an algorithm strength or specific products would be inappropriate. Moreover, rapid advances in the success of "brute force" cryptanalysis techniques suggest that any minimum specification would soon be outmoded. We maintain that it is much more appropriate for this final rule to state a general requirement for encryption protection when necessary and depend on covered entities to specify technical details, such as algorithm types and strength. Because "CMS Internet Security Policy" is the policy of a single organization and applies only to information sent to CMS, and not between all covered entities, we have not referred to it here.

Comment: The proposed definition of "Integrity controls" generated comments that asked that the word "validity" be changed to "Integrity." Commenters were concerned about the ability of an entity to ensure that information was "valid."

Response: We agree with the commenters about the meaning of the word "validity" in the context of the proposed definition of "Integrity controls." We have named "Integrity controls" as an implementation specification in this final rule to require mechanisms to ensure that electronically transmitted information is not improperly modified without detection (see § 164.312(e)(2)(i)).

Comment: Three commenters asked for clarification and guidance regarding the unsolicited electronic receipt of health information in an unsecured manner, for example, when the information was submitted by a patient via e-mail over the Internet. Commenters asked for guidance as to what was their obligation to protect data received in this manner.

Response: The manner in which electronic protected health information is received by a covered entity does not affect the requirement that the covered entity subsequently afford security protection to that information once it is in the covered entity's possession.
