How to implement a log management system

Implementing a company log management system is often very difficult, usually because of the lack of clear project objectives and the absence of the preparatory activities that form its foundation.

There are two main types of log management systems:

  • Information Management System
  • Event Management System

The goals set for Information Management Systems and Event Management Systems are different.

An Information Management System is an access control solution that enables the generation, collection, analysis and storage of log files in order to preserve their confidentiality, integrity and availability. A log file contains an identifier uniquely associated with an individual, the time reference, and the command or query performed.
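As a purely illustrative example (the field names and the query are invented, not a prescribed format), a single record of this kind could be assembled in Python as follows:

    import json
    from datetime import datetime, timezone

    # Hypothetical record: who, when, and what was executed.
    record = {
        "user_id": "jsmith",                                  # identifier uniquely associated with an individual
        "timestamp": datetime.now(timezone.utc).isoformat(),  # time reference (UTC)
        "command": "SELECT pan FROM payments WHERE id = 42",  # command or query performed
    }

    # One JSON object per line keeps the file easy to parse, hash and store.
    print(json.dumps(record))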

An Event Management System aims to find, within a huge amount of data, the information needed to detect security incidents, using as sources data from IDS/IPS and logs from network equipment or systems. It involves using data-mining queries with correlation mechanisms to detect the “patterns” associated with malware and/or possible attacks.
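To give a flavour of what such a correlation mechanism does, here is a deliberately simplified sketch (the events, window and threshold are made up for illustration): it groups failed-login events per source and flags a pattern when too many fall inside a time window.

    from collections import defaultdict

    # Hypothetical pre-parsed events: (epoch seconds, source IP, event type).
    events = [
        (1000, "10.0.0.5", "login_failed"),
        (1010, "10.0.0.5", "login_failed"),
        (1020, "10.0.0.5", "login_failed"),
        (1900, "10.0.0.9", "login_failed"),
    ]

    WINDOW = 300    # correlation window in seconds
    THRESHOLD = 3   # failures per source inside the window

    failures = defaultdict(list)
    for ts, src, kind in events:
        if kind == "login_failed":
            failures[src].append(ts)

    for src, times in failures.items():
        times.sort()
        for start in times:
            # Count the failures that fall inside the window starting at 'start'.
            if len([t for t in times if start <= t < start + WINDOW]) >= THRESHOLD:
                print(f"possible brute-force pattern from {src}")
                break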

From a purely technological point of view, the two aforementioned categories may have elements in common; nevertheless, it is best to address the two tasks as separate, distinct projects and then reuse common components, such as log collection, log parsing, etc.

In this post we will focus on the first type, highlighting all of the project stages required for its success.

A classic mistake is to think that a log management solution is just a matter of installing and configuring a software package, without keeping the real goals of the project in mind.

An Information Management System has two principal objectives:

  • Compliance with internal policies and/or external regulations (in our case PCI DSS, but it could also be SOX, GDPR, etc.)
  • The need to preserve digital evidence as protection against possible fraudulent activities involving customer personal data carried out by employees or third parties

The solution that is eventually implemented depends strongly on a clear definition of these objectives.

First phase: Account Survey and Cleaning

A very good starting point is a company that has already implemented an Identity and Access Management (IAM) solution; otherwise, it is necessary to plan a phase of user account survey and filtering. A prerequisite for a successful implementation of an IMS is that every user ID is related to a unique person. Even application accounts must be associated with an employee. Where privileged accounts exist, direct access to the systems with those accounts should, as far as possible, not be allowed, forcing the user to identify themselves with their personal username first.
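A minimal sketch of such a survey, assuming the account list and the registry of responsible persons can be exported into simple mappings (names and fields are invented for illustration):

    # Hypothetical exports: system accounts and the people responsible for them.
    accounts = {
        "jsmith":      {"type": "personal"},
        "app_billing": {"type": "application"},
        "root":        {"type": "privileged"},
    }
    owners = {"jsmith": "Jane Smith", "app_billing": "John Doe"}  # account -> responsible person

    for name, info in accounts.items():
        owner = owners.get(name)
        if owner is None:
            print(f"{name}: no responsible person - survey and clean up before go-live")
        elif info["type"] == "privileged":
            print(f"{name}: privileged - block direct access, require a personal login first")
        else:
            print(f"{name}: mapped to {owner}")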

Second phase: Log Production

When logs are not already present in a system, it is necessary to clearly define how they will be generated. The problem is not trivial, because in most cases this activity must be carried out on production systems that were not designed with log management in mind. The best solution is the one with the least impact on the system and on the business process that the system supports.

Without attempting a close examination of all possible log generators on computer systems, it is useful to mention at least the two basic operating system categories:

  • Windows
  • Unix/Linux

With Windows we are dealing with an operating system whose default auditing is poor: it saves information in binary form and has difficulty tracking activities carried out through remote access. The solution in this case is often the installation of commercial products that provide an agent and a console for producing reports on access to the system and to file system objects. This kind of solution is usually complemented by an appliance that records remote sessions on the target platform, even as a video file.

Unix/Linux presents far fewer problems. Here the solution can be the least invasive one, such as a customized shell that writes to a file every command typed inside an SSH session.
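A minimal sketch of that idea (a toy wrapper, not a hardened product; the log path is an assumption): it reads each command typed in the session, appends it to a file together with the user and a time reference, and then hands it to the real shell. In a real deployment it would be set as the user's login shell and the resulting file would be shipped off the host.

    import getpass
    import subprocess
    import sys
    from datetime import datetime, timezone

    LOG_FILE = "/var/log/cmd_audit.log"   # assumed path, must be writable by the wrapper

    user = getpass.getuser()
    with open(LOG_FILE, "a") as log:
        for line in sys.stdin:                        # one command per line from the SSH session
            command = line.strip()
            if not command:
                continue
            stamp = datetime.now(timezone.utc).isoformat()
            log.write(f"{stamp} {user} {command}\n")  # who, when, what
            log.flush()
            subprocess.run(command, shell=True)       # pass the command on to the real shell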

Third phase: Collection and Protection

For the collection of locally generated log files, there are two basic modes:

  • push mode: sends the log file from the server to the central repository once the session is closed
  • pull mode: the transfer of the log file is centrally controlled

Pull mode has the advantage of easier management, while push mode gives better assurance regarding the integrity of the file.
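A minimal push-mode sketch, assuming the central repository simply accepts files over SCP (host name and paths are placeholders): once the session is closed, the server itself sends the log file to the repository.

    import subprocess
    from pathlib import Path

    LOG_FILE = Path("/var/log/cmd_audit.log")           # locally generated log file
    REPO = "collector.example.com:/srv/logs/incoming/"  # placeholder central repository

    def push_log(path: Path) -> None:
        # Push mode: the source server initiates the transfer itself.
        subprocess.run(["scp", str(path), REPO], check=True)

    if LOG_FILE.exists():
        push_log(LOG_FILE)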

To ensure the integrity of the log files and to prevent them from being altered after creation, there are various solutions that follow the same principle: on every log file, once removed from the source system, one must perform:

  • the calculation of the file's hash
  • the application of a timestamp obtained from a centralized, synchronized time server (hours, minutes, seconds, day, month and year), which marks the received hash and, at the same time, produces a digital signature

At this point, the storage of the three elements is possible: the log file, its hash and the signed timestamp. The last two make it possible to verify the reliability of the stored log file. Of course, this alone is not enough to produce the log file as evidence in court, but it is more than sufficient for the purpose of PCI DSS compliance.
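A minimal sketch of this principle, using SHA-256 for the hash and an HMAC as a stand-in for the digital signature (a real deployment would use an asymmetric key or an RFC 3161 timestamping service, and the timestamp would come from the synchronized central time server rather than the local clock):

    import hashlib
    import hmac
    import json
    from datetime import datetime, timezone

    SIGNING_KEY = b"key-held-by-the-timestamping-component"  # placeholder secret

    def seal_log(path: str) -> dict:
        # 1. Hash of the log file, computed once it has left the source system.
        with open(path, "rb") as f:
            digest = hashlib.sha256(f.read()).hexdigest()

        # 2. Timestamp (hours, minutes, seconds, day, month, year).
        timestamp = datetime.now(timezone.utc).strftime("%Y-%m-%d %H:%M:%S")

        # 3. Signature over hash + timestamp (HMAC standing in for a real digital signature).
        signature = hmac.new(SIGNING_KEY, f"{digest}|{timestamp}".encode(), hashlib.sha256).hexdigest()

        # The three elements to store alongside the log file itself.
        return {"file": path, "sha256": digest, "timestamp": timestamp, "signature": signature}

    print(json.dumps(seal_log("/var/log/cmd_audit.log"), indent=2))

Verification later consists of recomputing the hash of the stored file and checking the signature over the hash and timestamp.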
