Check for missing log data by either:
- Reviewing recent log performance statistics.
- Looking for gaps in the sequence numbers reported by each Real-Time Log Delivery software agent.
If our service is unable to deliver log data, then we will store it for up to 3 days and deliver it when communication resumes. If we cannot deliver log data within 3 days, then it will be permanently deleted.
Log Performance Statistics
The Log Delivery Performance page indicates the total number of log delivery attempts. It also provides a donut graph of successful and failed log delivery attempts for up to the last 30 days.
RTLD automatically retries log delivery after a failed attempt. As a result, failed log delivery attempts do not necessarily mean that an RTLD software agent was ultimately unable to deliver the corresponding log data. Find out whether there are missing log files by manually checking for gaps in the sequence numbers reported by each RTLD software agent.
To view log delivery statistics
- Navigate to the Realtime Log Delivery page.
  - From the Edgio Console, select the desired private space or organization.
  - Select the desired property.
  - From the left-hand pane, select the desired environment from under the Environments section.
  - From the left-hand pane, select Realtime Log Delivery.
- Find the desired profile and then click on its (analytics) icon.
- Choose the time period for which log performance statistics will be reported from the upper-right hand corner of the page.
Checking for Sequence Number Gaps
Use the following information when assessing whether there is a gap in the sequence numbers reported by each Real-Time Log Delivery software agent; a short file name parsing sketch follows this list.
- A software agent’s unique ID is reported within the:
  - Log file name (AgentID) - AWS S3, Azure Blob Storage, and Google Cloud Storage only
  - JSON payload (agent-id)
- A software agent’s sequence number is reported within the:
  - Log file name (SequenceNumber) - AWS S3, Azure Blob Storage, and Google Cloud Storage only
  - JSON payload (sequence-number)
- The sequence number reported for each software agent starts at 0.
- This sequence number resets to 0 at the start of a new day (UTC). The date on which log data was generated is reported within the:
  - Log file name (DateStamp) - AWS S3, Azure Blob Storage, and Google Cloud Storage only
  - JSON payload (date-stamp)
- If a software agent stops running, then it will be assigned a new unique ID.
If log data uses the CSV, JSON Array, or JSON Lines log format, then it will not contain information that uniquely identifies a set of log data. If log data using one of these formats is delivered to a destination other than AWS S3, Azure Blob Storage, or Google Cloud Storage, then there is no way to check for gaps in sequence numbers when attempting to identify missing log data.
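If your log data is delivered to AWS S3, Azure Blob Storage, or Google Cloud Storage, you can extract these components from each log file name programmatically. The following is a minimal Python sketch, not an official tool: the segment labels and their positions (prefix, account, profile, DateStamp, AgentID, and SequenceNumber) are assumptions inferred from the sample file names in the Log File Example section below, and the helper name parse_log_file_name is illustrative. Verify the pattern against the file naming convention documented for your RTLD profile.

```python
import re
from typing import NamedTuple

# Hypothetical pattern: the segment labels and their order are assumptions
# based on the sample file names shown in the Log File Example section.
LOG_FILE_NAME = re.compile(
    r"^(?P<prefix>[^_]+)_(?P<account>[^_]+)_(?P<profile>[^_]+)_"
    r"(?P<datestamp>\d+)_(?P<agent_id>\d+)_(?P<sequence>\d+)\.json\.gz$"
)

class LogFileName(NamedTuple):
    prefix: str
    account: str
    profile: str
    datestamp: str   # date (UTC) on which the log data was generated
    agent_id: str    # unique ID of the RTLD software agent
    sequence: int    # sequence number; starts at 0 and resets daily

def parse_log_file_name(name: str) -> LogFileName:
    """Split an RTLD log file name into its labeled components."""
    match = LOG_FILE_NAME.match(name)
    if match is None:
        raise ValueError(f"Unrecognized log file name: {name}")
    return LogFileName(
        match["prefix"],
        match["account"],
        match["profile"],
        match["datestamp"],
        match["agent_id"],
        int(match["sequence"]),
    )

print(parse_log_file_name("wpc_0001_123_0114_0000000000000123_0.json.gz"))
# LogFileName(prefix='wpc', account='0001', profile='123', datestamp='0114',
#             agent_id='0000000000000123', sequence=0)
```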
Log File Example
Let’s assume that your AWS S3 bucket, Azure Blob container, or Google Cloud Storage bucket contains the following log files:
wpc_0001_123_0114_0000000000000123_0.json.gz
wpc_0001_123_0114_0000000000000123_1.json.gz
wpc_0001_123_0114_0000000000000123_3.json.gz
In this situation, we can tell that there is missing log data. Specifically, the log entries associated with the following log file are missing:
wpc_0001_123_0114_0000000000000123_2.json.gz
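To automate this check, group the sequence numbers you have received by DateStamp and agent ID and look for holes in the 0-based sequence. The sketch below is a minimal example that hardcodes the three sample file names; in practice, you would replace that list with the object names retrieved from your AWS S3 bucket, Azure Blob container, or Google Cloud Storage bucket. It also assumes that the last three underscore-delimited segments of each file name are DateStamp, AgentID, and SequenceNumber.

```python
from collections import defaultdict

# Sample file names from the example above. Replace this list with the object
# names listed from your storage destination.
file_names = [
    "wpc_0001_123_0114_0000000000000123_0.json.gz",
    "wpc_0001_123_0114_0000000000000123_1.json.gz",
    "wpc_0001_123_0114_0000000000000123_3.json.gz",
]

# Group sequence numbers by (DateStamp, AgentID), since each software agent's
# sequence number starts at 0 and resets at the start of a new day (UTC).
sequences = defaultdict(set)
for name in file_names:
    datestamp, agent_id, sequence = name.removesuffix(".json.gz").rsplit("_", 3)[-3:]
    sequences[(datestamp, agent_id)].add(int(sequence))

# Report any holes in the 0-based sequence for each (DateStamp, AgentID) pair.
for (datestamp, agent_id), seen in sorted(sequences.items()):
    missing = sorted(set(range(max(seen) + 1)) - seen)
    if missing:
        print(f"DateStamp {datestamp}, agent {agent_id}: missing sequence number(s) {missing}")
    else:
        print(f"DateStamp {datestamp}, agent {agent_id}: no gaps detected")
```

For the sample file names above, this prints a missing sequence number of 2, which corresponds to the missing log file identified in this example.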