As a framework user, I want to be able to collect structured logs with Kaa and deliver them to the Kaa server for further analysis.
- The log data structure must conform to the log data schema.
- The data schema must support versioning within the scope of a Kaa application.
- The endpoint (EP) SDK must expose APIs that accept log entry objects for transfer to the Kaa server.
- Log entry objects must be auto-generated by the framework based on the selected log data schema version.
- The EP SDK must implement a transient log storage mechanism that makes it possible to delay and batch log transfers. Log storage must be abstracted from the specific persistence mechanism: the client application implements the required storage interface.
- The EP SDK must have built-in parameters that control:
  - the maximum allowed volume of batched logs: the EP must start cleaning up stored logs once this volume is exceeded;
  - the log batch size: the volume of logs transferred in a single upload;
  - the log volume threshold: once the stored log volume reaches this threshold, the EP library calls back into the client code to notify it.
- The Kaa server must enforce log security by making logs accessible only to the tenant they were collected for.
- The server must abstract the specific log appender implementation, making it possible to implement custom log persistence (in a database, in files, etc.).
- The reference MongoDB persistence implementation must store log entries as queryable documents.
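The SDK-side requirements above (an API that accepts schema-generated log entry objects) could be sketched roughly as follows. This is an illustrative sketch only, not the actual Kaa API: `LogCollector`, `addLogRecord`, and `TemperatureLog` are hypothetical names, and in the real SDK the entry class would be auto-generated from the selected log data schema version.

```java
import java.util.ArrayList;
import java.util.List;

// Stand-in for a class the framework would auto-generate from the
// selected log data schema version.
class TemperatureLog {
    final long timestamp;
    final double temperature;

    TemperatureLog(long timestamp, double temperature) {
        this.timestamp = timestamp;
        this.temperature = temperature;
    }
}

// Hypothetical SDK entry point that accepts log entry objects for
// later batched transfer to the Kaa server.
class LogCollector {
    private final List<TemperatureLog> pending = new ArrayList<>();

    // Accepts one log entry for (eventually batched) transfer.
    void addLogRecord(TemperatureLog record) {
        pending.add(record);
    }

    // Number of entries awaiting upload.
    int pendingCount() {
        return pending.size();
    }
}
```

The client code never serializes entries by hand; it constructs typed objects and hands them to the collector, which is what makes schema versioning enforceable at compile time.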
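The storage abstraction and the three built-in control parameters could fit together along these lines. All names (`LogStorage`, `LogUploadController`, etc.) are hypothetical, and volumes are counted in bytes for simplicity; the point is only to show how a client-supplied storage interface, a cleanup limit, a batch size, and a notification threshold interact.

```java
import java.util.ArrayDeque;
import java.util.ArrayList;
import java.util.Deque;
import java.util.List;

// Storage interface the client application fulfills; the SDK stays
// agnostic of the actual persistence mechanism (memory, file, etc.).
interface LogStorage {
    void persist(byte[] entry);
    byte[] pollOldest();       // removes and returns the oldest entry
    long totalVolume();        // bytes currently stored
}

// Trivial in-memory fulfillment of the interface.
class InMemoryLogStorage implements LogStorage {
    private final Deque<byte[]> entries = new ArrayDeque<>();
    private long volume;

    public void persist(byte[] entry) {
        entries.addLast(entry);
        volume += entry.length;
    }

    public byte[] pollOldest() {
        byte[] oldest = entries.pollFirst();
        if (oldest != null) volume -= oldest.length;
        return oldest;
    }

    public long totalVolume() {
        return volume;
    }
}

// Wires the three built-in control parameters to the storage.
class LogUploadController {
    private final LogStorage storage;
    private final long maxVolume;         // cleanup starts above this
    private final long batchVolume;       // volume of one upload batch
    private final long notifyThreshold;   // client callback trigger
    private final Runnable thresholdCallback;

    LogUploadController(LogStorage storage, long maxVolume, long batchVolume,
                        long notifyThreshold, Runnable thresholdCallback) {
        this.storage = storage;
        this.maxVolume = maxVolume;
        this.batchVolume = batchVolume;
        this.notifyThreshold = notifyThreshold;
        this.thresholdCallback = thresholdCallback;
    }

    void add(byte[] entry) {
        storage.persist(entry);
        if (storage.totalVolume() >= notifyThreshold) {
            thresholdCallback.run();      // notify the client code
        }
        while (storage.totalVolume() > maxVolume) {
            storage.pollOldest();         // clean up oldest logs first
        }
    }

    // Drains up to one batch worth of logs for upload.
    List<byte[]> takeBatch() {
        List<byte[]> batch = new ArrayList<>();
        long taken = 0;
        while (taken < batchVolume) {
            byte[] entry = storage.pollOldest();
            if (entry == null) break;
            batch.add(entry);
            taken += entry.length;
        }
        return batch;
    }
}
```

Dropping the oldest entries first on overflow is one possible cleanup policy; the requirement only says cleanup must start once the maximum volume is exceeded.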
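On the server side, the log appender abstraction might look like this minimal sketch. The names are illustrative, not the actual Kaa classes, and the in-memory implementation merely stands in for a real appender: a MongoDB appender would insert each entry as a queryable document, a file appender would write lines to disk, and so on.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Map;

// Abstract appender: the server delivers decoded log entries here,
// and each implementation decides how to persist them.
abstract class LogAppender {
    abstract void append(String applicationToken,
                         List<Map<String, Object>> entries);
}

// Minimal in-memory implementation, standing in for a database or
// file-based appender.
class InMemoryLogAppender extends LogAppender {
    final List<Map<String, Object>> stored = new ArrayList<>();

    void append(String applicationToken, List<Map<String, Object>> entries) {
        stored.addAll(entries);
    }
}
```

Because the server only ever talks to the abstract type, swapping persistence backends is a matter of registering a different `LogAppender` subclass.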