Logger for pyDKB #91

@mgolosova

Description

We need to add some kind of common logger for the Dataflow stages and the library itself, to avoid manual formatting, writing to sys.stderr, etc.
The logger should:

  • be aware of different log levels (fatal, error, warning, info, trace);
  • accept one message or list of messages;
  • coordinate formatting of message lines with Kafka ExternalProcessLogger [1];
  • never forget the newline at the end of a message (as we often do when writing them manually);
  • output messages to STDERR;
  • add a timestamp to the messages [2];
  • add info about the program that produced the message [3].

There are, of course, existing loggers for Python that can be used (say, logging). They should be studied to see if there's a standard solution for what we need (I think there must be, as we're not looking for anything extraordinary), and if one is found -- introduced into the common workflow. If -- all of a sudden -- nothing appropriate is found, implement a custom logger.
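
A minimal sketch, assuming the stdlib logging module turns out to be sufficient; the logger name pyDKB.dataflow.stage and the format string are only illustrative, not an agreed convention:

```python
# Sketch only: the stdlib `logging` module covers most requirements above
# (log levels, STDERR output, automatic newline, timestamp, program info).
import logging
import sys

def get_logger(name):
    """Return a logger that writes to STDERR with level, time and origin info."""
    logger = logging.getLogger(name)
    if not logger.handlers:
        handler = logging.StreamHandler(sys.stderr)          # messages go to STDERR
        handler.setFormatter(logging.Formatter(
            "(%(levelname)s) %(asctime)s (%(name)s) %(message)s"))
        logger.addHandler(handler)
        logger.setLevel(logging.DEBUG)
    return logger

log = get_logger("pyDKB.dataflow.stage")      # illustrative name
log.warning("Suspicious input record.")        # newline appended automatically
log.error("Failed to process record.")
```

Accepting a list of messages and the "(==)" continuation marker from [1] are not covered by the stdlib defaults and would still need a thin wrapper or a custom Formatter.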


[1] The suggestion is to keep the already existing agreement about starting the message with something like (ERROR) (or any other log level marker), but also teach the Java part to treat (==) as "same level as above" for multiline messages. Having a lot of (ERROR)s in the log looks ugly. A sketch of such a formatter is given below.
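
A possible sketch of such a formatter (names are illustrative, and the exact "(==)" convention would still have to be agreed with the Java side):

```python
import logging
import sys

class MultilineFormatter(logging.Formatter):
    """Prefix continuation lines of a multi-line message with "(==)"."""

    def format(self, record):
        text = logging.Formatter.format(self, record)
        first, _, rest = text.partition("\n")
        if not rest:
            return first
        # Mark every continuation line as "same level as above".
        cont = "\n".join("(==) " + line for line in rest.split("\n"))
        return first + "\n" + cont

handler = logging.StreamHandler(sys.stderr)
handler.setFormatter(MultilineFormatter("(%(levelname)s) %(message)s"))
log = logging.getLogger("pyDKB.example")       # illustrative name
log.addHandler(handler)
log.setLevel(logging.DEBUG)
log.error("failed to parse record:\nline 1 of the record\nline 2 of the record")
# Expected output:
# (ERROR) failed to parse record:
# (==) line 1 of the record
# (==) line 2 of the record
```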

[2] Then we must teach the Java part to remove this timestamp, as one will be added by the Java logger as well.

[3] Then the Java part must not add the information about the external command from whose STDERR the message was taken.
