logging module
What is the logging module?
The logging module is the module Python provides for recording logs.
Why is logging needed?
We could open a file ourselves and write log entries into it, but those operations are repetitive and add no real value, so Python has packaged them for us. With logging, we only need to call a simple interface to record a log, which is very convenient!
Log levels
Before you start logging, you need to be clear about log levels.
As time passes, log records accumulate into thousands of lines, and quickly finding the record you need becomes a problem.
The solution is to assign levels to logs.
The logging module divides logs into five levels, from low to high:
1. debug: debugging information
2. info: routine information
3. warning: warning information
4. error: error information
5. critical: serious errors
Internally, the levels are represented by numbers, from low to high: 10, 20, 30, 40, 50, respectively.
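These levels are also exposed as named constants on the module itself; a quick check (nothing here beyond the standard library) confirms the low-to-high numeric ordering:

```python
import logging

# Each level is exposed as a module-level constant; printing the mapping
# confirms the low-to-high numeric ordering described above.
levels = {
    "DEBUG": logging.DEBUG,        # 10
    "INFO": logging.INFO,          # 20
    "WARNING": logging.WARNING,    # 30
    "ERROR": logging.ERROR,        # 40
    "CRITICAL": logging.CRITICAL,  # 50
}
print(levels)
```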
The Use of the logging Module
# 1. Import the module
import logging

# 2. Output logs
logging.info("info")
logging.debug("debug")
logging.warning("warning")
logging.error("error")
logging.critical("critical")

# Output:
# WARNING:root:warning
# ERROR:root:error
# CRITICAL:root:critical
Notice that neither info nor debug is output, because their levels are not high enough.
By default:
- The lowest level displayed is warning, whose value is 30.
- Logs are printed to the console.
- The output format is: level:logger name:message
How do we change this default behavior? We configure it ourselves.
Custom Configuration
import logging

logging.basicConfig()
"""
Available parameters:
filename: create a FileHandler (a handler, explained later) with the given
          filename, so that logs are stored in that file.
filemode: the mode used to open the file; only meaningful when filename is
          given. Defaults to "a"; can also be set to "w".
format:   the log display format used by the handler.
datefmt:  the date/time format.
level:    the level of the root logger (the concept is explained later).
"""

# Example:
logging.basicConfig(
    filename="aaa.log",
    filemode="at",
    datefmt="%Y-%m-%d %H:%M:%S %p",
    format="%(asctime)s - %(name)s - %(levelname)s - %(module)s: %(message)s",
    level=10
)
All available format names
%(name)s: the logger's name (not the user's name)
%(levelno)s: the log level as a number
%(levelname)s: the log level as text
%(pathname)s: the full path of the module calling the logging function (may be unavailable)
%(filename)s: the file name of the module calling the logging function
%(module)s: the name of the module calling the logging function
%(funcName)s: the name of the function calling the logging function
%(lineno)d: the line number of the call to the logging function
%(created)f: the current time as a UNIX floating-point timestamp
%(relativeCreated)d: milliseconds elapsed since the logging module was loaded
%(asctime)s: the current time as a string; the default format is "2003-07-08 16:49:45,896" (milliseconds after the comma)
%(thread)d: the thread ID (may be unavailable)
%(threadName)s: the thread name (may be unavailable)
%(process)d: the process ID (may be unavailable)
%(message)s: the user's message
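To see a few of these names in action without touching the global configuration, you can apply a Formatter to a hand-built record; the logger name "demo", the path "example.py", and the message are arbitrary illustrative choices:

```python
import logging

# Build a Formatter using three of the names above, then format a
# hand-constructed record to see the substitution directly.
fmt = logging.Formatter("%(levelname)s - %(name)s - %(message)s")
record = logging.LogRecord(
    name="demo", level=logging.INFO, pathname="example.py",
    lineno=1, msg="hello", args=None, exc_info=None,
)
print(fmt.format(record))  # → INFO - demo - hello
```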
So far we can set up the basic configuration ourselves, but when we want to send the same log to several different destinations, this basic configuration is not enough.
For example, a login/registration feature needs to record logs and produce two copies: one for the programmer and one for the boss. The programmer's copy should be detailed, while the boss's should be simple, because he does not need to care about the program's details.
To achieve this, we need a systematic understanding of the logging module.
The Four Core Roles of the logging Module
1. Logger: the log generator; produces log records.
2. Filter: the log filter; filters log records.
3. Handler: the log processor; formats log records and outputs them to the specified location (console or file).
4. Formatter: defines the log format.
A log's complete life cycle
1. The logger generates a log record -> 2. Filters decide whether the record is filtered out -> 3. The record is distributed to all bound handlers -> 4. Each handler outputs the record according to its bound formatter.
In step 1, the logger checks the log level: if the record is below the logger's level, it is not processed.
Step 2 is rarely used in practice and relies on object-oriented techniques covered later.
Step 3 also checks the level: a handler does not output a record whose level is below the handler's own level.
The generator's level should be lower than the handler's level; otherwise setting the handler's level is meaningless. For example, if the handler is set to 20 and the generator to 30, records below 30 are never generated at all, so the handler's threshold never comes into play.
In step 4, if no formatter is specified, the default format is used.
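The interplay between step 1 and step 3 can be sketched with one logger and one handler; the logger name "level-demo" and the in-memory stream are illustrative choices, not part of the original example:

```python
import io
import logging

# Step 1 vs step 3: the logger's level gates generation, then each
# handler applies its own level before output.
logger = logging.getLogger("level-demo")
logger.setLevel(logging.DEBUG)            # generator level: 10

stream = io.StringIO()                    # in-memory stand-in for the console
handler = logging.StreamHandler(stream)
handler.setLevel(logging.INFO)            # handler level: 20
logger.addHandler(handler)

logger.debug("dropped by the handler")    # passes the logger, blocked by the handler
logger.info("reaches the stream")         # passes both level checks
print(stream.getvalue())
```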
Use of logging Roles (Understanding)
import logging

# Generator
logger1 = logging.getLogger("Log Object 1")

# File handlers
handler1 = logging.FileHandler("log1.log", encoding="utf-8")
handler2 = logging.FileHandler("log2.log", encoding="utf-8")

# Console handler
handler3 = logging.StreamHandler()

# Formatter objects
fmt1 = logging.Formatter(
    fmt="%(asctime)s - %(name)s - %(levelname)s: %(message)s",
    datefmt="%m-%d %H:%M:%S %p")
fmt2 = logging.Formatter(
    fmt="%(asctime)s - %(levelname)s : %(message)s",
    datefmt="%Y/%m/%d %H:%M:%S")

# Bind formatters to handlers
handler1.setFormatter(fmt1)
handler2.setFormatter(fmt2)
handler3.setFormatter(fmt1)

# Bind handlers to the generator
logger1.addHandler(handler1)
logger1.addHandler(handler2)
logger1.addHandler(handler3)

# Set log levels
logger1.setLevel(10)   # generator level
handler1.setLevel(20)  # handler level

# Test
logger1.debug("debug message")
logger1.info("info message")
logger1.warning("warning message")
logger1.critical("critical message")
So far we have met the requirement above, but this is not how we will ultimately do it, because writing this much code every time is painful.
Inheritance of logging (Understanding)
You can designate one logger as the child or descendant of another logger.
When such an inheritance relationship exists and a descendant logger receives a log record, the record is also passed up to its ancestors.
Specifying the inheritance relationship:
import logging

log1 = logging.getLogger("mother")
log2 = logging.getLogger("mother.son")
log3 = logging.getLogger("mother.son.grandson")

# Handler
fh = logging.FileHandler(filename="cc.log", encoding="utf-8")

# Formatter
fm = logging.Formatter("%(asctime)s - %(name)s - %(filename)s - %(message)s")

# Bind the handler
log1.addHandler(fh)
log2.addHandler(fh)
log3.addHandler(fh)

# Bind the formatter
fh.setFormatter(fm)

# Test
# log1.error("test")
# log2.error("test")
log3.error("test")

# Cancel upward propagation
log3.propagate = False

# Test again
log3.error("test")
Configuring the logging Module with a Dictionary (Important)
Writing configuration code by hand every time is cumbersome. We can write one complete configuration, save it, and reuse it directly later.
import logging.config

logging.config.dictConfig(LOGGING_DIC)
logging.getLogger("aa").debug("test")
LOGGING_DIC Template
standard_format = '[%(asctime)s][%(threadName)s:%(thread)d][task_id:%(name)s][%(filename)s:%(lineno)d]' \
                  '[%(levelname)s][%(message)s]'  # name is the name passed to getLogger
simple_format = '[%(levelname)s][%(asctime)s][%(filename)s:%(lineno)d]%(message)s'
id_simple_format = '[%(levelname)s][%(asctime)s] %(message)s'

logfile_path = "path of the log file"

LOGGING_DIC = {
    'version': 1,
    'disable_existing_loggers': False,
    'formatters': {
        'standard': {'format': standard_format},
        'simple': {'format': simple_format},
    },
    'filters': {},
    'handlers': {
        # Print logs to the terminal
        'console': {
            'level': 'DEBUG',
            'class': 'logging.StreamHandler',  # print to screen
            'formatter': 'simple'
        },
        # Write logs to a file, collecting records at this level and above
        'default': {
            'level': 'DEBUG',
            'class': 'logging.handlers.RotatingFileHandler',  # save to file
            'formatter': 'standard',
            'filename': logfile_path,   # log file
            'maxBytes': 1024*1024*5,    # maximum size of one log file: 5 MB
            'backupCount': 5,           # maximum number of backup files
            'encoding': 'utf-8',        # encoding of the log file
        },
    },
    'loggers': {
        # Configuration for the logger obtained with logging.getLogger("aa")
        'aa': {
            'handlers': ['default', 'console'],  # both handlers defined above: write to file and print to screen
            'level': 'DEBUG',
            'propagate': True,  # pass records upward (to ancestor loggers)
        },
    },
}
Supplement:
The argument to getLogger corresponds to a key in the loggers section of the dictionary. If no key matches, the system's default generator is returned. We can make a generator the default by giving it an empty string as its key in the dictionary:
'loggers': {
    # Set the key to the empty string
    '': {
        'handlers': ['default', 'console'],  # both handlers defined above: write to file and print to screen
        'level': 'DEBUG',
        'propagate': True,  # pass records upward (to ancestor loggers)
    },
},
Later, the module-level functions can be called directly to output logs:
logging.info("Test Information!")
Also, when we first used logging we did not specify a generator, yet it still worked, because the system provides a default generator named root.
Finally, to complete the earlier requirement:
A login/registration feature needs to record logs and produce two copies: one for the programmer and one for the boss. The programmer's copy should be detailed, while the boss's should be simple, because he does not need to care about the program's details.
# Format for the programmer
standard_format = '[%(asctime)s][%(threadName)s:%(thread)d][task_id:%(name)s][%(filename)s:%(lineno)d]' \
                  '[%(levelname)s][%(message)s]'  # name is the name passed to getLogger
logfile_path1 = "coder.log"

# Format for the boss
simple_format = '[%(levelname)s][%(asctime)s]%(message)s'
logfile_path2 = "boss.log"

LOGGING_DIC = {
    'version': 1,
    'disable_existing_loggers': False,
    'formatters': {
        'standard': {'format': standard_format},
        'simple': {'format': simple_format},
    },
    'filters': {},
    'handlers': {
        # Print logs to the terminal
        'console': {
            'level': 'DEBUG',
            'class': 'logging.StreamHandler',  # print to screen
            'formatter': 'simple'
        },
        # Detailed log file for the programmer
        'std': {
            'level': 'DEBUG',
            'class': 'logging.handlers.RotatingFileHandler',  # save to file
            'formatter': 'standard',
            'filename': logfile_path1,      # log file
            'maxBytes': 1024 * 1024 * 5,    # maximum size of one log file: 5 MB
            'backupCount': 5,               # maximum number of backup files
            'encoding': 'utf-8',            # encoding of the log file
        },
        # Simple log file for the boss
        'boss': {
            'level': 'DEBUG',
            'class': 'logging.handlers.RotatingFileHandler',  # save to file
            'formatter': 'simple',
            'filename': logfile_path2,      # log file
            'maxBytes': 1024 * 1024 * 5,    # maximum size of one log file: 5 MB
            'backupCount': 5,               # maximum number of backup files
            'encoding': 'utf-8',            # encoding of the log file
        },
    },
    'loggers': {
        # Configuration for the logger obtained with logging.getLogger("aa")
        'aa': {
            'handlers': ['std', 'console', 'boss'],  # all three handlers defined above, so log records go to three locations at once
            'level': 'INFO',
            'propagate': True,  # pass records upward (to ancestor loggers)
        },
    },
}