Python’s standard library includes the `logging` module, which provides a flexible framework for emitting log messages from programs. It is an invaluable tool for tracking events that occur during execution, which is especially helpful for understanding program flow and diagnosing issues.

### Log Levels

Python’s logging module defines the following log levels, listed in increasing order of severity (a short snippet after the list shows how the configured level filters messages):

1. **DEBUG**: Detailed information, typically of interest only when diagnosing problems.
2. **INFO**: Confirmation that things are working as expected.
3. **WARNING**: An indication that something unexpected happened, or indicative of some problem in the near future (e.g., ‘disk space low’). The software is still working as expected.
4. **ERROR**: Due to a more serious problem, the software has not been able to perform some function.
5. **CRITICAL**: A very serious error, indicating that the program itself may be unable to continue running.
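
As a quick illustration, a minimal sketch: when the level is set to `WARNING`, any message below that threshold is discarded.

```python
import logging

# Only WARNING and above will be emitted by the root logger
logging.basicConfig(level=logging.WARNING)

logging.debug("Not shown: DEBUG is below the WARNING threshold")
logging.info("Not shown: INFO is below the WARNING threshold")
logging.warning("Shown: meets the WARNING threshold")
logging.error("Shown: ERROR is above the WARNING threshold")
```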

### Configuring Logging Format

The format of log messages can be configured to include information such as the time of the message, the log level, and the message itself.

Here’s an example of basic configuration using `logging.basicConfig`:

```python
import logging

logging.basicConfig(
    level=logging.DEBUG,
    format='%(asctime)s - %(name)s - %(levelname)s - %(message)s',
    datefmt='%Y-%m-%d %H:%M:%S'
)

logging.debug("This is a debug message")
logging.info("This is an info message")
logging.warning("This is a warning message")
logging.error("This is an error message")
logging.critical("This is a critical message")
```

### Handlers

The logging module supports different types of handlers, which are used to direct log messages to specific destinations. Some common handlers include:

- **StreamHandler**: Sends log messages to streams such as `sys.stdout` or `sys.stderr`.
- **FileHandler**: Sends log messages to a file.
- **RotatingFileHandler**: Similar to `FileHandler`, but it rotates the log file when it reaches a certain size (see the sketch after this list).
- **SMTPHandler**: Sends log messages via email.
- **HTTPHandler**: Sends log messages to a web server.
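
Here is a minimal `RotatingFileHandler` sketch; the file name, size limit, and backup count are arbitrary values chosen for illustration:

```python
import logging
from logging.handlers import RotatingFileHandler

logger = logging.getLogger("rotating_example")
logger.setLevel(logging.INFO)

# Keep each log file under ~1 MB and retain up to 3 rotated files
# (app.log.1, app.log.2, app.log.3); the values here are illustrative.
handler = RotatingFileHandler("app.log", maxBytes=1_000_000, backupCount=3)
handler.setFormatter(
    logging.Formatter("%(asctime)s - %(name)s - %(levelname)s - %(message)s")
)
logger.addHandler(handler)

logger.info("This message goes to app.log, which rolls over when it grows too large")
```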

### Logging to a File

To log messages to a file, you can use a `FileHandler`:

```python
import logging

# Create a custom logger
logger = logging.getLogger(__name__)

# Create handlers
c_handler = logging.StreamHandler()
f_handler = logging.FileHandler('file.log')

# Set different logging levels
c_handler.setLevel(logging.WARNING)
f_handler.setLevel(logging.ERROR)

# Create formatters and add them to handlers
c_format = logging.Formatter('%(name)s - %(levelname)s - %(message)s')
f_format = logging.Formatter('%(asctime)s - %(name)s - %(levelname)s - %(message)s')

c_handler.setFormatter(c_format)
f_handler.setFormatter(f_format)

# Add handlers to the logger
logger.addHandler(c_handler)
logger.addHandler(f_handler)

logger.warning('This is a warning')  # Will be printed to the console
logger.error('This is an error')     # Will be printed to the console and written to the file
```

### Real-world Examples

1. **Tracking Errors in Scripts**
- You can use the logging module to track errors or failed operations in your script. For example, if a data-processing script encounters a serious issue, logging lets you capture the error message along with relevant metadata, which can be crucial for debugging (a minimal sketch follows this list).

2. **Monitoring ETL Pipelines**
- In ETL (Extract, Transform, Load) pipelines, logging can be used for auditing and troubleshooting data processing. For instance, you can log the start and end of each operation (extracting data, transforming it, and loading it to a destination) to ensure that all steps are being executed properly and to track down which step might have failed in case of errors.
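
For the first example, a minimal sketch might look like the following; the `process_file` function and the `data.csv` path are hypothetical placeholders:

```python
import logging

logging.basicConfig(
    filename='script_errors.log',
    level=logging.INFO,
    format='%(asctime)s - %(levelname)s - %(message)s',
)

def process_file(path):
    # Hypothetical processing step, used only for illustration
    with open(path) as f:
        return len(f.readlines())

try:
    count = process_file('data.csv')
    logging.info('Processed %d lines', count)
except OSError:
    # exc_info=True records the full traceback alongside the message
    logging.error('Failed to process data.csv', exc_info=True)
```

The second example, an ETL pipeline that logs the start and finish of each stage and records any unhandled exception with its traceback, is sketched below: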

```python
import logging

logger = logging.getLogger('ETL_Pipeline')

def extract_data():
    logger.info('Started extracting data')
    # Extraction logic...
    logger.info('Finished extracting data')

def transform_data():
    logger.info('Started transforming data')
    # Transformation logic...
    logger.info('Finished transforming data')

def load_data():
    logger.info('Started loading data')
    # Loading logic...
    logger.info('Finished loading data')

if __name__ == '__main__':
    handler = logging.FileHandler('etl_pipeline.log')
    formatter = logging.Formatter('%(asctime)s - %(name)s - %(levelname)s - %(message)s')
    handler.setFormatter(formatter)
    logger.addHandler(handler)
    logger.setLevel(logging.INFO)

    try:
        extract_data()
        transform_data()
        load_data()
    except Exception as e:
        logger.error(f"Exception occurred: {e}", exc_info=True)
```

In conclusion, Python’s logging module is a versatile and powerful tool for recording application behavior and critical events. By using appropriate log levels, handlers, and formats, developers can capture detailed information needed to monitor, debug, and optimize their applications.
