
Logger

Logger utility

Usage Documentation: Logger

CLASS DESCRIPTION
Logger

Creates and sets up a logger that formats log statements as JSON.

FUNCTION DESCRIPTION
log_uncaught_exception_hook

Callback function for sys.excepthook to use Logger to log uncaught exceptions

set_package_logger

Set an additional stream handler, formatter, and log level for aws_lambda_powertools package logger.

Logger

Logger(
    service: str | None = None,
    level: str | int | None = None,
    child: bool = False,
    sampling_rate: float | None = None,
    stream: IO[str] | None = None,
    logger_formatter: PowertoolsFormatter | None = None,
    logger_handler: Handler | None = None,
    log_uncaught_exceptions: bool = False,
    json_serializer: Callable[[dict], str] | None = None,
    json_deserializer: Callable[
        [dict | str | bool | int | float], str
    ]
    | None = None,
    json_default: Callable[[Any], Any] | None = None,
    datefmt: str | None = None,
    use_datetime_directive: bool = False,
    log_record_order: list[str] | None = None,
    utc: bool = False,
    use_rfc3339: bool = False,
    serialize_stacktrace: bool = True,
    buffer_config: LoggerBufferConfig | None = None,
    **kwargs,
)

Creates and sets up a logger that formats log statements as JSON.

Includes the service name and any additional key=value pairs in logs. Service name and log level can also be set explicitly via environment variables.

Environment variables

POWERTOOLS_SERVICE_NAME : str
    service name
POWERTOOLS_LOG_LEVEL : str
    logging level (e.g. INFO, DEBUG)
POWERTOOLS_LOGGER_SAMPLE_RATE : float
    sampling rate ranging from 0 to 1, where 1 means 100% sampling

PARAMETER DESCRIPTION
service

service name to be appended in logs, by default "service_undefined"

TYPE: str DEFAULT: None

level

The level to set. Can be a string representing the level name ('DEBUG', 'INFO', 'WARNING', 'ERROR', 'CRITICAL') or an integer representing the level value (10 for 'DEBUG', 20 for 'INFO', 30 for 'WARNING', 40 for 'ERROR', 50 for 'CRITICAL'). By default "INFO".

TYPE: str | int | None DEFAULT: None

child

create a child Logger named <service>.<caller_file_name>, False by default

TYPE: bool DEFAULT: False

sampling_rate

sample rate for debug calls within execution context, defaults to 0.0

TYPE: float | None DEFAULT: None

stream

valid output for a logging stream, by default sys.stdout

TYPE: IO[str] | None DEFAULT: None

logger_formatter

custom logging formatter that implements PowertoolsFormatter

TYPE: PowertoolsFormatter | None DEFAULT: None

logger_handler

custom logging handler e.g. logging.FileHandler("file.log")

TYPE: Handler | None DEFAULT: None

log_uncaught_exceptions

logs uncaught exceptions using sys.excepthook

See: https://docs.python.org/3/library/sys.html#sys.excepthook

TYPE: bool DEFAULT: False

buffer_config

logger buffer configuration; a construction sketch follows this parameter list

TYPE: LoggerBufferConfig | None DEFAULT: None
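
A minimal construction sketch for log buffering, assuming LoggerBufferConfig is importable from aws_lambda_powertools.logging.buffer (the import path is not shown on this page):

from aws_lambda_powertools import Logger
from aws_lambda_powertools.logging.buffer import LoggerBufferConfig  # assumed import path

# buffer lower-priority logs per invocation; max_bytes caps the in-memory buffer cache size
logger = Logger(service="payment", buffer_config=LoggerBufferConfig(max_bytes=20480))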

Parameters propagated to LambdaPowertoolsFormatter

json_serializer : Callable, optional
    function to serialize obj to a JSON formatted str, by default json.dumps
json_deserializer : Callable, optional
    function to deserialize str, bytes, or bytearray containing a JSON document to a Python obj, by default json.loads
json_default : Callable, optional
    function to coerce unserializable values, by default str(). Only used when no custom formatter is set
utc : bool, optional
    set logging timestamp to UTC, by default False to continue to use local time as per stdlib
log_record_order : list, optional
    set order of log keys when logging, by default ["level", "location", "message", "timestamp"]
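
For instance, json_default can coerce values the default serializer cannot handle; a sketch where the helper name is illustrative:

from decimal import Decimal
from aws_lambda_powertools import Logger

def to_str(value):  # illustrative helper: coerce unserializable values to their string form
    return str(value)

logger = Logger(service="payment", json_default=to_str)
logger.info({"amount": Decimal("10.50")})  # Decimal is coerced via json_default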

Example

Sets up structured logging in JSON for Lambda functions with an explicit service name

>>> from aws_lambda_powertools import Logger
>>> logger = Logger(service="payment")
>>>
>>> def handler(event, context):
        logger.info("Hello")

Sets up structured logging in JSON for Lambda functions using env vars

$ export POWERTOOLS_SERVICE_NAME="payment"
$ export POWERTOOLS_LOGGER_SAMPLE_RATE=0.01 # 1% debug sampling
>>> from aws_lambda_powertools import Logger
>>> logger = Logger()
>>>
>>> def handler(event, context):
        logger.info("Hello")

Append payment_id to a previously set up logger

>>> from aws_lambda_powertools import Logger
>>> logger = Logger(service="payment")
>>>
>>> def handler(event, context):
        logger.append_keys(payment_id=event["payment_id"])
        logger.info("Hello")

Create child Logger using logging inheritance via child param

>>> # app.py
>>> import another_file
>>> from aws_lambda_powertools import Logger
>>> logger = Logger(service="payment")
>>>
>>> # another_file.py
>>> from aws_lambda_powertools import Logger
>>> logger = Logger(service="payment", child=True)

Logging in UTC timezone

>>> # app.py
>>> import logging
>>> from aws_lambda_powertools import Logger
>>>
>>> logger = Logger(service="payment", utc=True)

Brings message as the first key in log statements

>>> # app.py
>>> import logging
>>> from aws_lambda_powertools import Logger
>>>
>>> logger = Logger(service="payment", log_record_order=["message"])

Logging to a file instead of standard output for testing

>>> # app.py
>>> import logging
>>> from aws_lambda_powertools import Logger
>>>
>>> logger = Logger(service="payment", logger_handler=logging.FileHandler("log.json"))
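
Enable debug log sampling for a fraction of invocations (a sketch; sampling_rate mirrors POWERTOOLS_LOGGER_SAMPLE_RATE and ranges from 0 to 1)

>>> from aws_lambda_powertools import Logger
>>> logger = Logger(service="payment", sampling_rate=0.1)  # ~10% of invocations emit DEBUG logs
>>>
>>> def handler(event, context):
        logger.debug("Only present on sampled invocations")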
RAISES DESCRIPTION
InvalidLoggerSamplingRateError

When sampling rate provided is not a float

METHOD DESCRIPTION
append_context_keys

Context manager to temporarily add logging keys.

clear_buffer

Clear the internal buffer cache.

clear_state

Removes all custom keys that were appended to the Logger.

flush_buffer

Flush all buffered log records associated with current execution.

get_correlation_id

Gets the correlation_id in the logging json

inject_lambda_context

Decorator to capture Lambda contextual info and inject into logger

refresh_sample_rate_calculation

Refreshes the sample rate calculation by reconfiguring logging settings.

set_correlation_id

Sets the correlation_id in the logging json

structure_logs

Sets logging formatting to JSON.

ATTRIBUTE DESCRIPTION
handlers

List of registered logging handlers

TYPE: list[Handler]

registered_formatter

Convenience property to access the first logger formatter

TYPE: BasePowertoolsFormatter

registered_handler

Convenience property to access the first logger handler

TYPE: Handler

Source code in aws_lambda_powertools/logging/logger.py
def __init__(
    self,
    service: str | None = None,
    level: str | int | None = None,
    child: bool = False,
    sampling_rate: float | None = None,
    stream: IO[str] | None = None,
    logger_formatter: PowertoolsFormatter | None = None,
    logger_handler: logging.Handler | None = None,
    log_uncaught_exceptions: bool = False,
    json_serializer: Callable[[dict], str] | None = None,
    json_deserializer: Callable[[dict | str | bool | int | float], str] | None = None,
    json_default: Callable[[Any], Any] | None = None,
    datefmt: str | None = None,
    use_datetime_directive: bool = False,
    log_record_order: list[str] | None = None,
    utc: bool = False,
    use_rfc3339: bool = False,
    serialize_stacktrace: bool = True,
    buffer_config: LoggerBufferConfig | None = None,
    **kwargs,
) -> None:
    self.service = resolve_env_var_choice(
        choice=service,
        env=os.getenv(constants.SERVICE_NAME_ENV, "service_undefined"),
    )
    self.sampling_rate = resolve_env_var_choice(
        choice=sampling_rate,
        env=os.getenv(constants.LOGGER_LOG_SAMPLING_RATE),
    )
    self._default_log_keys: dict[str, Any] = {"service": self.service, "sampling_rate": self.sampling_rate}
    self.child = child
    self.logger_formatter = logger_formatter
    self._stream = stream or sys.stdout

    self.log_uncaught_exceptions = log_uncaught_exceptions

    self._is_deduplication_disabled = resolve_truthy_env_var_choice(
        env=os.getenv(constants.LOGGER_LOG_DEDUPLICATION_ENV, "false"),
    )
    self._logger = self._get_logger()
    self.logger_handler = logger_handler or self._get_handler()

    # NOTE: This is primarily to improve UX, so IDEs can autocomplete LambdaPowertoolsFormatter options
    # previously, we masked all of them as kwargs thus limiting feature discovery
    formatter_options = {
        "json_serializer": json_serializer,
        "json_deserializer": json_deserializer,
        "json_default": json_default,
        "datefmt": datefmt,
        "use_datetime_directive": use_datetime_directive,
        "log_record_order": log_record_order,
        "utc": utc,
        "use_rfc3339": use_rfc3339,
        "serialize_stacktrace": serialize_stacktrace,
    }

    self._buffer_config = buffer_config
    if self._buffer_config:
        self._buffer_cache = LoggerBufferCache(max_size_bytes=self._buffer_config.max_bytes)

    # Used in case of sampling
    self.initial_log_level = self._determine_log_level(level)

    self._init_logger(
        formatter_options=formatter_options,
        log_level=level,
        buffer_config=self._buffer_config,
        buffer_cache=getattr(self, "_buffer_cache", None),
        **kwargs,
    )

    if self.log_uncaught_exceptions:
        logger.debug("Replacing exception hook")
        sys.excepthook = functools.partial(log_uncaught_exception_hook, logger=self)

handlers property

handlers: list[Handler]

List of registered logging handlers

Notes

Looking for the first configured handler? Use registered_handler property instead.

registered_formatter property

registered_formatter: BasePowertoolsFormatter

Convenience property to access the first logger formatter

registered_handler property

registered_handler: Handler

Convenience property to access the first logger handler

append_context_keys

append_context_keys(
    **additional_keys: Any,
) -> Generator[None, None, None]

Context manager to temporarily add logging keys.

PARAMETER DESCRIPTION
**additional_keys

Key-value pairs to include in the log context during the lifespan of the context manager.

TYPE: Any DEFAULT: {}

Example

Logging with contextual keys

logger = Logger(service="example_service")
with logger.append_context_keys(user_id="123", operation="process"):
    logger.info("Log with context")
logger.info("Log without context")
Source code in aws_lambda_powertools/logging/logger.py
@contextmanager
def append_context_keys(self, **additional_keys: Any) -> Generator[None, None, None]:
    """
    Context manager to temporarily add logging keys.

    Parameters
    -----------
    **additional_keys: Any
        Key-value pairs to include in the log context during the lifespan of the context manager.

    Example
    --------
    **Logging with contextual keys**

        logger = Logger(service="example_service")
        with logger.append_context_keys(user_id="123", operation="process"):
            logger.info("Log with context")
        logger.info("Log without context")
    """
    with self.registered_formatter.append_context_keys(**additional_keys):
        yield

clear_buffer

clear_buffer() -> None

Clear the internal buffer cache.

This method removes all items from the buffer cache, effectively resetting it to an empty state.

RETURNS DESCRIPTION
None
Source code in aws_lambda_powertools/logging/logger.py
def clear_buffer(self) -> None:
    """
    Clear the internal buffer cache.

    This method removes all items from the buffer cache, effectively resetting it to an empty state.

    Returns
    -------
    None
    """
    if self._buffer_config:
        self._buffer_cache.clear()

clear_state

clear_state() -> None

Removes all custom keys that were appended to the Logger.
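
A short usage sketch paired with append_keys, which adds the custom keys that clear_state removes:

logger = Logger(service="payment")
logger.append_keys(order_id="123")
logger.info("includes order_id")
logger.clear_state()
logger.info("back to default keys only")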

Source code in aws_lambda_powertools/logging/logger.py
def clear_state(self) -> None:
    """Removes all custom keys that were appended to the Logger."""
    # Clear all custom keys from the formatter
    self.registered_formatter.clear_state()

    # Reset to default keys
    self.structure_logs(**self._default_log_keys)

flush_buffer

flush_buffer() -> None

Flush all buffered log records associated with current execution.

Notes

Retrieves log records for the current trace from the buffer
Immediately processes and logs each record
Warns if some cache entries were evicted in that execution
Clears the buffer after complete processing

RAISES DESCRIPTION
Any exceptions from underlying logging or buffer mechanisms
will be propagated to caller
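
A usage sketch, assuming LoggerBufferConfig is importable from aws_lambda_powertools.logging.buffer (the import path is not shown here):

from aws_lambda_powertools import Logger
from aws_lambda_powertools.logging.buffer import LoggerBufferConfig  # assumed import path

logger = Logger(service="payment", buffer_config=LoggerBufferConfig(max_bytes=20480))

def handler(event, context):
    logger.debug("buffered until flushed or evicted")
    try:
        ...  # business logic
    except Exception:
        logger.flush_buffer()  # emit buffered records before re-raising
        raise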
Source code in aws_lambda_powertools/logging/logger.py
def flush_buffer(self) -> None:
    """
    Flush all buffered log records associated with current execution.

    Notes
    -----
    Retrieves log records for current trace from buffer
    Immediately processes and logs each record
    Warning if some cache was evicted in that execution
    Clears buffer after complete processing

    Raises
    ------
    Any exceptions from underlying logging or buffer mechanisms
    will be propagated to caller
    """

    tracer_id = get_tracer_id()

    # Flushing log without a tracer id? Return
    if not tracer_id:
        return

    # is buffer empty? return
    buffer = self._buffer_cache.get(tracer_id)
    if not buffer:
        return

    if not self._buffer_config:
        return

    # Check ALC level against buffer level
    lambda_log_level = self._get_aws_lambda_log_level()
    if lambda_log_level:
        # Check if buffer level is less verbose than ALC
        if logging.getLevelName(lambda_log_level) > logging.getLevelName(self._buffer_config.buffer_at_verbosity):
            warnings.warn(
                "Advanced Logging Controls (ALC) Log Level is less verbose than Log Buffering Log Level. "
                "Some logs might be missing",
                PowertoolsUserWarning,
                stacklevel=2,
            )

    # Process log records
    for log_line in buffer:
        self._create_and_flush_log_record(log_line)

    # Has items evicted?
    if self._buffer_cache.has_items_evicted(tracer_id):
        warnings.warn(
            message="Some logs are not displayed because they were evicted from the buffer. "
            "Increase buffer size to store more logs in the buffer",
            category=PowertoolsUserWarning,
            stacklevel=2,
        )

    # Clear the entire cache
    self._buffer_cache.clear()

get_correlation_id

get_correlation_id() -> str | None

Gets the correlation_id in the logging json

RETURNS DESCRIPTION
(str, optional)

Value for the correlation id

Source code in aws_lambda_powertools/logging/logger.py
def get_correlation_id(self) -> str | None:
    """Gets the correlation_id in the logging json

    Returns
    -------
    str, optional
        Value for the correlation id
    """
    if isinstance(self.registered_formatter, LambdaPowertoolsFormatter):
        return self.registered_formatter.log_format.get("correlation_id")
    return None

inject_lambda_context

inject_lambda_context(
    lambda_handler: AnyCallableT,
    log_event: bool | None = None,
    correlation_id_path: str | None = None,
    clear_state: bool | None = False,
    flush_buffer_on_uncaught_error: bool = False,
) -> AnyCallableT
inject_lambda_context(
    lambda_handler: None = None,
    log_event: bool | None = None,
    correlation_id_path: str | None = None,
    clear_state: bool | None = False,
    flush_buffer_on_uncaught_error: bool = False,
) -> Callable[[AnyCallableT], AnyCallableT]
inject_lambda_context(
    lambda_handler: AnyCallableT | None = None,
    log_event: bool | None = None,
    correlation_id_path: str | None = None,
    clear_state: bool | None = False,
    flush_buffer_on_uncaught_error: bool = False,
) -> Any

Decorator to capture Lambda contextual info and inject into logger

PARAMETER DESCRIPTION
clear_state

Instructs logger to remove any custom keys previously added

TYPE: bool DEFAULT: False

lambda_handler

Method to inject the lambda context

TYPE: Callable DEFAULT: None

log_event

Instructs logger to log Lambda Event, by default False

TYPE: bool DEFAULT: None

correlation_id_path

Optional JMESPath for the correlation_id

TYPE: str | None DEFAULT: None

Environment variables

POWERTOOLS_LOGGER_LOG_EVENT : str instruct logger to log Lambda Event (e.g. "true", "True", "TRUE")

Example

Captures Lambda contextual runtime info (e.g memory, arn, req_id)

from aws_lambda_powertools import Logger

logger = Logger(service="payment")

@logger.inject_lambda_context
def handler(event, context):
    logger.info("Hello")

Captures Lambda contextual runtime info and logs incoming request

from aws_lambda_powertools import Logger

logger = Logger(service="payment")

@logger.inject_lambda_context(log_event=True)
def handler(event, context):
    logger.info("Hello")
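
Captures Lambda contextual runtime info and sets the correlation id from the event (a sketch; the JMESPath shown is illustrative)

from aws_lambda_powertools import Logger

logger = Logger(service="payment")

@logger.inject_lambda_context(correlation_id_path="headers.my_request_id_header")
def handler(event, context):
    logger.info("Hello")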
RETURNS DESCRIPTION
decorate

Decorated lambda handler

TYPE: Callable

Source code in aws_lambda_powertools/logging/logger.py
def inject_lambda_context(
    self,
    lambda_handler: AnyCallableT | None = None,
    log_event: bool | None = None,
    correlation_id_path: str | None = None,
    clear_state: bool | None = False,
    flush_buffer_on_uncaught_error: bool = False,
) -> Any:
    """Decorator to capture Lambda contextual info and inject into logger

    Parameters
    ----------
    clear_state : bool, optional
        Instructs logger to remove any custom keys previously added
    lambda_handler : Callable
        Method to inject the lambda context
    log_event : bool, optional
        Instructs logger to log Lambda Event, by default False
    correlation_id_path: str, optional
        Optional JMESPath for the correlation_id

    Environment variables
    ---------------------
    POWERTOOLS_LOGGER_LOG_EVENT : str
        instruct logger to log Lambda Event (e.g. `"true", "True", "TRUE"`)

    Example
    -------
    **Captures Lambda contextual runtime info (e.g memory, arn, req_id)**

        from aws_lambda_powertools import Logger

        logger = Logger(service="payment")

        @logger.inject_lambda_context
        def handler(event, context):
            logger.info("Hello")

    **Captures Lambda contextual runtime info and logs incoming request**

        from aws_lambda_powertools import Logger

        logger = Logger(service="payment")

        @logger.inject_lambda_context(log_event=True)
        def handler(event, context):
            logger.info("Hello")

    Returns
    -------
    decorate : Callable
        Decorated lambda handler
    """

    # If handler is None we've been called with parameters
    # Return a partial function with args filled
    if lambda_handler is None:
        logger.debug("Decorator called with parameters")
        return functools.partial(
            self.inject_lambda_context,
            log_event=log_event,
            correlation_id_path=correlation_id_path,
            clear_state=clear_state,
            flush_buffer_on_uncaught_error=flush_buffer_on_uncaught_error,
        )

    log_event = resolve_truthy_env_var_choice(
        env=os.getenv(constants.LOGGER_LOG_EVENT_ENV, "false"),
        choice=log_event,
    )

    @functools.wraps(lambda_handler)
    def decorate(event, context, *args, **kwargs):
        lambda_context = build_lambda_context_model(context)
        cold_start = _is_cold_start()

        if clear_state:
            self.structure_logs(cold_start=cold_start, **lambda_context.__dict__)
        else:
            self.append_keys(cold_start=cold_start, **lambda_context.__dict__)

        if correlation_id_path:
            self.set_correlation_id(
                jmespath_utils.query(envelope=correlation_id_path, data=event),
            )

        if log_event:
            logger.debug("Event received")
            self.info(extract_event_from_common_models(event))

        # Sampling rate is defined, and this is not ColdStart
        # then we need to recalculate the sampling
        # See: https://github.com/aws-powertools/powertools-lambda-python/issues/6141
        if self.sampling_rate and not cold_start:
            self.refresh_sample_rate_calculation()

        try:
            # Execute the Lambda handler with provided event and context
            return lambda_handler(event, context, *args, **kwargs)
        except:
            # Flush the log buffer if configured to do so on uncaught errors
            # Ensures logging state is cleaned up even if an exception is raised
            if flush_buffer_on_uncaught_error:
                logger.debug("Uncaught error detected, flushing log buffer before exit")
                self.flush_buffer()
            # Re-raise any exceptions that occur during handler execution
            raise
        finally:
            # Clear the cache after invocation is complete
            if self._buffer_config:
                self._buffer_cache.clear()

    return decorate

refresh_sample_rate_calculation

refresh_sample_rate_calculation() -> None

Refreshes the sample rate calculation by reconfiguring logging settings.

RETURNS DESCRIPTION
None
Source code in aws_lambda_powertools/logging/logger.py
def refresh_sample_rate_calculation(self) -> None:
    """
    Refreshes the sample rate calculation by reconfiguring logging settings.

    Returns
    -------
        None
    """
    self._logger.setLevel(self.initial_log_level)
    self._configure_sampling()

set_correlation_id

set_correlation_id(value: str | None) -> None

Sets the correlation_id in the logging json

PARAMETER DESCRIPTION
value

Value for the correlation id. None will remove the correlation_id

TYPE: str
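
A short sketch pairing set_correlation_id with get_correlation_id:

logger = Logger(service="payment")
logger.set_correlation_id("8f5d4a36")
logger.info("Collecting payment")   # log record now carries correlation_id
logger.get_correlation_id()         # returns "8f5d4a36" when the Powertools formatter is registered
logger.set_correlation_id(None)     # removes correlation_id from subsequent logs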

Source code in aws_lambda_powertools/logging/logger.py
def set_correlation_id(self, value: str | None) -> None:
    """Sets the correlation_id in the logging json

    Parameters
    ----------
    value : str, optional
        Value for the correlation id. None will remove the correlation_id
    """
    self.append_keys(correlation_id=value)

structure_logs

structure_logs(
    append: bool = False,
    formatter_options: dict | None = None,
    **keys,
) -> None

Sets logging formatting to JSON.

Optionally, it can append keyword arguments to an existing logger, so it is available across future log statements.

Last keyword argument and value wins if duplicated.

PARAMETER DESCRIPTION
append

append keys provided to logger formatter, by default False

TYPE: bool DEFAULT: False

formatter_options

LambdaPowertoolsFormatter options to be propagated, by default {}

TYPE: dict DEFAULT: None
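
A minimal sketch of the two key modes; the source notes append mode is legacy in favour of append_keys:

logger = Logger(service="payment")
logger.structure_logs(append=True, request_id="abc-123")  # keep existing keys, add request_id
logger.structure_logs(request_id="abc-123")               # discard existing custom keys, keep defaults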

Source code in aws_lambda_powertools/logging/logger.py
def structure_logs(self, append: bool = False, formatter_options: dict | None = None, **keys) -> None:
    """Sets logging formatting to JSON.

    Optionally, it can append keyword arguments
    to an existing logger, so it is available across future log statements.

    Last keyword argument and value wins if duplicated.

    Parameters
    ----------
    append : bool, optional
        append keys provided to logger formatter, by default False
    formatter_options : dict, optional
        LambdaPowertoolsFormatter options to be propagated, by default {}
    """
    formatter_options = formatter_options or {}

    # There are 3 operational modes for this method
    ## 1. Register a Powertools for AWS Lambda (Python) Formatter for the first time
    ## 2. Append new keys to the current logger formatter; deprecated in favour of append_keys
    ## 3. Add new keys and discard existing to the registered formatter

    # Mode 1
    log_keys = {**self._default_log_keys, **keys}
    is_logger_preconfigured = getattr(self._logger, LOGGER_ATTRIBUTE_PRECONFIGURED, False)
    if not is_logger_preconfigured:
        formatter = self.logger_formatter or LambdaPowertoolsFormatter(**formatter_options, **log_keys)
        self.registered_handler.setFormatter(formatter)

        # when using a custom Powertools for AWS Lambda (Python) Formatter
        # standard and custom keys that are not Powertools for AWS Lambda (Python) Formatter parameters
        # should be appended and custom keys that might happen to be Powertools for AWS Lambda (Python)
        # Formatter parameters should be discarded this prevents adding them as custom keys, for example,
        # `json_default=<callable>` see https://github.com/aws-powertools/powertools-lambda-python/issues/1263
        custom_keys = {k: v for k, v in log_keys.items() if k not in RESERVED_FORMATTER_CUSTOM_KEYS}
        return self.registered_formatter.append_keys(**custom_keys)

    # Mode 2 (legacy)
    if append:
        # Maintenance: Add deprecation warning for major version
        return self.append_keys(**keys)

    # Mode 3
    self.registered_formatter.clear_state()
    self.registered_formatter.thread_safe_clear_keys()
    self.registered_formatter.append_keys(**log_keys)

log_uncaught_exception_hook

log_uncaught_exception_hook(
    exc_type, exc_value, exc_traceback, logger: Logger
) -> None

Callback function for sys.excepthook to use Logger to log uncaught exceptions

Source code in aws_lambda_powertools/logging/logger.py
def log_uncaught_exception_hook(exc_type, exc_value, exc_traceback, logger: Logger) -> None:
    """Callback function for sys.excepthook to use Logger to log uncaught exceptions"""
    logger.exception(exc_value, exc_info=(exc_type, exc_value, exc_traceback))  # pragma: no cover

set_package_logger

set_package_logger(
    level: str | int = logging.DEBUG,
    stream: IO[str] | None = None,
    formatter: Formatter | None = None,
) -> None

Set an additional stream handler, formatter, and log level for aws_lambda_powertools package logger.

Package logging is suppressed by default (NullHandler) and should only be used for debugging. This is separate from the application Logger class utility.

Example

Enables debug logging for Powertools for AWS Lambda (Python) package

>>> from aws_lambda_powertools.logging.logger import set_package_logger
>>> set_package_logger()
PARAMETER DESCRIPTION
level

log level, DEBUG by default

TYPE: str | int DEFAULT: DEBUG

stream

log stream, stdout by default

TYPE: IO[str] | None DEFAULT: None

formatter

log formatter, "%(asctime)s %(name)s [%(levelname)s] %(message)s" by default

TYPE: Formatter | None DEFAULT: None

Source code in aws_lambda_powertools/logging/logger.py
def set_package_logger(
    level: str | int = logging.DEBUG,
    stream: IO[str] | None = None,
    formatter: logging.Formatter | None = None,
) -> None:
    """Set an additional stream handler, formatter, and log level for aws_lambda_powertools package logger.

    **Package log by default is suppressed (NullHandler), this should only used for debugging.
    This is separate from application Logger class utility**

    Example
    -------
    **Enables debug logging for Powertools for AWS Lambda (Python) package**

        >>> from aws_lambda_powertools.logging.logger import set_package_logger
        >>> set_package_logger()

    Parameters
    ----------
    level: str, int
        log level, DEBUG by default
    stream: sys.stdout
        log stream, stdout by default
    formatter: logging.Formatter
        log formatter, "%(asctime)s %(name)s [%(levelname)s] %(message)s" by default
    """
    if formatter is None:
        formatter = logging.Formatter("%(asctime)s %(name)s [%(levelname)s] %(message)s")

    if stream is None:
        stream = sys.stdout

    logger = logging.getLogger("aws_lambda_powertools")
    logger.setLevel(level)
    handler = logging.StreamHandler(stream)
    handler.setFormatter(formatter)
    logger.addHandler(handler)