diff --git a/CHANGELOG.md b/CHANGELOG.md
index 8cd2040d780..95bfb7ad91e 100644
--- a/CHANGELOG.md
+++ b/CHANGELOG.md
@@ -4,6 +4,24 @@
# Unreleased
+## Documentation
+
+* **readme:** add lambda layer latest version badge
+
+## Features
+
+* **parser:** add KinesisFirehoseModel ([#1556](https://github.com/awslabs/aws-lambda-powertools-python/issues/1556))
+
+## Maintenance
+
+* **deps-dev:** bump types-requests from 2.28.11.1 to 2.28.11.2 ([#1576](https://github.com/awslabs/aws-lambda-powertools-python/issues/1576))
+* **deps-dev:** bump typing-extensions from 4.3.0 to 4.4.0 ([#1575](https://github.com/awslabs/aws-lambda-powertools-python/issues/1575))
+* **layer:** remove unused GetFunction permission for the canary
+* **layer:** bump to latest version 37
+
+
+
+## [v1.30.0] - 2022-10-05
## Bug Fixes
* **apigateway:** update Response class to require status_code only ([#1560](https://github.com/awslabs/aws-lambda-powertools-python/issues/1560))
@@ -29,22 +47,23 @@
## Maintenance
-* **deps:** bump codecov/codecov-action from 3.1.0 to 3.1.1 ([#1529](https://github.com/awslabs/aws-lambda-powertools-python/issues/1529))
+* **dep:** bump pyproject to pypi sync
+* **deps:** bump fastjsonschema from 2.16.1 to 2.16.2 ([#1530](https://github.com/awslabs/aws-lambda-powertools-python/issues/1530))
* **deps:** bump actions/setup-python from 3 to 4 ([#1528](https://github.com/awslabs/aws-lambda-powertools-python/issues/1528))
-* **deps:** bump email-validator from 1.2.1 to 1.3.0 ([#1533](https://github.com/awslabs/aws-lambda-powertools-python/issues/1533))
+* **deps:** bump codecov/codecov-action from 3.1.0 to 3.1.1 ([#1529](https://github.com/awslabs/aws-lambda-powertools-python/issues/1529))
* **deps:** bump dependabot/fetch-metadata from 1.3.3 to 1.3.4 ([#1565](https://github.com/awslabs/aws-lambda-powertools-python/issues/1565))
-* **deps:** bump fastjsonschema from 2.16.1 to 2.16.2 ([#1530](https://github.com/awslabs/aws-lambda-powertools-python/issues/1530))
-* **deps-dev:** bump mypy-boto3-s3 from 1.24.36.post1 to 1.24.76 ([#1531](https://github.com/awslabs/aws-lambda-powertools-python/issues/1531))
+* **deps:** bump email-validator from 1.2.1 to 1.3.0 ([#1533](https://github.com/awslabs/aws-lambda-powertools-python/issues/1533))
* **deps-dev:** bump mypy-boto3-secretsmanager from 1.24.54 to 1.24.83 ([#1557](https://github.com/awslabs/aws-lambda-powertools-python/issues/1557))
+* **deps-dev:** bump mkdocs-material from 8.5.3 to 8.5.4 ([#1563](https://github.com/awslabs/aws-lambda-powertools-python/issues/1563))
* **deps-dev:** bump pytest-cov from 3.0.0 to 4.0.0 ([#1551](https://github.com/awslabs/aws-lambda-powertools-python/issues/1551))
* **deps-dev:** bump flake8-bugbear from 22.9.11 to 22.9.23 ([#1541](https://github.com/awslabs/aws-lambda-powertools-python/issues/1541))
-* **deps-dev:** bump mypy-boto3-ssm from 1.24.80 to 1.24.81 ([#1544](https://github.com/awslabs/aws-lambda-powertools-python/issues/1544))
+* **deps-dev:** bump types-requests from 2.28.11 to 2.28.11.1 ([#1571](https://github.com/awslabs/aws-lambda-powertools-python/issues/1571))
* **deps-dev:** bump mypy-boto3-ssm from 1.24.69 to 1.24.80 ([#1542](https://github.com/awslabs/aws-lambda-powertools-python/issues/1542))
* **deps-dev:** bump mako from 1.2.2 to 1.2.3 ([#1537](https://github.com/awslabs/aws-lambda-powertools-python/issues/1537))
* **deps-dev:** bump types-requests from 2.28.10 to 2.28.11 ([#1538](https://github.com/awslabs/aws-lambda-powertools-python/issues/1538))
-* **deps-dev:** bump mkdocs-material from 8.5.3 to 8.5.4 ([#1563](https://github.com/awslabs/aws-lambda-powertools-python/issues/1563))
-* **deps-dev:** bump types-requests from 2.28.11 to 2.28.11.1 ([#1571](https://github.com/awslabs/aws-lambda-powertools-python/issues/1571))
* **deps-dev:** bump mkdocs-material from 8.5.1 to 8.5.3 ([#1532](https://github.com/awslabs/aws-lambda-powertools-python/issues/1532))
+* **deps-dev:** bump mypy-boto3-ssm from 1.24.80 to 1.24.81 ([#1544](https://github.com/awslabs/aws-lambda-powertools-python/issues/1544))
+* **deps-dev:** bump mypy-boto3-s3 from 1.24.36.post1 to 1.24.76 ([#1531](https://github.com/awslabs/aws-lambda-powertools-python/issues/1531))
* **docs:** bump layer version to 36 (1.29.2)
* **layers:** add dummy v2 layer automation
* **lint:** use new isort black integration
@@ -2378,7 +2397,8 @@
* Merge pull request [#5](https://github.com/awslabs/aws-lambda-powertools-python/issues/5) from jfuss/feat/python38
-[Unreleased]: https://github.com/awslabs/aws-lambda-powertools-python/compare/v1.29.2...HEAD
+[Unreleased]: https://github.com/awslabs/aws-lambda-powertools-python/compare/v1.30.0...HEAD
+[v1.30.0]: https://github.com/awslabs/aws-lambda-powertools-python/compare/v1.29.2...v1.30.0
[v1.29.2]: https://github.com/awslabs/aws-lambda-powertools-python/compare/v1.29.1...v1.29.2
[v1.29.1]: https://github.com/awslabs/aws-lambda-powertools-python/compare/v1.29.0...v1.29.1
[v1.29.0]: https://github.com/awslabs/aws-lambda-powertools-python/compare/v1.28.0...v1.29.0
diff --git a/README.md b/README.md
index 2065a983342..c1845f43ce7 100644
--- a/README.md
+++ b/README.md
@@ -1,8 +1,9 @@
+
# AWS Lambda Powertools for Python
[](https://github.com/awslabs/aws-lambda-powertools-python/actions/workflows/python_build.yml)
[](https://app.codecov.io/gh/awslabs/aws-lambda-powertools-python)
-  
+   
[](https://discord.gg/B8zZKbbyET)
A suite of Python utilities for AWS Lambda functions to ease adopting best practices such as tracing, structured logging, custom metrics, and more. (AWS Lambda Powertools [Java](https://github.com/awslabs/aws-lambda-powertools-java) and [Typescript](https://github.com/awslabs/aws-lambda-powertools-typescript) are also available).
diff --git a/aws_lambda_powertools/metrics/metrics.py b/aws_lambda_powertools/metrics/metrics.py
index 00e083d4a7f..cbf1d2eb2e2 100644
--- a/aws_lambda_powertools/metrics/metrics.py
+++ b/aws_lambda_powertools/metrics/metrics.py
@@ -77,7 +77,8 @@ def __init__(self, service: Optional[str] = None, namespace: Optional[str] = Non
self.namespace: Optional[str] = namespace
self.metadata_set = self._metadata
self.default_dimensions = self._default_dimensions
- self.dimension_set = {**self._default_dimensions, **self._dimensions}
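+        # Keep a reference to the shared dimensions dict (rather than building a new merged
+        # copy) so dimension sets are reused across Metrics instances; defaults merge in place.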
+ self.dimension_set = self._dimensions
+ self.dimension_set.update(**self._default_dimensions)
super().__init__(
metric_set=self.metric_set,
diff --git a/aws_lambda_powertools/shared/functions.py b/aws_lambda_powertools/shared/functions.py
index 2212eb77e18..30070382d31 100644
--- a/aws_lambda_powertools/shared/functions.py
+++ b/aws_lambda_powertools/shared/functions.py
@@ -71,7 +71,7 @@ def resolve_env_var_choice(
def base64_decode(value: str) -> bytes:
try:
- logger.debug("Decoding base64 Kafka record item before parsing")
+ logger.debug("Decoding base64 record item before parsing")
return base64.b64decode(value)
except (BinAsciiError, TypeError):
raise ValueError("base64 decode failed")
diff --git a/aws_lambda_powertools/utilities/parser/envelopes/__init__.py b/aws_lambda_powertools/utilities/parser/envelopes/__init__.py
index 4b0e4c943a2..0f985f29d88 100644
--- a/aws_lambda_powertools/utilities/parser/envelopes/__init__.py
+++ b/aws_lambda_powertools/utilities/parser/envelopes/__init__.py
@@ -6,6 +6,7 @@
from .event_bridge import EventBridgeEnvelope
from .kafka import KafkaEnvelope
from .kinesis import KinesisDataStreamEnvelope
+from .kinesis_firehose import KinesisFirehoseEnvelope
from .lambda_function_url import LambdaFunctionUrlEnvelope
from .sns import SnsEnvelope, SnsSqsEnvelope
from .sqs import SqsEnvelope
@@ -17,6 +18,7 @@
"DynamoDBStreamEnvelope",
"EventBridgeEnvelope",
"KinesisDataStreamEnvelope",
+ "KinesisFirehoseEnvelope",
"LambdaFunctionUrlEnvelope",
"SnsEnvelope",
"SnsSqsEnvelope",
diff --git a/aws_lambda_powertools/utilities/parser/envelopes/kinesis.py b/aws_lambda_powertools/utilities/parser/envelopes/kinesis.py
index 9ff221a7b7b..24104ebd40c 100644
--- a/aws_lambda_powertools/utilities/parser/envelopes/kinesis.py
+++ b/aws_lambda_powertools/utilities/parser/envelopes/kinesis.py
@@ -16,7 +16,7 @@ class KinesisDataStreamEnvelope(BaseEnvelope):
Regardless of its type it'll be parsed into a BaseModel object.
Note: Records will be parsed the same way so if model is str,
- all items in the list will be parsed as str and npt as JSON (and vice versa)
+ all items in the list will be parsed as str and not as JSON (and vice versa)
"""
def parse(self, data: Optional[Union[Dict[str, Any], Any]], model: Type[Model]) -> List[Optional[Model]]:
diff --git a/aws_lambda_powertools/utilities/parser/envelopes/kinesis_firehose.py b/aws_lambda_powertools/utilities/parser/envelopes/kinesis_firehose.py
new file mode 100644
index 00000000000..c8dd936512c
--- /dev/null
+++ b/aws_lambda_powertools/utilities/parser/envelopes/kinesis_firehose.py
@@ -0,0 +1,47 @@
+import logging
+from typing import Any, Dict, List, Optional, Type, Union, cast
+
+from ..models import KinesisFirehoseModel
+from ..types import Model
+from .base import BaseEnvelope
+
+logger = logging.getLogger(__name__)
+
+
+class KinesisFirehoseEnvelope(BaseEnvelope):
+ """Kinesis Firehose Envelope to extract array of Records
+
+    The record's data parameter is a base64-encoded string that is decoded into bytes,
+    though it can also be a JSON-encoded string.
+    Regardless of its type, it'll be parsed into a BaseModel object.
+
+ Note: Records will be parsed the same way so if model is str,
+ all items in the list will be parsed as str and not as JSON (and vice versa)
+
+ https://docs.aws.amazon.com/lambda/latest/dg/services-kinesisfirehose.html
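+
+    Example
+    -------
+    An illustrative handler (``OrderModel`` is a placeholder model, not shipped with Powertools)::
+
+        from aws_lambda_powertools.utilities.parser import envelopes, event_parser
+
+        @event_parser(model=OrderModel, envelope=envelopes.KinesisFirehoseEnvelope)
+        def handler(records, context):
+            ...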
+ """
+
+ def parse(self, data: Optional[Union[Dict[str, Any], Any]], model: Type[Model]) -> List[Optional[Model]]:
+ """Parses records found with model provided
+
+ Parameters
+ ----------
+ data : Dict
+ Lambda event to be parsed
+ model : Type[Model]
+ Data model provided to parse after extracting data using envelope
+
+ Returns
+ -------
+ List
+ List of records parsed with model provided
+ """
+ logger.debug(f"Parsing incoming data with Kinesis Firehose model {KinesisFirehoseModel}")
+ parsed_envelope: KinesisFirehoseModel = KinesisFirehoseModel.parse_obj(data)
+ logger.debug(f"Parsing Kinesis Firehose records in `body` with {model}")
+ models = []
+ for record in parsed_envelope.records:
+ # We allow either AWS expected contract (bytes) or a custom Model, see #943
+ data = cast(bytes, record.data)
+ models.append(self._parse(data=data.decode("utf-8"), model=model))
+ return models
diff --git a/aws_lambda_powertools/utilities/parser/models/__init__.py b/aws_lambda_powertools/utilities/parser/models/__init__.py
index 52059cb9ee7..62e28a62374 100644
--- a/aws_lambda_powertools/utilities/parser/models/__init__.py
+++ b/aws_lambda_powertools/utilities/parser/models/__init__.py
@@ -37,6 +37,11 @@
KinesisDataStreamRecord,
KinesisDataStreamRecordPayload,
)
+from .kinesis_firehose import (
+ KinesisFirehoseModel,
+ KinesisFirehoseRecord,
+ KinesisFirehoseRecordMetadata,
+)
from .lambda_function_url import LambdaFunctionUrlModel
from .s3 import S3Model, S3RecordModel
from .s3_object_event import (
@@ -86,6 +91,9 @@
"KinesisDataStreamModel",
"KinesisDataStreamRecord",
"KinesisDataStreamRecordPayload",
+ "KinesisFirehoseModel",
+ "KinesisFirehoseRecord",
+ "KinesisFirehoseRecordMetadata",
"LambdaFunctionUrlModel",
"S3Model",
"S3RecordModel",
diff --git a/aws_lambda_powertools/utilities/parser/models/kinesis.py b/aws_lambda_powertools/utilities/parser/models/kinesis.py
index be868ca44ba..ffc89bcbdaa 100644
--- a/aws_lambda_powertools/utilities/parser/models/kinesis.py
+++ b/aws_lambda_powertools/utilities/parser/models/kinesis.py
@@ -1,14 +1,10 @@
-import base64
-import logging
-from binascii import Error as BinAsciiError
from typing import List, Type, Union
from pydantic import BaseModel, validator
+from aws_lambda_powertools.shared.functions import base64_decode
from aws_lambda_powertools.utilities.parser.types import Literal
-logger = logging.getLogger(__name__)
-
class KinesisDataStreamRecordPayload(BaseModel):
kinesisSchemaVersion: str
@@ -19,11 +15,7 @@ class KinesisDataStreamRecordPayload(BaseModel):
@validator("data", pre=True, allow_reuse=True)
def data_base64_decode(cls, value):
- try:
- logger.debug("Decoding base64 Kinesis data record before parsing")
- return base64.b64decode(value)
- except (BinAsciiError, TypeError):
- raise ValueError("base64 decode failed")
+ return base64_decode(value)
class KinesisDataStreamRecord(BaseModel):
diff --git a/aws_lambda_powertools/utilities/parser/models/kinesis_firehose.py b/aws_lambda_powertools/utilities/parser/models/kinesis_firehose.py
new file mode 100644
index 00000000000..c59d8c680e5
--- /dev/null
+++ b/aws_lambda_powertools/utilities/parser/models/kinesis_firehose.py
@@ -0,0 +1,32 @@
+from typing import List, Optional, Type, Union
+
+from pydantic import BaseModel, PositiveInt, validator
+
+from aws_lambda_powertools.shared.functions import base64_decode
+
+
+class KinesisFirehoseRecordMetadata(BaseModel):
+ shardId: str
+ partitionKey: str
+ approximateArrivalTimestamp: PositiveInt
+ sequenceNumber: str
+ subsequenceNumber: str
+
+
+class KinesisFirehoseRecord(BaseModel):
+ data: Union[bytes, Type[BaseModel]] # base64 encoded str is parsed into bytes
+ recordId: str
+ approximateArrivalTimestamp: PositiveInt
+ kinesisRecordMetadata: Optional[KinesisFirehoseRecordMetadata]
+
+ @validator("data", pre=True, allow_reuse=True)
+ def data_base64_decode(cls, value):
+ return base64_decode(value)
+
+
+class KinesisFirehoseModel(BaseModel):
+ invocationId: str
+ deliveryStreamArn: str
+ region: str
+ sourceKinesisStreamArn: Optional[str]
+ records: List[KinesisFirehoseRecord]
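+
+
+# Illustrative usage only (``event_dict`` is a placeholder for a Kinesis Firehose -> Lambda payload):
+#
+#   parsed = KinesisFirehoseModel.parse_obj(event_dict)
+#   raw = parsed.records[0].data  # bytes, base64-decoded by the ``data_base64_decode`` validator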
diff --git a/docs/index.md b/docs/index.md
index f831cf1d620..a0b7d7f1bb5 100644
--- a/docs/index.md
+++ b/docs/index.md
@@ -14,7 +14,7 @@ A suite of utilities for AWS Lambda functions to ease adopting best practices su
Powertools is available in the following formats:
-* **Lambda Layer**: [**arn:aws:lambda:{region}:017000801446:layer:AWSLambdaPowertoolsPython:36**](#){: .copyMe}:clipboard:
+* **Lambda Layer**: [**arn:aws:lambda:{region}:017000801446:layer:AWSLambdaPowertoolsPython:37**](#){: .copyMe}:clipboard:
* **PyPi**: **`pip install aws-lambda-powertools`**
???+ hint "Support this project by using Lambda Layers :heart:"
@@ -32,28 +32,28 @@ You can include Lambda Powertools Lambda Layer using [AWS Lambda Console](https:
| Region | Layer ARN |
| ---------------- | -------------------------------------------------------------------------------------------------------- |
- | `af-south-1` | [arn:aws:lambda:af-south-1:017000801446:layer:AWSLambdaPowertoolsPython:36](#){: .copyMe}:clipboard: |
- | `ap-east-1` | [arn:aws:lambda:ap-east-1:017000801446:layer:AWSLambdaPowertoolsPython:36](#){: .copyMe}:clipboard: |
- | `ap-northeast-1` | [arn:aws:lambda:ap-northeast-1:017000801446:layer:AWSLambdaPowertoolsPython:36](#){: .copyMe}:clipboard: |
- | `ap-northeast-2` | [arn:aws:lambda:ap-northeast-2:017000801446:layer:AWSLambdaPowertoolsPython:36](#){: .copyMe}:clipboard: |
- | `ap-northeast-3` | [arn:aws:lambda:ap-northeast-3:017000801446:layer:AWSLambdaPowertoolsPython:36](#){: .copyMe}:clipboard: |
- | `ap-south-1` | [arn:aws:lambda:ap-south-1:017000801446:layer:AWSLambdaPowertoolsPython:36](#){: .copyMe}:clipboard: |
- | `ap-southeast-1` | [arn:aws:lambda:ap-southeast-1:017000801446:layer:AWSLambdaPowertoolsPython:36](#){: .copyMe}:clipboard: |
- | `ap-southeast-2` | [arn:aws:lambda:ap-southeast-2:017000801446:layer:AWSLambdaPowertoolsPython:36](#){: .copyMe}:clipboard: |
- | `ap-southeast-3` | [arn:aws:lambda:ap-southeast-3:017000801446:layer:AWSLambdaPowertoolsPython:36](#){: .copyMe}:clipboard: |
- | `ca-central-1` | [arn:aws:lambda:ca-central-1:017000801446:layer:AWSLambdaPowertoolsPython:36](#){: .copyMe}:clipboard: |
- | `eu-central-1` | [arn:aws:lambda:eu-central-1:017000801446:layer:AWSLambdaPowertoolsPython:36](#){: .copyMe}:clipboard: |
- | `eu-north-1` | [arn:aws:lambda:eu-north-1:017000801446:layer:AWSLambdaPowertoolsPython:36](#){: .copyMe}:clipboard: |
- | `eu-south-1` | [arn:aws:lambda:eu-south-1:017000801446:layer:AWSLambdaPowertoolsPython:36](#){: .copyMe}:clipboard: |
- | `eu-west-1` | [arn:aws:lambda:eu-west-1:017000801446:layer:AWSLambdaPowertoolsPython:36](#){: .copyMe}:clipboard: |
- | `eu-west-2` | [arn:aws:lambda:eu-west-2:017000801446:layer:AWSLambdaPowertoolsPython:36](#){: .copyMe}:clipboard: |
- | `eu-west-3` | [arn:aws:lambda:eu-west-3:017000801446:layer:AWSLambdaPowertoolsPython:36](#){: .copyMe}:clipboard: |
- | `me-south-1` | [arn:aws:lambda:me-south-1:017000801446:layer:AWSLambdaPowertoolsPython:36](#){: .copyMe}:clipboard: |
- | `sa-east-1` | [arn:aws:lambda:sa-east-1:017000801446:layer:AWSLambdaPowertoolsPython:36](#){: .copyMe}:clipboard: |
- | `us-east-1` | [arn:aws:lambda:us-east-1:017000801446:layer:AWSLambdaPowertoolsPython:36](#){: .copyMe}:clipboard: |
- | `us-east-2` | [arn:aws:lambda:us-east-2:017000801446:layer:AWSLambdaPowertoolsPython:36](#){: .copyMe}:clipboard: |
- | `us-west-1` | [arn:aws:lambda:us-west-1:017000801446:layer:AWSLambdaPowertoolsPython:36](#){: .copyMe}:clipboard: |
- | `us-west-2` | [arn:aws:lambda:us-west-2:017000801446:layer:AWSLambdaPowertoolsPython:36](#){: .copyMe}:clipboard: |
+ | `af-south-1` | [arn:aws:lambda:af-south-1:017000801446:layer:AWSLambdaPowertoolsPython:37](#){: .copyMe}:clipboard: |
+ | `ap-east-1` | [arn:aws:lambda:ap-east-1:017000801446:layer:AWSLambdaPowertoolsPython:37](#){: .copyMe}:clipboard: |
+ | `ap-northeast-1` | [arn:aws:lambda:ap-northeast-1:017000801446:layer:AWSLambdaPowertoolsPython:37](#){: .copyMe}:clipboard: |
+ | `ap-northeast-2` | [arn:aws:lambda:ap-northeast-2:017000801446:layer:AWSLambdaPowertoolsPython:37](#){: .copyMe}:clipboard: |
+ | `ap-northeast-3` | [arn:aws:lambda:ap-northeast-3:017000801446:layer:AWSLambdaPowertoolsPython:37](#){: .copyMe}:clipboard: |
+ | `ap-south-1` | [arn:aws:lambda:ap-south-1:017000801446:layer:AWSLambdaPowertoolsPython:37](#){: .copyMe}:clipboard: |
+ | `ap-southeast-1` | [arn:aws:lambda:ap-southeast-1:017000801446:layer:AWSLambdaPowertoolsPython:37](#){: .copyMe}:clipboard: |
+ | `ap-southeast-2` | [arn:aws:lambda:ap-southeast-2:017000801446:layer:AWSLambdaPowertoolsPython:37](#){: .copyMe}:clipboard: |
+ | `ap-southeast-3` | [arn:aws:lambda:ap-southeast-3:017000801446:layer:AWSLambdaPowertoolsPython:37](#){: .copyMe}:clipboard: |
+ | `ca-central-1` | [arn:aws:lambda:ca-central-1:017000801446:layer:AWSLambdaPowertoolsPython:37](#){: .copyMe}:clipboard: |
+ | `eu-central-1` | [arn:aws:lambda:eu-central-1:017000801446:layer:AWSLambdaPowertoolsPython:37](#){: .copyMe}:clipboard: |
+ | `eu-north-1` | [arn:aws:lambda:eu-north-1:017000801446:layer:AWSLambdaPowertoolsPython:37](#){: .copyMe}:clipboard: |
+ | `eu-south-1` | [arn:aws:lambda:eu-south-1:017000801446:layer:AWSLambdaPowertoolsPython:37](#){: .copyMe}:clipboard: |
+ | `eu-west-1` | [arn:aws:lambda:eu-west-1:017000801446:layer:AWSLambdaPowertoolsPython:37](#){: .copyMe}:clipboard: |
+ | `eu-west-2` | [arn:aws:lambda:eu-west-2:017000801446:layer:AWSLambdaPowertoolsPython:37](#){: .copyMe}:clipboard: |
+ | `eu-west-3` | [arn:aws:lambda:eu-west-3:017000801446:layer:AWSLambdaPowertoolsPython:37](#){: .copyMe}:clipboard: |
+ | `me-south-1` | [arn:aws:lambda:me-south-1:017000801446:layer:AWSLambdaPowertoolsPython:37](#){: .copyMe}:clipboard: |
+ | `sa-east-1` | [arn:aws:lambda:sa-east-1:017000801446:layer:AWSLambdaPowertoolsPython:37](#){: .copyMe}:clipboard: |
+ | `us-east-1` | [arn:aws:lambda:us-east-1:017000801446:layer:AWSLambdaPowertoolsPython:37](#){: .copyMe}:clipboard: |
+ | `us-east-2` | [arn:aws:lambda:us-east-2:017000801446:layer:AWSLambdaPowertoolsPython:37](#){: .copyMe}:clipboard: |
+ | `us-west-1` | [arn:aws:lambda:us-west-1:017000801446:layer:AWSLambdaPowertoolsPython:37](#){: .copyMe}:clipboard: |
+ | `us-west-2` | [arn:aws:lambda:us-west-2:017000801446:layer:AWSLambdaPowertoolsPython:37](#){: .copyMe}:clipboard: |
??? question "Can't find our Lambda Layer for your preferred AWS region?"
You can use [Serverless Application Repository (SAR)](#sar) method, our [CDK Layer Construct](https://github.com/aws-samples/cdk-lambda-powertools-python-layer){target="_blank"}, or PyPi like you normally would for any other library.
@@ -67,7 +67,7 @@ You can include Lambda Powertools Lambda Layer using [AWS Lambda Console](https:
Type: AWS::Serverless::Function
Properties:
Layers:
- - !Sub arn:aws:lambda:${AWS::Region}:017000801446:layer:AWSLambdaPowertoolsPython:36
+ - !Sub arn:aws:lambda:${AWS::Region}:017000801446:layer:AWSLambdaPowertoolsPython:37
```
=== "Serverless framework"
@@ -77,7 +77,7 @@ You can include Lambda Powertools Lambda Layer using [AWS Lambda Console](https:
hello:
handler: lambda_function.lambda_handler
layers:
- - arn:aws:lambda:${aws:region}:017000801446:layer:AWSLambdaPowertoolsPython:36
+ - arn:aws:lambda:${aws:region}:017000801446:layer:AWSLambdaPowertoolsPython:37
```
=== "CDK"
@@ -93,7 +93,7 @@ You can include Lambda Powertools Lambda Layer using [AWS Lambda Console](https:
powertools_layer = aws_lambda.LayerVersion.from_layer_version_arn(
self,
id="lambda-powertools",
- layer_version_arn=f"arn:aws:lambda:{env.region}:017000801446:layer:AWSLambdaPowertoolsPython:36"
+ layer_version_arn=f"arn:aws:lambda:{env.region}:017000801446:layer:AWSLambdaPowertoolsPython:37"
)
aws_lambda.Function(self,
'sample-app-lambda',
@@ -142,7 +142,7 @@ You can include Lambda Powertools Lambda Layer using [AWS Lambda Console](https:
role = aws_iam_role.iam_for_lambda.arn
handler = "index.test"
runtime = "python3.9"
- layers = ["arn:aws:lambda:{region}:017000801446:layer:AWSLambdaPowertoolsPython:36"]
+ layers = ["arn:aws:lambda:{region}:017000801446:layer:AWSLambdaPowertoolsPython:37"]
source_code_hash = filebase64sha256("lambda_function_payload.zip")
}
@@ -161,7 +161,7 @@ You can include Lambda Powertools Lambda Layer using [AWS Lambda Console](https:
? Do you want to configure advanced settings? Yes
...
? Do you want to enable Lambda layers for this function? Yes
- ? Enter up to 5 existing Lambda layer ARNs (comma-separated): arn:aws:lambda:eu-central-1:017000801446:layer:AWSLambdaPowertoolsPython:36
+ ? Enter up to 5 existing Lambda layer ARNs (comma-separated): arn:aws:lambda:eu-central-1:017000801446:layer:AWSLambdaPowertoolsPython:37
❯ amplify push -y
@@ -172,7 +172,7 @@ You can include Lambda Powertools Lambda Layer using [AWS Lambda Console](https:
- Name:
? Which setting do you want to update? Lambda layers configuration
? Do you want to enable Lambda layers for this function? Yes
- ? Enter up to 5 existing Lambda layer ARNs (comma-separated): arn:aws:lambda:eu-central-1:017000801446:layer:AWSLambdaPowertoolsPython:36
+ ? Enter up to 5 existing Lambda layer ARNs (comma-separated): arn:aws:lambda:eu-central-1:017000801446:layer:AWSLambdaPowertoolsPython:37
? Do you want to edit the local lambda function now? No
```
@@ -180,7 +180,7 @@ You can include Lambda Powertools Lambda Layer using [AWS Lambda Console](https:
Change {region} to your AWS region, e.g. `eu-west-1`
```bash title="AWS CLI"
- aws lambda get-layer-version-by-arn --arn arn:aws:lambda:{region}:017000801446:layer:AWSLambdaPowertoolsPython:36 --region {region}
+ aws lambda get-layer-version-by-arn --arn arn:aws:lambda:{region}:017000801446:layer:AWSLambdaPowertoolsPython:37 --region {region}
```
The pre-signed URL to download this Lambda Layer will be within `Location` key.
@@ -218,7 +218,7 @@ If using SAM, you can include this SAR App as part of your shared Layers stack,
Properties:
Location:
ApplicationId: arn:aws:serverlessrepo:eu-west-1:057560766410:applications/aws-lambda-powertools-python-layer
- SemanticVersion: 1.26.3 # change to latest semantic version available in SAR
+ SemanticVersion: 1.30.0 # change to latest semantic version available in SAR
MyLambdaFunction:
Type: AWS::Serverless::Function
@@ -246,7 +246,7 @@ If using SAM, you can include this SAR App as part of your shared Layers stack,
Location:
ApplicationId: arn:aws:serverlessrepo:eu-west-1:057560766410:applications/aws-lambda-powertools-python-layer
# Find latest from github.com/awslabs/aws-lambda-powertools-python/releases
- SemanticVersion: 1.26.3
+ SemanticVersion: 1.30.0
```
=== "CDK"
@@ -256,7 +256,7 @@ If using SAM, you can include this SAR App as part of your shared Layers stack,
POWERTOOLS_BASE_NAME = 'AWSLambdaPowertools'
# Find latest from github.com/awslabs/aws-lambda-powertools-python/releases
- POWERTOOLS_VER = '1.26.3'
+ POWERTOOLS_VER = '1.30.0'
POWERTOOLS_ARN = 'arn:aws:serverlessrepo:eu-west-1:057560766410:applications/aws-lambda-powertools-python-layer'
class SampleApp(core.Construct):
@@ -320,7 +320,7 @@ If using SAM, you can include this SAR App as part of your shared Layers stack,
variable "aws_powertools_version" {
type = string
- default = "1.26.3"
+ default = "1.30.0"
description = "The AWS Powertools release version"
}
@@ -462,11 +462,11 @@ Whether you're prototyping locally or against a non-production environment, you
When `POWERTOOLS_DEV` is set to a truthy value (`1`, `true`), it'll have the following effects:
-| Utility | Effect |
-| ----------------- | ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
-| **Logger** | Increase JSON indentation to 4. This will ease local debugging when running functions locally under emulators or direct calls while not affecting unit tests |
+| Utility | Effect |
+| ----------------- | ----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
+| **Logger** | Increase JSON indentation to 4. This will ease local debugging when running functions locally under emulators or direct calls while not affecting unit tests |
| **Event Handler** | Enable full traceback errors in the response, indent request/responses, and CORS in dev mode (`*`). This will deprecate [`POWERTOOLS_EVENT_HANDLER_DEBUG`](https://awslabs.github.io/aws-lambda-powertools-python/latest/core/event_handler/api_gateway/#debug-mode) in the future. |
-| **Tracer** | Future-proof safety to disables tracing operations in non-Lambda environments. This already happens automatically in the Tracer utility. |
+| **Tracer**        | Future-proof safety to disable tracing operations in non-Lambda environments. This already happens automatically in the Tracer utility.                                                                                                                             |
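+
+A quick, illustrative way to see the **Logger** effect locally (the service name below is arbitrary):
+
+```python
+import os
+
+os.environ["POWERTOOLS_DEV"] = "1"  # set before the Logger is created
+
+from aws_lambda_powertools import Logger
+
+logger = Logger(service="example")
+logger.info("pretty-printed JSON (indent=4) while POWERTOOLS_DEV is truthy")
+```
+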
## Debug mode
diff --git a/docs/utilities/parser.md b/docs/utilities/parser.md
index cdcb949d28a..48c244c8df2 100644
--- a/docs/utilities/parser.md
+++ b/docs/utilities/parser.md
@@ -163,6 +163,7 @@ Parser comes with the following built-in models:
| **S3Model** | Lambda Event Source payload for Amazon S3 |
| **S3ObjectLambdaEvent** | Lambda Event Source payload for Amazon S3 Object Lambda |
| **KinesisDataStreamModel** | Lambda Event Source payload for Amazon Kinesis Data Streams |
+| **KinesisFirehoseModel** | Lambda Event Source payload for Amazon Kinesis Firehose |
| **SesModel** | Lambda Event Source payload for Amazon Simple Email Service |
| **SnsModel** | Lambda Event Source payload for Amazon Simple Notification Service |
| **APIGatewayProxyEventModel** | Lambda Event Source payload for Amazon API Gateway |
@@ -319,6 +320,7 @@ Parser comes with the following built-in envelopes, where `Model` in the return
| **SqsEnvelope**               | 1. Parses data using `SqsModel`. <br/> 2. Parses records in `body` key using your model and return them in a list.                                                          | `List[Model]` |
| **CloudWatchLogsEnvelope**    | 1. Parses data using `CloudwatchLogsModel` which will base64 decode and decompress it. <br/> 2. Parses records in `message` key using your model and return them in a list. | `List[Model]` |
| **KinesisDataStreamEnvelope** | 1. Parses data using `KinesisDataStreamModel` which will base64 decode it. <br/> 2. Parses records in `Records` key using your model and returns them in a list.            | `List[Model]` |
+| **KinesisFirehoseEnvelope**   | 1. Parses data using `KinesisFirehoseModel` which will base64 decode it. <br/> 2. Parses records in `Records` key using your model and returns them in a list.              | `List[Model]` |
| **SnsEnvelope**               | 1. Parses data using `SnsModel`. <br/> 2. Parses records in `body` key using your model and return them in a list.                                                          | `List[Model]` |
| **SnsSqsEnvelope**            | 1. Parses data using `SqsModel`. <br/> 2. Parses SNS records in `body` key using `SnsNotificationModel`. <br/> 3. Parses data in `Message` key using your model and return them in a list. | `List[Model]` |
| **ApiGatewayEnvelope**        | 1. Parses data using `APIGatewayProxyEventModel`. <br/> 2. Parses `body` key using your model and returns it.                                                               | `Model` |
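+
+A minimal sketch of the new envelope in use (the `Order` model and handler are illustrative, not part of Powertools):
+
+```python
+from typing import List
+
+from pydantic import BaseModel
+
+from aws_lambda_powertools.utilities.parser import envelopes, event_parser
+from aws_lambda_powertools.utilities.typing import LambdaContext
+
+
+class Order(BaseModel):
+    id: int
+    description: str
+
+
+@event_parser(model=Order, envelope=envelopes.KinesisFirehoseEnvelope)
+def handler(orders: List[Order], context: LambdaContext):
+    for order in orders:  # each record's data was base64 decoded, then parsed into an Order
+        ...
+```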
diff --git a/layer/layer/canary_stack.py b/layer/layer/canary_stack.py
index 426b3a4c87c..1f903f91c74 100644
--- a/layer/layer/canary_stack.py
+++ b/layer/layer/canary_stack.py
@@ -37,10 +37,6 @@ def __init__(
ManagedPolicy.from_aws_managed_policy_name("service-role/AWSLambdaBasicExecutionRole")
)
- execution_role.add_to_policy(
- PolicyStatement(effect=Effect.ALLOW, actions=["lambda:GetFunction"], resources=["*"])
- )
-
canary_lambda = Function(
self,
"CanaryLambdaFunction",
diff --git a/poetry.lock b/poetry.lock
index e2ef496c5dc..8f13157b12b 100644
--- a/poetry.lock
+++ b/poetry.lock
@@ -1295,7 +1295,7 @@ python-versions = ">=3.6"
[[package]]
name = "types-requests"
-version = "2.28.11.1"
+version = "2.28.11.2"
description = "Typing stubs for requests"
category = "dev"
optional = false
@@ -1314,7 +1314,7 @@ python-versions = "*"
[[package]]
name = "typing-extensions"
-version = "4.3.0"
+version = "4.4.0"
description = "Backported and Experimental Type Hints for Python 3.7+"
category = "main"
optional = false
@@ -1383,7 +1383,7 @@ pydantic = ["pydantic", "email-validator"]
[metadata]
lock-version = "1.1"
python-versions = "^3.6.2"
-content-hash = "bfa6597ca1a4b8e1199f55f2404e7caee56d30274072624165533b39f726e0a2"
+content-hash = "b6eba8ccb0bd0673dec8656d0fafa5aac520761f92cc152798c41883e3c92dca"
[metadata.files]
atomicwrites = [
@@ -2129,16 +2129,16 @@ typed-ast = [
{file = "typed_ast-1.5.4.tar.gz", hash = "sha256:39e21ceb7388e4bb37f4c679d72707ed46c2fbf2a5609b8b8ebc4b067d977df2"},
]
types-requests = [
- {file = "types-requests-2.28.11.1.tar.gz", hash = "sha256:02b1806c5b9904edcd87fa29236164aea0e6cdc4d93ea020cd615ef65cb43d65"},
- {file = "types_requests-2.28.11.1-py3-none-any.whl", hash = "sha256:1ff2c1301f6fe58b5d1c66cdf631ca19734cb3b1a4bbadc878d75557d183291a"},
+ {file = "types-requests-2.28.11.2.tar.gz", hash = "sha256:fdcd7bd148139fb8eef72cf4a41ac7273872cad9e6ada14b11ff5dfdeee60ed3"},
+ {file = "types_requests-2.28.11.2-py3-none-any.whl", hash = "sha256:14941f8023a80b16441b3b46caffcbfce5265fd14555844d6029697824b5a2ef"},
]
types-urllib3 = [
{file = "types-urllib3-1.26.24.tar.gz", hash = "sha256:a1b3aaea7dda3eb1b51699ee723aadd235488e4dc4648e030f09bc429ecff42f"},
{file = "types_urllib3-1.26.24-py3-none-any.whl", hash = "sha256:cf7918503d02d3576e503bbfb419b0e047c4617653bba09624756ab7175e15c9"},
]
typing-extensions = [
- {file = "typing_extensions-4.3.0-py3-none-any.whl", hash = "sha256:25642c956049920a5aa49edcdd6ab1e06d7e5d467fc00e0506c44ac86fbfca02"},
- {file = "typing_extensions-4.3.0.tar.gz", hash = "sha256:e6d2677a32f47fc7eb2795db1dd15c1f34eff616bcaf2cfb5e997f854fa1c4a6"},
+ {file = "typing_extensions-4.4.0-py3-none-any.whl", hash = "sha256:16fa4864408f655d35ec496218b85f79b3437c829e93320c7c9215ccfd92489e"},
+ {file = "typing_extensions-4.4.0.tar.gz", hash = "sha256:1511434bb92bf8dd198c12b1cc812e800d4181cfcb867674e0f8279cc93087aa"},
]
urllib3 = [
{file = "urllib3-1.26.12-py2.py3-none-any.whl", hash = "sha256:b930dd878d5a8afb066a637fbb35144fe7901e3b209d1cd4f524bd0e9deee997"},
diff --git a/pyproject.toml b/pyproject.toml
index a63f9360311..321b42e4248 100644
--- a/pyproject.toml
+++ b/pyproject.toml
@@ -67,7 +67,7 @@ mypy-boto3-ssm = { version = "^1.24.81", python = ">=3.7" }
mypy-boto3-s3 = { version = "^1.24.76", python = ">=3.7" }
mypy-boto3-xray = { version = "^1.24.0", python = ">=3.7" }
types-requests = "^2.28.11"
-typing-extensions = { version = "^4.3.0", python = ">=3.7" }
+typing-extensions = { version = "^4.4.0", python = ">=3.7" }
python-snappy = "^0.6.1"
mkdocs-material = { version = "^8.5.4", python = ">=3.7" }
filelock = { version = "^3.8.0", python = ">=3.7" }
diff --git a/tests/events/kinesisFirehoseKinesisEvent.json b/tests/events/kinesisFirehoseKinesisEvent.json
new file mode 100644
index 00000000000..5120dd57ccb
--- /dev/null
+++ b/tests/events/kinesisFirehoseKinesisEvent.json
@@ -0,0 +1,32 @@
+{
+ "invocationId": "2b4d1ad9-2f48-94bd-a088-767c317e994a",
+ "sourceKinesisStreamArn":"arn:aws:kinesis:us-east-1:123456789012:stream/kinesis-source",
+ "deliveryStreamArn": "arn:aws:firehose:us-east-2:123456789012:deliverystream/delivery-stream-name",
+ "region": "us-east-2",
+ "records": [
+ {
+ "data": "SGVsbG8gV29ybGQ=",
+ "recordId": "record1",
+ "approximateArrivalTimestamp": 1664028820148,
+ "kinesisRecordMetadata": {
+ "shardId": "shardId-000000000000",
+ "partitionKey": "4d1ad2b9-24f8-4b9d-a088-76e9947c317a",
+ "approximateArrivalTimestamp": 1664028820148,
+ "sequenceNumber": "49546986683135544286507457936321625675700192471156785154",
+ "subsequenceNumber": ""
+ }
+ },
+ {
+ "data": "eyJIZWxsbyI6ICJXb3JsZCJ9",
+ "recordId": "record2",
+ "approximateArrivalTimestamp": 1664028793294,
+ "kinesisRecordMetadata": {
+ "shardId": "shardId-000000000001",
+ "partitionKey": "4d1ad2b9-24f8-4b9d-a088-76e9947c318a",
+ "approximateArrivalTimestamp": 1664028793294,
+ "sequenceNumber": "49546986683135544286507457936321625675700192471156785155",
+ "subsequenceNumber": ""
+ }
+ }
+ ]
+}
diff --git a/tests/events/kinesisFirehosePutEvent.json b/tests/events/kinesisFirehosePutEvent.json
new file mode 100644
index 00000000000..27aeddd80eb
--- /dev/null
+++ b/tests/events/kinesisFirehosePutEvent.json
@@ -0,0 +1,17 @@
+{
+ "invocationId": "2b4d1ad9-2f48-94bd-a088-767c317e994a",
+ "deliveryStreamArn": "arn:aws:firehose:us-east-2:123456789012:deliverystream/delivery-stream-name",
+ "region": "us-east-2",
+ "records":[
+ {
+ "recordId":"record1",
+ "approximateArrivalTimestamp":1664029185290,
+ "data":"SGVsbG8gV29ybGQ="
+ },
+ {
+ "recordId":"record2",
+ "approximateArrivalTimestamp":1664029186945,
+ "data":"eyJIZWxsbyI6ICJXb3JsZCJ9"
+ }
+ ]
+ }
diff --git a/tests/functional/parser/schemas.py b/tests/functional/parser/schemas.py
index b1b66c63379..907deb40aa0 100644
--- a/tests/functional/parser/schemas.py
+++ b/tests/functional/parser/schemas.py
@@ -95,3 +95,7 @@ class MyALambdaFuncUrlBusiness(BaseModel):
class MyLambdaKafkaBusiness(BaseModel):
key: str
+
+
+class MyKinesisFirehoseBusiness(BaseModel):
+ Hello: str
diff --git a/tests/functional/parser/test_kinesis_firehose.py b/tests/functional/parser/test_kinesis_firehose.py
new file mode 100644
index 00000000000..59bbd2f4e18
--- /dev/null
+++ b/tests/functional/parser/test_kinesis_firehose.py
@@ -0,0 +1,114 @@
+from typing import List
+
+import pytest
+
+from aws_lambda_powertools.utilities.parser import (
+ ValidationError,
+ envelopes,
+ event_parser,
+)
+from aws_lambda_powertools.utilities.parser.models import (
+ KinesisFirehoseModel,
+ KinesisFirehoseRecord,
+ KinesisFirehoseRecordMetadata,
+)
+from aws_lambda_powertools.utilities.typing import LambdaContext
+from tests.functional.parser.schemas import MyKinesisFirehoseBusiness
+from tests.functional.utils import load_event
+
+
+@event_parser(model=MyKinesisFirehoseBusiness, envelope=envelopes.KinesisFirehoseEnvelope)
+def handle_firehose(event: List[MyKinesisFirehoseBusiness], _: LambdaContext):
+ assert len(event) == 1
+ assert event[0].Hello == "World"
+
+
+@event_parser(model=KinesisFirehoseModel)
+def handle_firehose_no_envelope_kinesis(event: KinesisFirehoseModel, _: LambdaContext):
+ assert event.region == "us-east-2"
+ assert event.invocationId == "2b4d1ad9-2f48-94bd-a088-767c317e994a"
+ assert event.deliveryStreamArn == "arn:aws:firehose:us-east-2:123456789012:deliverystream/delivery-stream-name"
+ assert event.sourceKinesisStreamArn == "arn:aws:kinesis:us-east-1:123456789012:stream/kinesis-source"
+
+ records = list(event.records)
+ assert len(records) == 2
+ record_01: KinesisFirehoseRecord = records[0]
+ assert record_01.approximateArrivalTimestamp == 1664028820148
+ assert record_01.recordId == "record1"
+ assert record_01.data == b"Hello World"
+
+ metadata_01: KinesisFirehoseRecordMetadata = record_01.kinesisRecordMetadata
+ assert metadata_01.partitionKey == "4d1ad2b9-24f8-4b9d-a088-76e9947c317a"
+ assert metadata_01.subsequenceNumber == ""
+ assert metadata_01.shardId == "shardId-000000000000"
+ assert metadata_01.approximateArrivalTimestamp == 1664028820148
+ assert metadata_01.sequenceNumber == "49546986683135544286507457936321625675700192471156785154"
+
+ record_02: KinesisFirehoseRecord = records[1]
+ assert record_02.approximateArrivalTimestamp == 1664028793294
+ assert record_02.recordId == "record2"
+ assert record_02.data == b'{"Hello": "World"}'
+
+ metadata_02: KinesisFirehoseRecordMetadata = record_02.kinesisRecordMetadata
+ assert metadata_02.partitionKey == "4d1ad2b9-24f8-4b9d-a088-76e9947c318a"
+ assert metadata_02.subsequenceNumber == ""
+ assert metadata_02.shardId == "shardId-000000000001"
+ assert metadata_02.approximateArrivalTimestamp == 1664028793294
+ assert metadata_02.sequenceNumber == "49546986683135544286507457936321625675700192471156785155"
+
+
+@event_parser(model=KinesisFirehoseModel)
+def handle_firehose_no_envelope_put(event: KinesisFirehoseModel, _: LambdaContext):
+ assert event.region == "us-east-2"
+ assert event.invocationId == "2b4d1ad9-2f48-94bd-a088-767c317e994a"
+ assert event.deliveryStreamArn == "arn:aws:firehose:us-east-2:123456789012:deliverystream/delivery-stream-name"
+
+ records = list(event.records)
+ assert len(records) == 2
+
+ record_01: KinesisFirehoseRecord = records[0]
+ assert record_01.approximateArrivalTimestamp == 1664029185290
+ assert record_01.recordId == "record1"
+ assert record_01.data == b"Hello World"
+
+ record_02: KinesisFirehoseRecord = records[1]
+ assert record_02.approximateArrivalTimestamp == 1664029186945
+ assert record_02.recordId == "record2"
+ assert record_02.data == b'{"Hello": "World"}'
+
+
+def test_firehose_trigger_event():
+ event_dict = load_event("kinesisFirehoseKinesisEvent.json")
+    event_dict["records"].pop(0)  # remove the first record: its payload is plain bytes, and we want to test the JSON payload model
+ handle_firehose(event_dict, LambdaContext())
+
+
+def test_firehose_trigger_event_kinesis_no_envelope():
+ event_dict = load_event("kinesisFirehoseKinesisEvent.json")
+ handle_firehose_no_envelope_kinesis(event_dict, LambdaContext())
+
+
+def test_firehose_trigger_event_put_no_envelope():
+ event_dict = load_event("kinesisFirehosePutEvent.json")
+ handle_firehose_no_envelope_put(event_dict, LambdaContext())
+
+
+def test_kinesis_trigger_bad_base64_event():
+ event_dict = load_event("kinesisFirehoseKinesisEvent.json")
+ event_dict["records"][0]["data"] = {"bad base64"}
+ with pytest.raises(ValidationError):
+ handle_firehose_no_envelope_kinesis(event_dict, LambdaContext())
+
+
+def test_kinesis_trigger_bad_timestamp_event():
+ event_dict = load_event("kinesisFirehoseKinesisEvent.json")
+ event_dict["records"][0]["approximateArrivalTimestamp"] = -1
+ with pytest.raises(ValidationError):
+ handle_firehose_no_envelope_kinesis(event_dict, LambdaContext())
+
+
+def test_kinesis_trigger_bad_metadata_timestamp_event():
+ event_dict = load_event("kinesisFirehoseKinesisEvent.json")
+ event_dict["records"][0]["kinesisRecordMetadata"]["approximateArrivalTimestamp"] = "-1"
+ with pytest.raises(ValidationError):
+ handle_firehose_no_envelope_kinesis(event_dict, LambdaContext())
diff --git a/tests/functional/test_metrics.py b/tests/functional/test_metrics.py
index e0ce7f84dc9..96dd3b41b25 100644
--- a/tests/functional/test_metrics.py
+++ b/tests/functional/test_metrics.py
@@ -898,3 +898,30 @@ def lambda_handler(evt, ctx):
# THEN we should have default dimensions in both outputs
assert "environment" in first_invocation
assert "environment" in second_invocation
+
+
+def test_metrics_reuse_dimension_set(metric, dimension, namespace):
+ # GIVEN Metrics is initialized with a metric and dimension
+ my_metrics = Metrics(namespace=namespace)
+ my_metrics.add_dimension(**dimension)
+ my_metrics.add_metric(**metric)
+
+ # WHEN Metrics is initialized one more time
+ my_metrics_2 = Metrics(namespace=namespace)
+
+ # THEN both class instances should have the same dimension set
+ assert my_metrics_2.dimension_set == my_metrics.dimension_set
+
+
+def test_metrics_reuse_metadata_set(metric, dimension, namespace):
+ # GIVEN Metrics is initialized with a metric, dimension, and metadata
+ my_metrics = Metrics(namespace=namespace)
+ my_metrics.add_dimension(**dimension)
+ my_metrics.add_metric(**metric)
+ my_metrics.add_metadata(key="meta", value="data")
+
+ # WHEN Metrics is initialized one more time
+ my_metrics_2 = Metrics(namespace=namespace)
+
+ # THEN both class instances should have the same metadata set
+ assert my_metrics_2.metadata_set == my_metrics.metadata_set