powertools-lambda-java's People

Contributors

alexeysoshin, bdkosher, cjb574, dependabot[bot], eldimi, heitorlessa, humanzz, jasoniharris, jdoherty, jeromevdl, jreijn, kozub, lgouger, machafer, michaelbrewer, mriccia, msailes, nem0-97, pankajagrawal16, poprahul, rb2010, rubenfonseca, scottgerring, skal111, stevehouel, sthulb, sullis, vitodegiosa, volomas, walmsles

powertools-lambda-java's Issues

Metrics are not flushed when lambda handler throws an exception

What were you trying to accomplish?
I am trying to publish metrics even when the Lambda handler fails. Apparently, metrics won't be flushed if the handler throws an exception: https://github.com/awslabs/aws-lambda-powertools-java/blob/master/powertools-metrics/src/main/java/software/amazon/lambda/powertools/metrics/internal/LambdaMetricsAspect.java#L62

Expected Behavior

Metrics are flushed regardless of the handler execution status

Current Behavior

Metrics are flushed only when the handler finishes successfully

Possible Solution

Something like the following snippet in the around method:

public Object around(ProceedingJoinPoint pjp,
                     Metrics metrics) throws Throwable {

    ...[omitted code]...

    try {
        Object proceed;
        try {
            proceed = pjp.proceed(proceedArgs);
        } finally {
            coldStartDone();
            validateBeforeFlushingMetrics(metrics);
            logger.flush();
        }
        return proceed;
    } finally {
        refreshMetricsContext();
    }
}

Steps to Reproduce (for bugs)

  1. Add the following class to src/test/java/software/amazon/lambda/powertools/metrics/handlers
package software.amazon.lambda.powertools.metrics.handlers;

import com.amazonaws.services.lambda.runtime.Context;
import com.amazonaws.services.lambda.runtime.RequestHandler;
import software.amazon.cloudwatchlogs.emf.logger.MetricsLogger;
import software.amazon.lambda.powertools.metrics.Metrics;

import static software.amazon.lambda.powertools.metrics.MetricsUtils.metricsLogger;

public class PowertoolsMetricsWithExceptionInHandler implements RequestHandler<Object, Object> {

    @Override
    @Metrics(namespace = "ExampleApplication", service = "booking")
    public Object handleRequest(Object input, Context context) {
        MetricsLogger metricsLogger = metricsLogger();
        metricsLogger.putMetric("CoolMetric", 1);
        throw new IllegalStateException("Whoops, unexpected exception");
    }
}
  2. Add the following test to software.amazon.lambda.powertools.metrics.internal.LambdaMetricsAspectTest
    @Test
    public void metricsPublishedEvenHandlerThrowsException() {
        requestHandler = new PowertoolsMetricsWithExceptionInHandler();
        try (MockedStatic<SystemWrapper> mocked = mockStatic(SystemWrapper.class)) {
            mocked.when(() -> SystemWrapper.getenv("AWS_EMF_ENVIRONMENT")).thenReturn("Lambda");
            assertThatExceptionOfType(IllegalStateException.class)
                .isThrownBy(() -> requestHandler.handleRequest("input", context))
                .withMessage("Whoops, unexpected exception");

            assertThat(out.toString())
                .satisfies(s -> {
                    Map<String, Object> logAsJson = readAsJson(s);
                    assertThat(logAsJson)
                        .containsEntry("CoolMetric", 1.0)
                        .containsEntry("Service", "booking")
                        .containsKey("_aws");
                });
        }
    }
  3. Run unit tests
  4. The unit test would fail

Environment

  • Powertools version used: master branch
  • Packaging format (Layers, Maven/Gradle): Maven
  • AWS Lambda function runtime: Java 8
  • Debugging logs

How to enable debug mode

# paste logs here

Lazy loading of default SQSClient in order to make it work in quarkus /graalvm native

Is your feature request related to a problem? Please describe.
Using Powertools within Quarkus 2.0.2 and GraalVM 21.1.

Describe the solution you'd like
Currently, when using this AWS tool within Quarkus (2.0.2 with GraalVM 21.1), you get the following exception:

Error: Classes that should be initialized at run time got initialized during image building: software.amazon.awssdk.core.retry.backoff.FullJitterBackoffStrategy the class was requested to be initialized at run time (from feature io.quarkus.runner.AutoFeature.beforeAnalysis). To see why software.amazon.awssdk.core.retry.backoff.FullJitterBackoffStrategy got initialized use --trace-class-initialization=software.amazon.awssdk.core.retry.backoff.FullJitterBackoffStrategy com.oracle.svm.core.util.UserError$UserException: Classes that should be initialized at run time got initialized during image building: software.amazon.awssdk.core.retry.backoff.FullJitterBackoffStrategy the class was requested to be initialized at run time (from feature io.quarkus.runner.AutoFeature.beforeAnalysis). To see why software.amazon.awssdk.core.retry.backoff.FullJitterBackoffStrategy got initialized use --trace-class-initialization=software.amazon.awssdk.core.retry.backoff.FullJitterBackoffStrategy at com.oracle.svm.core.util.UserError.abort(UserError.java:68)

This is caused by the fact that the default SqsClient is initialized even if you override it later. Could you please lazy-load the SqsClient? That would fix this problem very easily.

public final class SqsUtils {
    // ...
    private static SqsClient client = SqsClient.create(); // <-- always eagerly initialized
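
A minimal sketch of what lazy initialization could look like, assuming an override hook similar to the one the library already exposes; names and structure are illustrative, not the actual implementation:

import software.amazon.awssdk.services.sqs.SqsClient;

public final class SqsUtils {

    // No eager SqsClient.create() at class-initialization time, so native-image
    // builds do not pull in the default client unless it is actually used.
    private static SqsClient client;

    private SqsUtils() {
    }

    // Override hook: callers (e.g. Quarkus/GraalVM apps) supply their own client.
    public static void overrideSqsClient(SqsClient sqsClient) {
        client = sqsClient;
    }

    private static SqsClient client() {
        if (client == null) {
            client = SqsClient.create(); // built lazily, on first use only
        }
        return client;
    }
}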

@Logging annotation with logEvent = true does not play nice with @Tracing annotation in RequestStreamHandler

I am using Lambda Powertools Java to capture logs, metrics, and traces for my REST service running in Lambda. I am using a RequestStreamHandler to forward the API Gateway request to a JerseyLambdaContainerHandler, which handles my various endpoints proxied by API Gateway.

Using @Logging(logEvent = true) and @Tracing together causes the JerseyLambdaContainerHandler to fail to process the request, because the InputStream has already been read due to the logEvent = true setting on the @Logging annotation. I do not see this error when using @Logging (without logEvent) and @Tracing together, with @Logging(logEvent = true) alone, or with @Tracing alone.

How has this issue affected you? What are you trying to accomplish?

I attempted to use both @Tracing and @Logging(logEvent = true) on the same handler method implementing RequestStreamHandler

The handler method is unable to process the request due to:

com.fasterxml.jackson.databind.exc.MismatchedInputException: "No content to map due to end-of-input\n at [Source: (ByteArrayInputStream); line: 1, column: 0]"

I am able to work around it by resetting the InputStream on the first line of the handler. My method:

    @Logging(logEvent = true)
    @Tracing
    @Metrics(captureColdStart = true)
    @Override
    public void handleRequest(final InputStream inputStream, final OutputStream outputStream, final Context context) throws IOException {
        inputStream.reset(); // Need to reset as Logging aspect with logEvent=true is exhausting the input stream
        log.info("Started processing request");
        HANDLER.proxyStream(inputStream, outputStream, context);
        traceHttp(inputStream, outputStream);
        log.info("Finished processing request");
    }

What were you trying to accomplish?
To use the powertools annotations on my RequestStreamHandler without needing to reset inputStream

Expected Behavior

The inputStream should not be exhausted by the Powertools aspects that run before my method

Current Behavior

The inputStream is exhausted by the Powertools aspects that run before my method

Possible Solution

I believe this post details the issue, but I'm not an expert in AspectJ (just consuming it here). That post and its answer imply that LambdaLoggingAspect.java#L174 is not actually working, so when LambdaTracingAspect runs after LambdaLoggingAspect, it does not get the updated parameters. However, the error is not thrown if I use the @Logging(logEvent = true) annotation without the @Tracing annotation, which I do not understand in the context of the above hypothesis, so I'm doubtful of my own hypothesis for that reason.
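
For illustration only, this is the kind of event buffering the logging aspect would need so that downstream aspects and the handler still receive a readable stream; a sketch of the approach, not the library's actual code:

import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.nio.charset.StandardCharsets;

final class EventBuffering {

    // Copy the event once so it can be both logged and handed on as a fresh,
    // resettable stream (ByteArrayInputStream supports mark()/reset()).
    static InputStream bufferForLogging(InputStream original) throws IOException {
        ByteArrayOutputStream buffer = new ByteArrayOutputStream();
        byte[] chunk = new byte[4096];
        int read;
        while ((read = original.read(chunk)) != -1) {
            buffer.write(chunk, 0, read);
        }
        byte[] bytes = buffer.toByteArray();
        String eventAsString = new String(bytes, StandardCharsets.UTF_8);
        System.out.println(eventAsString); // stand-in for the aspect's event logging
        return new ByteArrayInputStream(bytes);
    }
}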

Steps to Reproduce (for bugs)

  1. Annotate handler method of RequestStreamHandler with both @Logging(logEvent=true) and @Tracing
  2. Execute a request that reads from InputStream
  3. Receive exception because InputStream has been read from already

Environment

  • Powertools version used: 1.2
  • Packaging format (Layers, Maven/Gradle): Gradle
  • AWS Lambda function runtime: Java11
  • Debugging logs
 | timestamp | message
| 1613335172594 | {"instant":{"epochSecond":1613335172,"nanoOfSecond":396000000},"thread":"main","level":"INFO","loggerName":"com.amazonaws.serverless.proxy.internal.LambdaContainerHandler","message":"Starting Lambda Container Handler","endOfBatch":false,"loggerFqcn":"org.apache.logging.slf4j.Log4jLogger","threadId":1,"threadPriority":5,"timestamp":"2021-02-14T20:39:32.396Z[UTC]"}
| 1613335174717 | START RequestId: b74a3b4d-3440-4518-a912-56c8cc53b832 Version: $LATEST
| 1613335175305 | {"_aws":{"Timestamp":1613335174944,"CloudWatchMetrics":[{"Namespace":"my-service-api","Metrics":[{"Name":"ColdStart","Unit":"Count"}],"Dimensions":[["Service","FunctionName"]]}]},"traceId":"Root=1-60298a81-f37154f04fe1c3846c6dec35;Parent=c34f426197c429d1;Sampled=1","FunctionName":"AuthorizationServiceLambda","functionVersion":"$LATEST","ColdStart":1.0,"Service":"authorization-service","logStreamId":"2021/02/14/[$LATEST]e794fcfaf4304e4780df7d5fd99261b1","executionEnvironment":"AWS_Lambda_java11"}
| 1613335175408 | {"instant":{"epochSecond":1613335175,"nanoOfSecond":407000000},"thread":"main","level":"INFO","loggerName":"com.myservice.authorization.RequestHandler","message":"<redacted>","endOfBatch":false,"loggerFqcn":"org.apache.logging.log4j.spi.AbstractLogger","threadId":1,"threadPriority":5,"timestamp":"2021-02-14T20:39:35.407Z[UTC]","coldStart":"true","functionArn":"arn:aws:lambda:us-east-1:xxxxxxxxxxxx:function:AuthorizationServiceLambda","functionMemorySize":"512","functionName":"AuthorizationServiceLambda","functionVersion":"$LATEST","function_request_id":"b74a3b4d-3440-4518-a912-56c8cc53b832","samplingRate":"0.0","service":"authorization-service","xray_trace_id":"1-60298a81-f37154f04fe1c3846c6dec35"}
| 1613335175625 | {"instant":{"epochSecond":1613335175,"nanoOfSecond":625000000},"thread":"main","level":"INFO","loggerName":"com.amazonaws.xray.config.DaemonConfiguration","message":"Environment variable AWS_XRAY_DAEMON_ADDRESS is set. Emitting to daemon on address 169.254.79.2:2000.","endOfBatch":false,"loggerFqcn":"org.apache.logging.log4j.jcl.Log4jLog","threadId":1,"threadPriority":5,"timestamp":"2021-02-14T20:39:35.625Z[UTC]","coldStart":"true","functionArn":"arn:aws:lambda:us-east-1:xxxxxxxxxxxx:function:AuthorizationServiceLambda","functionMemorySize":"512","functionName":"AuthorizationServiceLambda","functionVersion":"$LATEST","function_request_id":"b74a3b4d-3440-4518-a912-56c8cc53b832","samplingRate":"0.0","service":"authorization-service","xray_trace_id":"1-60298a81-f37154f04fe1c3846c6dec35"}
| 1613335179046 | {"instant":{"epochSecond":1613335179,"nanoOfSecond":45000000},"thread":"main","level":"INFO","loggerName":"com.amazonaws.xray.AWSXRayRecorder","message":"Overriding contextMissingStrategy. Environment variable AWS_XRAY_CONTEXT_MISSING has value: \"LOG_ERROR\".","endOfBatch":false,"loggerFqcn":"org.apache.logging.log4j.jcl.Log4jLog","threadId":1,"threadPriority":5,"timestamp":"2021-02-14T20:39:39.045Z[UTC]","coldStart":"true","functionArn":"arn:aws:lambda:us-east-1:xxxxxxxxxxxx:function:AuthorizationServiceLambda","functionMemorySize":"512","functionName":"AuthorizationServiceLambda","functionVersion":"$LATEST","function_request_id":"b74a3b4d-3440-4518-a912-56c8cc53b832","samplingRate":"0.0","service":"authorization-service","xray_trace_id":"1-60298a81-f37154f04fe1c3846c6dec35"}
| 1613335179103 | {"instant":{"epochSecond":1613335179,"nanoOfSecond":103000000},"thread":"main","level":"INFO","loggerName":"com.amazonaws.xray.config.DaemonConfiguration","message":"Environment variable AWS_XRAY_DAEMON_ADDRESS is set. Emitting to daemon on address 169.254.79.2:2000.","endOfBatch":false,"loggerFqcn":"org.apache.logging.log4j.jcl.Log4jLog","threadId":1,"threadPriority":5,"timestamp":"2021-02-14T20:39:39.103Z[UTC]","coldStart":"true","functionArn":"arn:aws:lambda:us-east-1:xxxxxxxxxxxx:function:AuthorizationServiceLambda","functionMemorySize":"512","functionName":"AuthorizationServiceLambda","functionVersion":"$LATEST","function_request_id":"b74a3b4d-3440-4518-a912-56c8cc53b832","samplingRate":"0.0","service":"authorization-service","xray_trace_id":"1-60298a81-f37154f04fe1c3846c6dec35"}
| 1613335179205 | {"instant":{"epochSecond":1613335179,"nanoOfSecond":204000000},"thread":"main","level":"INFO","loggerName":"com.myservice.authorization.RequestHandler","message":"Started processing request","endOfBatch":false,"loggerFqcn":"org.apache.logging.log4j.spi.AbstractLogger","threadId":1,"threadPriority":5,"timestamp":"2021-02-14T20:39:39.204Z[UTC]","coldStart":"true","functionArn":"arn:aws:lambda:us-east-1:xxxxxxxxxxxx:function:AuthorizationServiceLambda","functionMemorySize":"512","functionName":"AuthorizationServiceLambda","functionVersion":"$LATEST","function_request_id":"b74a3b4d-3440-4518-a912-56c8cc53b832","samplingRate":"0.0","service":"authorization-service","xray_trace_id":"1-60298a81-f37154f04fe1c3846c6dec35"}
| 1613335179344 | {"instant":{"epochSecond":1613335179,"nanoOfSecond":205000000},"thread":"main","level":"ERROR","loggerName":"com.amazonaws.serverless.proxy.internal.LambdaContainerHandler","message":"Error while mapping object to RequestType class","thrown":{"commonElementCount":0,"localizedMessage":"No content to map due to end-of-input\n at [Source: (ByteArrayInputStream); line: 1, column: 0]","message":"No content to map due to end-of-input\n at [Source: (ByteArrayInputStream); line: 1, column: 0]","name":"com.fasterxml.jackson.databind.exc.MismatchedInputException","extendedStackTrace":[{"class":"com.fasterxml.jackson.databind.exc.MismatchedInputException","method":"from","file":"MismatchedInputException.java","line":59,"exact":false,"location":"jackson-databind-2.11.x.jar","version":"?"},{"class":"com.fasterxml.jackson.databind.DeserializationContext","method":"reportInputMismatch","file":"DeserializationContext.java","line":1468,"exact":false,"location":"jackson-databind-2.11.x.jar","version":"?"},{"class":"com.fasterxml.jackson.databind.ObjectReader","method":"_initForReading","file":"ObjectReader.java","line":360,"exact":false,"location":"jackson-databind-2.11.x.jar","version":"?"},{"class":"com.fasterxml.jackson.databind.ObjectReader","method":"_bindAndClose","file":"ObjectReader.java","line":2064,"exact":false,"location":"jackson-databind-2.11.x.jar","version":"?"},{"class":"com.fasterxml.jackson.databind.ObjectReader","method":"readValue","file":"ObjectReader.java","line":1453,"exact":false,"location":"jackson-databind-2.11.x.jar","version":"?"},{"class":"com.amazonaws.serverless.proxy.internal.LambdaContainerHandler","method":"proxyStream","file":"LambdaContainerHandler.java","line":253,"exact":true,"location":"aws-serverless-java-container-core-1.5.2.jar","version":"?"},{"class":"com.myservice.authorization.RequestHandler","method":"handleRequest_aroundBody0","file":"RequestHandler.java","line":56,"exact":true,"location":"my-serviceAuthorizationService-1.0.jar","version":"?"},{"class":"com.myservice.authorization.RequestHandler$AjcClosure1","method":"run","file":"RequestHandler.java","line":1,"exact":true,"location":"my-serviceAuthorizationService-1.0.jar","version":"?"},{"class":"org.aspectj.runtime.reflect.JoinPointImpl","method":"proceed","file":"JoinPointImpl.java","line":257,"exact":true,"location":"aspectjrt-1.9.6.jar","version":"?"},{"class":"software.amazon.lambda.powertools.tracing.internal.LambdaTracingAspect","method":"around","file":"LambdaTracingAspect.java","line":59,"exact":true,"location":"my-serviceAuthorizationService-1.0.jar","version":"?"},{"class":"com.myservice.authorization.RequestHandler","method":"handleRequest_aroundBody2","file":"RequestHandler.java","line":55,"exact":true,"location":"my-serviceAuthorizationService-1.0.jar","version":"?"},{"class":"com.myservice.authorization.RequestHandler$AjcClosure3","method":"run","file":"RequestHandler.java","line":1,"exact":true,"location":"my-serviceAuthorizationService-1.0.jar","version":"?"},{"class":"org.aspectj.runtime.reflect.JoinPointImpl","method":"proceed","file":"JoinPointImpl.java","line":257,"exact":true,"location":"aspectjrt-1.9.6.jar","version":"?"},{"class":"software.amazon.lambda.powertools.logging.internal.LambdaLoggingAspect","method":"around","file":"LambdaLoggingAspect.java","line":98,"exact":true,"location":"my-serviceAuthorizationService-1.0.jar","version":"?"},{"class":"com.myservice.authorization.RequestHandler","method":"handleRequest_aroundBody4","file":"RequestHandler.java","line":55
,"exact":true,"location":"my-serviceAuthorizationService-1.0.jar","version":"?"},{"class":"com.myservice.authorization.RequestHandler$AjcClosure5","method":"run","file":"RequestHandler.java","line":1,"exact":true,"location":"my-serviceAuthorizationService-1.0.jar","version":"?"},{"class":"org.aspectj.runtime.reflect.JoinPointImpl","method":"proceed","file":"JoinPointImpl.java","line":257,"exact":true,"location":"aspectjrt-1.9.6.jar","version":"?"},{"class":"software.amazon.lambda.powertools.metrics.internal.LambdaMetricsAspect","method":"around","file":"LambdaMetricsAspect.java","line":56,"exact":true,"location":"my-serviceAuthorizationService-1.0.jar","version":"?"},{"class":"com.myservice.authorization.RequestHandler","method":"handleRequest","file":"RequestHandler.java","line":55,"exact":true,"location":"my-serviceAuthorizationService-1.0.jar","version":"?"},{"class":"jdk.internal.reflect.NativeMethodAccessorImpl","method":"invoke0","line":-2,"exact":false,"location":"?","version":"?"},{"class":"jdk.internal.reflect.NativeMethodAccessorImpl","method":"invoke","line":-1,"exact":false,"location":"?","version":"?"},{"class":"jdk.internal.reflect.DelegatingMethodAccessorImpl","method":"invoke","line":-1,"exact":false,"location":"?","version":"?"},{"class":"java.lang.reflect.Method","method":"invoke","line":-1,"exact":false,"location":"?","version":"?"},{"class":"lambdainternal.EventHandlerLoader$StreamMethodRequestHandler","method":"handleRequest","file":"EventHandlerLoader.java","line":375,"exact":true,"location":"aws-lambda-java-runtime-0.2.0.jar","version":"?"},{"class":"lambdainternal.EventHandlerLoader$2","method":"call","file":"EventHandlerLoader.java","line":899,"exact":true,"location":"aws-lambda-java-runtime-0.2.0.jar","version":"?"},{"class":"lambdainternal.AWSLambda","method":"startRuntime","file":"AWSLambda.java","line":258,"exact":true,"location":"aws-lambda-java-runtime-0.2.0.jar","version":"?"},{"class":"lambdainternal.AWSLambda","method":"startRuntime","file":"AWSLambda.java","line":192,"exact":true,"location":"aws-lambda-java-runtime-0.2.0.jar","version":"?"},{"class":"lambdainternal.AWSLambda","method":"main","file":"AWSLambda.java","line":187,"exact":true,"location":"aws-lambda-java-runtime-0.2.0.jar","version":"?"}]},"endOfBatch":false,"loggerFqcn":"org.apache.logging.slf4j.Log4jLogger","threadId":1,"threadPriority":5,"timestamp":"2021-02-14T20:39:39.205Z[UTC]","coldStart":"true","functionArn":"arn:aws:lambda:us-east-1:xxxxxxxxxxxx:function:AuthorizationServiceLambda","functionMemorySize":"512","functionName":"AuthorizationServiceLambda","functionVersion":"$LATEST","function_request_id":"b74a3b4d-3440-4518-a912-56c8cc53b832","samplingRate":"0.0","service":"authorization-service","xray_trace_id":"1-60298a81-f37154f04fe1c3846c6dec35"}  |
| 1613335179346 | {"instant":{"epochSecond":1613335179,"nanoOfSecond":344000000},"thread":"main","level":"ERROR","loggerName":"com.amazonaws.serverless.proxy.AwsProxyExceptionHandler","message":"Called exception handler for:","thrown":{"commonElementCount":0,"localizedMessage":"No content to map due to end-of-input\n at [Source: (ByteArrayInputStream); line: 1, column: 0]","message":"No content to map due to end-of-input\n at [Source: (ByteArrayInputStream); line: 1, column: 0]","name":"com.fasterxml.jackson.databind.exc.MismatchedInputException","extendedStackTrace":[{"class":"com.fasterxml.jackson.databind.exc.MismatchedInputException","method":"from","file":"MismatchedInputException.java","line":59,"exact":false,"location":"jackson-databind-2.11.x.jar","version":"?"},{"class":"com.fasterxml.jackson.databind.DeserializationContext","method":"reportInputMismatch","file":"DeserializationContext.java","line":1468,"exact":false,"location":"jackson-databind-2.11.x.jar","version":"?"},{"class":"com.fasterxml.jackson.databind.ObjectReader","method":"_initForReading","file":"ObjectReader.java","line":360,"exact":false,"location":"jackson-databind-2.11.x.jar","version":"?"},{"class":"com.fasterxml.jackson.databind.ObjectReader","method":"_bindAndClose","file":"ObjectReader.java","line":2064,"exact":false,"location":"jackson-databind-2.11.x.jar","version":"?"},{"class":"com.fasterxml.jackson.databind.ObjectReader","method":"readValue","file":"ObjectReader.java","line":1453,"exact":false,"location":"jackson-databind-2.11.x.jar","version":"?"},{"class":"com.amazonaws.serverless.proxy.internal.LambdaContainerHandler","method":"proxyStream","file":"LambdaContainerHandler.java","line":253,"exact":true,"location":"aws-serverless-java-container-core-1.5.2.jar","version":"?"},{"class":"com.myservice.authorization.RequestHandler","method":"handleRequest_aroundBody0","file":"RequestHandler.java","line":56,"exact":true,"location":"my-serviceAuthorizationService-1.0.jar","version":"?"},{"class":"com.myservice.authorization.RequestHandler$AjcClosure1","method":"run","file":"RequestHandler.java","line":1,"exact":true,"location":"my-serviceAuthorizationService-1.0.jar","version":"?"},{"class":"org.aspectj.runtime.reflect.JoinPointImpl","method":"proceed","file":"JoinPointImpl.java","line":257,"exact":true,"location":"aspectjrt-1.9.6.jar","version":"?"},{"class":"software.amazon.lambda.powertools.tracing.internal.LambdaTracingAspect","method":"around","file":"LambdaTracingAspect.java","line":59,"exact":true,"location":"my-serviceAuthorizationService-1.0.jar","version":"?"},{"class":"com.myservice.authorization.RequestHandler","method":"handleRequest_aroundBody2","file":"RequestHandler.java","line":55,"exact":true,"location":"my-serviceAuthorizationService-1.0.jar","version":"?"},{"class":"com.myservice.authorization.RequestHandler$AjcClosure3","method":"run","file":"RequestHandler.java","line":1,"exact":true,"location":"my-serviceAuthorizationService-1.0.jar","version":"?"},{"class":"org.aspectj.runtime.reflect.JoinPointImpl","method":"proceed","file":"JoinPointImpl.java","line":257,"exact":true,"location":"aspectjrt-1.9.6.jar","version":"?"},{"class":"software.amazon.lambda.powertools.logging.internal.LambdaLoggingAspect","method":"around","file":"LambdaLoggingAspect.java","line":98,"exact":true,"location":"my-serviceAuthorizationService-1.0.jar","version":"?"},{"class":"com.myservice.authorization.RequestHandler","method":"handleRequest_aroundBody4","file":"RequestHandler.java","line":55,"exact":true,"location":
"my-serviceAuthorizationService-1.0.jar","version":"?"},{"class":"com.myservice.authorization.RequestHandler$AjcClosure5","method":"run","file":"RequestHandler.java","line":1,"exact":true,"location":"my-serviceAuthorizationService-1.0.jar","version":"?"},{"class":"org.aspectj.runtime.reflect.JoinPointImpl","method":"proceed","file":"JoinPointImpl.java","line":257,"exact":true,"location":"aspectjrt-1.9.6.jar","version":"?"},{"class":"software.amazon.lambda.powertools.metrics.internal.LambdaMetricsAspect","method":"around","file":"LambdaMetricsAspect.java","line":56,"exact":true,"location":"my-serviceAuthorizationService-1.0.jar","version":"?"},{"class":"com.myservice.authorization.RequestHandler","method":"handleRequest","file":"RequestHandler.java","line":55,"exact":true,"location":"my-serviceAuthorizationService-1.0.jar","version":"?"},{"class":"jdk.internal.reflect.NativeMethodAccessorImpl","method":"invoke0","line":-2,"exact":false,"location":"?","version":"?"},{"class":"jdk.internal.reflect.NativeMethodAccessorImpl","method":"invoke","line":-1,"exact":false,"location":"?","version":"?"},{"class":"jdk.internal.reflect.DelegatingMethodAccessorImpl","method":"invoke","line":-1,"exact":false,"location":"?","version":"?"},{"class":"java.lang.reflect.Method","method":"invoke","line":-1,"exact":false,"location":"?","version":"?"},{"class":"lambdainternal.EventHandlerLoader$StreamMethodRequestHandler","method":"handleRequest","file":"EventHandlerLoader.java","line":375,"exact":true,"location":"aws-lambda-java-runtime-0.2.0.jar","version":"?"},{"class":"lambdainternal.EventHandlerLoader$2","method":"call","file":"EventHandlerLoader.java","line":899,"exact":true,"location":"aws-lambda-java-runtime-0.2.0.jar","version":"?"},{"class":"lambdainternal.AWSLambda","method":"startRuntime","file":"AWSLambda.java","line":258,"exact":true,"location":"aws-lambda-java-runtime-0.2.0.jar","version":"?"},{"class":"lambdainternal.AWSLambda","method":"startRuntime","file":"AWSLambda.java","line":192,"exact":true,"location":"aws-lambda-java-runtime-0.2.0.jar","version":"?"},{"class":"lambdainternal.AWSLambda","method":"main","file":"AWSLambda.java","line":187,"exact":true,"location":"aws-lambda-java-runtime-0.2.0.jar","version":"?"}]},"endOfBatch":false,"loggerFqcn":"org.apache.logging.slf4j.Log4jLogger","threadId":1,"threadPriority":5,"timestamp":"2021-02-14T20:39:39.344Z[UTC]","coldStart":"true","functionArn":"arn:aws:lambda:us-east-1:xxxxxxxxxxxx:function:AuthorizationServiceLambda","functionMemorySize":"512","functionName":"AuthorizationServiceLambda","functionVersion":"$LATEST","function_request_id":"b74a3b4d-3440-4518-a912-56c8cc53b832","samplingRate":"0.0","service":"authorization-service","xray_trace_id":"1-60298a81-f37154f04fe1c3846c6dec35"}
| 1613335179363 | com.fasterxml.jackson.databind.exc.MismatchedInputException: No content to map due to end-of-input
| 1613335179363 | at [Source: (ByteArrayInputStream); line: 1, column: 0]
| 1613335179382 | at com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:59)
| 1613335179383 | at com.fasterxml.jackson.databind.DeserializationContext.reportInputMismatch(DeserializationContext.java:1468)
| 1613335179383 | at com.fasterxml.jackson.databind.ObjectReader._initForReading(ObjectReader.java:360)
| 1613335179383 | at com.fasterxml.jackson.databind.ObjectReader._bindAndClose(ObjectReader.java:2064)
| 1613335179383 | at com.fasterxml.jackson.databind.ObjectReader.readValue(ObjectReader.java:1453)
| 1613335179383 | at com.amazonaws.serverless.proxy.internal.LambdaContainerHandler.proxyStream(LambdaContainerHandler.java:253)
| 1613335179383 | at com.myservice.authorization.RequestHandler.handleRequest_aroundBody0(RequestHandler.java:56)
| 1613335179383 | at com.myservice.authorization.RequestHandler$AjcClosure1.run(RequestHandler.java:1)
| 1613335179383 | at org.aspectj.runtime.reflect.JoinPointImpl.proceed(JoinPointImpl.java:257)
| 1613335179383 | at software.amazon.lambda.powertools.tracing.internal.LambdaTracingAspect.around(LambdaTracingAspect.java:59)
| 1613335179383 | at com.myservice.authorization.RequestHandler.handleRequest_aroundBody2(RequestHandler.java:55)
| 1613335179383 | at com.myservice.authorization.RequestHandler$AjcClosure3.run(RequestHandler.java:1)
| 1613335179383 | at org.aspectj.runtime.reflect.JoinPointImpl.proceed(JoinPointImpl.java:257)
| 1613335179384 | at software.amazon.lambda.powertools.logging.internal.LambdaLoggingAspect.around(LambdaLoggingAspect.java:98)
| 1613335179384 | at com.myservice.authorization.RequestHandler.handleRequest_aroundBody4(RequestHandler.java:55)
| 1613335179384 | at com.myservice.authorization.RequestHandler$AjcClosure5.run(RequestHandler.java:1)
| 1613335179384 | at org.aspectj.runtime.reflect.JoinPointImpl.proceed(JoinPointImpl.java:257)
| 1613335179384 | at software.amazon.lambda.powertools.metrics.internal.LambdaMetricsAspect.around(LambdaMetricsAspect.java:56)
| 1613335179384 | at com.myservice.authorization.RequestHandler.handleRequest(RequestHandler.java:55)
| 1613335179384 | at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
| 1613335179384 | at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
| 1613335179384 | at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
| 1613335179384 | at java.base/java.lang.reflect.Method.invoke(Unknown Source)
| 1613335179384 | at lambdainternal.EventHandlerLoader$StreamMethodRequestHandler.handleRequest(EventHandlerLoader.java:375)
| 1613335179384 | at lambdainternal.EventHandlerLoader$2.call(EventHandlerLoader.java:899)
| 1613335179384 | at lambdainternal.AWSLambda.startRuntime(AWSLambda.java:258)
| 1613335179384 | at lambdainternal.AWSLambda.startRuntime(AWSLambda.java:192)
| 1613335179384 | at lambdainternal.AWSLambda.main(AWSLambda.java:187)
| 1613335179506 | {"instant":{"epochSecond":1613335179,"nanoOfSecond":506000000},"thread":"main","level":"INFO","loggerName":"com.myservice.authorization.RequestHandler","message":"Finished processing request","endOfBatch":false,"loggerFqcn":"org.apache.logging.log4j.spi.AbstractLogger","threadId":1,"threadPriority":5,"timestamp":"2021-02-14T20:39:39.506Z[UTC]","coldStart":"true","functionArn":"arn:aws:lambda:us-east-1:xxxxxxxxxxxx:function:AuthorizationServiceLambda","functionMemorySize":"512","functionName":"AuthorizationServiceLambda","functionVersion":"$LATEST","function_request_id":"b74a3b4d-3440-4518-a912-56c8cc53b832","samplingRate":"0.0","service":"authorization-service","xray_trace_id":"1-60298a81-f37154f04fe1c3846c6dec35"}
| 1613335179704 | {"traceId":"Root=1-60298a81-f37154f04fe1c3846c6dec35;Parent=c34f426197c429d1;Sampled=1","functionVersion":"$LATEST","LogGroup":"AuthorizationServiceLambda","ServiceName":"AuthorizationServiceLambda","ServiceType":"AWS::Lambda::Function","Service":"authorization-service","logStreamId":"2021/02/14/[$LATEST]e794fcfaf4304e4780df7d5fd99261b1","executionEnvironment":"AWS_Lambda_java11"}
| 1613335179705 | END RequestId: b74a3b4d-3440-4518-a912-56c8cc53b832
| 1613335179705 | REPORT RequestId: b74a3b4d-3440-4518-a912-56c8cc53b832 Duration: 4988.02 ms Billed Duration: 4989 ms Memory Size: 512 MB Max Memory Used: 227 MB Init Duration: 4751.94 ms  XRAY TraceId: 1-60298a81-f37154f04fe1c3846c6dec35 SegmentId: 18cccb1901b10289 Sampled: true

Gradle setup works only with 5.x, 6.x but not with 7.x

The Gradle example provided works only when you are on Gradle 5.x.

aspectj.AspectjGradlePlugin has to be pinned to version 0.0.3.

aspectj.AspectjGradlePlugin is compiled on Java 12 from version 0.0.4 onwards, which makes the plugin incompatible with Lambda functions on runtimes lower than Java 12.

Version 0.0.3 also had a bug which made it incompatible with both Gradle 5.x and 6.x. Although this is now fixed in version 0.0.5, versions of the plugin above 0.0.4 only work with Java 12 or above.

Here is the open issue to make the plugin work with Java 11 or lower.

What were you trying to accomplish?

Expected Behavior

Should work with gradle 6.x as well.

Current Behavior

Fails with error on gradle 6.x


An exception occurred applying plugin request [id: 'aspectj.AspectjGradlePlugin', version: '0.0.3']
> Failed to apply plugin [id 'aspectj.AspectjGradlePlugin']
   > Could not create task ':compileAspect'.
      > Unnecessarily replacing a task that does not exist is not supported.  Use create() or register() directly instead.  You attempted to replace a task named 'compileAspect', but there is no existing task with that name.


Possible Solution

None as of now. Use Gradle 5.x. With Gradle 6.x, users have to upgrade the AWS Lambda function runtime to Java 12 or above and use the latest version of the aspectj.AspectjGradlePlugin plugin.

Environment

  • Powertools version used: latest
  • Packaging format (Layers, Maven/Gradle): Gradle
  • AWS Lambda function runtime: Java 11 or lower
  • Debugging logs

Ability to disable logevent for test or inject object mapper instance

Is your feature request related to a problem? Please describe.

During testing, I've been passing a mock S3Event to the handler. This causes Powertools logging to attempt to serialize it, but Jackson throws an exception because it can't serialize a mock, which leads to huge stack traces in the test logs.

Failed logging event of type class com.amazonaws.services.lambda.runtime.events.S3Event$MockitoMock$957278562 com.fasterxml.jackson.databind.exc.InvalidDefinitionException: No serializer found for class org.mockito.internal.junit.DefaultStubbingLookupListener and no properties discovered to create BeanSerializer (to avoid exception, disable SerializationFeature.FAIL_ON_EMPTY_BEANS) (through reference chain: com.amazonaws.services.lambda.runtime.events.S3Event$MockitoMock$957278562["records"]->java.util.ImmutableCollections$List12[0]->com.amazonaws.services.lambda.runtime.events.models.s3.S3EventNotification$S3EventNotificationRecord$MockitoMock$1268202659["s3"]->com.amazonaws.services.lambda.runtime.events.models.s3.S3EventNotification$S3Entity$MockitoMock$1728425648["bucket"]->com.amazonaws.services.lambda.runtime.events.models.s3.S3EventNotification$S3BucketEntity$MockitoMock$103719884["mockitoInterceptor"]->org.mockito.internal.creation.bytebuddy.MockMethodInterceptor["mockHandler"]->org.mockito.internal.handler.InvocationNotifierHandler["mockSettings"]->org.mockito.internal.creation.settings.CreationSettings["stubbingLookupListeners"]->java.util.concurrent.CopyOnWriteArrayList[0]) at com.fasterxml.jackson.databind.exc.InvalidDefinitionException.from(InvalidDefinitionException.java:77) ~[jackson-databind-2.10.1.jar:2.10.1] at com.fasterxml.jackson.databind.SerializerProvider.reportBadDefinition(SerializerProvider.java:1191) ~[jackson-databind-2.10.1.jar:2.10.1] at com.fasterxml.jackson.databind.DatabindContext.reportBadDefinition(DatabindContext.java:404) ~[jackson-databind-2.10.1.jar:2.10.1] at com.fasterxml.jackson.databind.ser.impl.UnknownSerializer.failForEmpty(UnknownSerializer.java:71) ~[jackson-databind-2.10.1.jar:2.10.1] at com.fasterxml.jackson.databind.ser.impl.UnknownSerializer.serialize(UnknownSerializer.java:33) ~[jackson-databind-2.10.1.jar:2.10.1] at com.fasterxml.jackson.databind.ser.impl.IndexedListSerializer.serializeContents(IndexedListSerializer.java:119) ~[jackson-databind-2.10.1.jar:2.10.1] at com.fasterxml.jackson.databind.ser.impl.IndexedListSerializer.serialize(IndexedListSerializer.java:79) ~[jackson-databind-2.10.1.jar:2.10.1] at com.fasterxml.jackson.databind.ser.impl.IndexedListSerializer.serialize(IndexedListSerializer.java:18) ~[jackson-databind-2.10.1.jar:2.10.1] at com.fasterxml.jackson.databind.ser.BeanPropertyWriter.serializeAsField(BeanPropertyWriter.java:727) ~[jackson-databind-2.10.1.jar:2.10.1] at com.fasterxml.jackson.databind.ser.std.BeanSerializerBase.serializeFields(BeanSerializerBase.java:722) ~[jackson-databind-2.10.1.jar:2.10.1] at com.fasterxml.jackson.databind.ser.BeanSerializer.serialize(BeanSerializer.java:166) ~[jackson-databind-2.10.1.jar:2.10.1] at com.fasterxml.jackson.databind.ser.BeanPropertyWriter.serializeAsField(BeanPropertyWriter.java:727) ~[jackson-databind-2.10.1.jar:2.10.1] at com.fasterxml.jackson.databind.ser.std.BeanSerializerBase.serializeFields(BeanSerializerBase.java:722) ~[jackson-databind-2.10.1.jar:2.10.1] at com.fasterxml.jackson.databind.ser.BeanSerializer.serialize(BeanSerializer.java:166) ~[jackson-databind-2.10.1.jar:2.10.1] at com.fasterxml.jackson.databind.ser.BeanPropertyWriter.serializeAsField(BeanPropertyWriter.java:727) ~[jackson-databind-2.10.1.jar:2.10.1] at com.fasterxml.jackson.databind.ser.std.BeanSerializerBase.serializeFields(BeanSerializerBase.java:722) ~[jackson-databind-2.10.1.jar:2.10.1] at com.fasterxml.jackson.databind.ser.BeanSerializer.serialize(BeanSerializer.java:166) 
~[jackson-databind-2.10.1.jar:2.10.1] at com.fasterxml.jackson.databind.ser.BeanPropertyWriter.serializeAsField(BeanPropertyWriter.java:727) ~[jackson-databind-2.10.1.jar:2.10.1] at com.fasterxml.jackson.databind.ser.std.BeanSerializerBase.serializeFields(BeanSerializerBase.java:722) ~[jackson-databind-2.10.1.jar:2.10.1] at com.fasterxml.jackson.databind.ser.BeanSerializer.serialize(BeanSerializer.java:166) ~[jackson-databind-2.10.1.jar:2.10.1] at com.fasterxml.jackson.databind.ser.BeanPropertyWriter.serializeAsField(BeanPropertyWriter.java:727) ~[jackson-databind-2.10.1.jar:2.10.1] at com.fasterxml.jackson.databind.ser.std.BeanSerializerBase.serializeFields(BeanSerializerBase.java:722) ~[jackson-databind-2.10.1.jar:2.10.1] at com.fasterxml.jackson.databind.ser.BeanSerializer.serialize(BeanSerializer.java:166) ~[jackson-databind-2.10.1.jar:2.10.1] at com.fasterxml.jackson.databind.ser.BeanPropertyWriter.serializeAsField(BeanPropertyWriter.java:727) ~[jackson-databind-2.10.1.jar:2.10.1] at com.fasterxml.jackson.databind.ser.std.BeanSerializerBase.serializeFields(BeanSerializerBase.java:722) ~[jackson-databind-2.10.1.jar:2.10.1] at com.fasterxml.jackson.databind.ser.BeanSerializer.serialize(BeanSerializer.java:166) ~[jackson-databind-2.10.1.jar:2.10.1] at com.fasterxml.jackson.databind.ser.impl.IndexedListSerializer.serializeContents(IndexedListSerializer.java:119) ~[jackson-databind-2.10.1.jar:2.10.1] at com.fasterxml.jackson.databind.ser.impl.IndexedListSerializer.serialize(IndexedListSerializer.java:79) ~[jackson-databind-2.10.1.jar:2.10.1] at com.fasterxml.jackson.databind.ser.impl.IndexedListSerializer.serialize(IndexedListSerializer.java:18) ~[jackson-databind-2.10.1.jar:2.10.1] at com.fasterxml.jackson.databind.ser.BeanPropertyWriter.serializeAsField(BeanPropertyWriter.java:727) ~[jackson-databind-2.10.1.jar:2.10.1] at com.fasterxml.jackson.databind.ser.std.BeanSerializerBase.serializeFields(BeanSerializerBase.java:722) ~[jackson-databind-2.10.1.jar:2.10.1] at com.fasterxml.jackson.databind.ser.BeanSerializer.serialize(BeanSerializer.java:166) ~[jackson-databind-2.10.1.jar:2.10.1] at com.fasterxml.jackson.databind.ser.DefaultSerializerProvider._serialize(DefaultSerializerProvider.java:480) ~[jackson-databind-2.10.1.jar:2.10.1] at com.fasterxml.jackson.databind.ser.DefaultSerializerProvider.serializeValue(DefaultSerializerProvider.java:319) ~[jackson-databind-2.10.1.jar:2.10.1] at com.fasterxml.jackson.databind.ObjectMapper._configAndWriteValue(ObjectMapper.java:4094) ~[jackson-databind-2.10.1.jar:2.10.1] at com.fasterxml.jackson.databind.ObjectMapper.writeValueAsString(ObjectMapper.java:3404) ~[jackson-databind-2.10.1.jar:2.10.1] at software.amazon.lambda.powertools.logging.internal.LambdaLoggingAspect.asJson(LambdaLoggingAspect.java:192) ~[powertools-logging-1.2.0.jar:?] at software.amazon.lambda.powertools.logging.internal.LambdaLoggingAspect.logEvent(LambdaLoggingAspect.java:152) ~[powertools-logging-1.2.0.jar:?] at software.amazon.lambda.powertools.logging.internal.LambdaLoggingAspect.around(LambdaLoggingAspect.java:95) ~[powertools-logging-1.2.0.jar:?]

Describe the solution you'd like

If there was some way to turn off the event logging when testing, or perhaps more simply, if I could pass in my own ObjectMapper instance, I could configure it to be able to serialize the mock.
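
A sketch of the kind of hook this request is asking for; the setter name below is hypothetical:

import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.SerializationFeature;

// Hypothetical: let tests supply their own mapper so mocked events can be
// serialized (or the failure tolerated) without noisy stack traces.
ObjectMapper testMapper = new ObjectMapper()
        .disable(SerializationFeature.FAIL_ON_EMPTY_BEANS);
LoggingUtils.defaultObjectMapper(testMapper); // hypothetical setter on LoggingUtils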

RFC: Drop the @Powertools prefix from the annotations

Is your feature request related to a problem? Please describe.

Java developers are used to multiple imports and naming conventions: @Logging, @Metric/@Timed, @Trace, and so on. By not using the Powertools prefix, the annotations would adhere to the style of other frameworks.

Describe the solution you'd like

Use shorter annotation names. Since we are already using these in a Lambda context, it simply makes sense that we're importing the Powertools annotation. Java, differently from other languages, has sorted this out through import organization within IDEs.

Describe alternatives you've considered

Either continue with the prefixed annotations, or simply adopt the more common names, without the prefix.

Additional context

  1. https://docs.newrelic.com/docs/apm/transactions/transaction-traces/introduction-transaction-traces
  2. https://micrometer.io/docs/concepts#_the_timed_annotation

feat: Support for Boolean and Number type as value in TracingUtils putAnnotation

Is your feature request related to a problem? Please describe.
I came across a minor feature missing in TracingUtils. The X-Ray SDK supports recording annotations as:

  • Keys – Up to 500 alphanumeric characters. No spaces or symbols except underscores.
  • Values – Up to 1,000 Unicode characters.

The key is a String, and the value can be a Boolean, Number, or String.

However, TracingUtils only supports String values; there is no support for Number or Boolean.

Describe the solution you'd like
Adding support for Number & Boolean like:

public static void putAnnotation(String key, Number value) {
  AWSXRay.getCurrentSubsegmentOptional()
    .ifPresent(segment -> segment.putAnnotation(key, value));
}
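
A matching Boolean overload could follow the same pattern; a sketch mirroring the Number version above:

public static void putAnnotation(String key, Boolean value) {
  AWSXRay.getCurrentSubsegmentOptional()
    .ifPresent(segment -> segment.putAnnotation(key, value));
}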

Describe alternatives you've considered
This is not a blocker because I can use String.valueOf(myIntValue)

Additional context

Spotbugs issues when using Powertools

My team has a Gradle setup in our Lambda. We have Spotbugs enabled, failing our builds when issues are found. Once we added the Powertools dependencies, Spotbugs picked up a few code issues in Powertools, which I'm reporting below.

What were you trying to accomplish?
Use Powertools Logging, Metrics and Tracing.

Expected Behavior

Build to succeed without issues once dependencies were included

Current Behavior

Spotbugs picks up issues in Powertools code

[Screenshot of Spotbugs findings, 2021-02-25]

Environment

  • Powertools version used: 1.2.0
  • Packaging format (Layers, Maven/Gradle): Gradle, Spotbugs
  • AWS Lambda function runtime: Java 11

Feature request: support large payload messages received from SNS

Is your feature request related to a problem? Please describe.
Currently, @SqsLargeMessage only supports large messages which are sent directly from SQSExtendedClient, i.e. SQSExtendedClient -> SQS -> Lambda.

A common use case I've seen is subscribing the SQS queue to an external SNS topic, where the topic could carry large payload messages as well, i.e. SNSExtendedClient -> SNS -> SQS -> Lambda.

Right now, the expected SQSEvent is:

{
    "records": [
        {
            "body": "[\"software.amazon.payloadoffloading.PayloadS3Pointer\",{\"s3BucketName\":\"some-bucket\",\"s3Key\":\"some-key\"}]",
            ...
        }

    ]
}

In the SNS case, the body would be a serialized string of the SNSEvent.SNS class. After deserializing it, SNSEvent.SNS#getMessage() would return a PayloadS3Pointer, as in the current case.

Describe the solution you'd like

We could either have a flag on the current annotation or another annotation to essentially unwrap the string message of the SNS event before checking whether it's a large message or not. We would need to reconstruct the SNSEvent.SNS from the body of the records, though.
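
A sketch of the unwrapping step described above, assuming Jackson and the SNSEvent model from aws-lambda-java-events; illustrative only, since the raw SNS notification uses capitalized field names and needs lenient mapping:

import com.amazonaws.services.lambda.runtime.events.SNSEvent;
import com.amazonaws.services.lambda.runtime.events.SQSEvent;
import com.fasterxml.jackson.databind.DeserializationFeature;
import com.fasterxml.jackson.databind.MapperFeature;
import com.fasterxml.jackson.databind.ObjectMapper;

final class SnsBodyUnwrapper {

    private static final ObjectMapper MAPPER = new ObjectMapper()
            .configure(MapperFeature.ACCEPT_CASE_INSENSITIVE_PROPERTIES, true)
            .configure(DeserializationFeature.FAIL_ON_UNKNOWN_PROPERTIES, false);

    // For an SQS record that originated from an SNS subscription, the record body
    // is a serialized SNS notification; the PayloadS3Pointer sits inside its Message.
    static String unwrapSnsMessage(SQSEvent.SQSMessage record) throws Exception {
        SNSEvent.SNS sns = MAPPER.readValue(record.getBody(), SNSEvent.SNS.class);
        return sns.getMessage();
    }
}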

Ability to remove custom key from logger via LoggingUtils

Hi all,
I have a question (or maybe a suggestion) about powertools-logging; pardon me if this is not the right place for it.
I recently came across Lambda Powertools logging, and its feature of changing the logging level via an environment variable is a useful one for us.
When I tried to use it, I had code that clears custom key-value pairs with ThreadContext.remove() and ThreadContext.clearAll() to avoid incorrect log output. I saw that the powertools-logging LoggingUtils contains appendKey() and appendKeys() methods, which in turn call ThreadContext.put() and putAll(). However, there are no wrapper methods to remove the keys.
Hence, I still have to import the ThreadContext class directly to perform the remove operation in my code.
It would be nice if LoggingUtils also included methods to remove keys, so that my Lambda code can rely completely on powertools-logging for custom key logging.

Originally posted by @seng-thebouqs in #393
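
A sketch of the wrapper methods being suggested above, mirroring the existing appendKey/appendKeys; illustrative only:

import org.apache.logging.log4j.ThreadContext;

// Proposed additions to LoggingUtils (sketch): wrap ThreadContext.remove() so
// handler code never has to import ThreadContext directly.
public static void removeKey(String key) {
    ThreadContext.remove(key);
}

public static void removeKeys(String... keys) {
    for (String key : keys) {
        ThreadContext.remove(key);
    }
}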

Integration with CloudWatch ServiceLens

Is your feature request related to a problem? Please describe.
Just like the Python Powertools, we should support integration with CloudWatch ServiceLens.

Describe the solution you'd like

The xray_trace_id key should be added to log output when tracing is active. This enables the log correlation functionality in ServiceLens to work with applications using powertools.

Describe alternatives you've considered
NA

Additional context
Ref https://github.com/awslabs/aws-lambda-powertools-python/releases/tag/v1.5.0

Cold start captured as false when using together with Tracing

When using the @Tracing annotation at method level, and not just on the handler method, the cold start flag in logs is always captured as false.

Expected Behavior

Cold start flag should be captured correctly on logs

Current Behavior

The flag is always captured as false.

Possible Solution

Steps to Reproduce (for bugs)

Environment

  • Powertools version used:
  • Packaging format (Layers, Maven/Gradle):
  • AWS Lambda function runtime:
  • Debugging logs

How to enable debug mode

# paste logs here

RFC: Provide logger instance upon using @PowertoolsLogging

Key information

  • RFC PR: (leave this empty)
  • Related issue(s), if known: This will likely be renamed if #68 passes.
  • Area: (i.e. Tracer, Metrics, Logger, etc.) Logger
  • Meet tenets: (Yes/no) Yes, to all of them

Summary

This is a lean and clean way to provide consistency in what we require from users in terms of logging. I might want to break this into two separate aspects, the implementation and the facade, but since we're using Log4j2, I'll open this and maybe consider a different timeline for the proxy one (Log4j2 over SLF4J).

Consider https://projectlombok.org/features/log: whenever the user interacts with the annotation, an explicit instance of the logger is created on the user's behalf. This puts the declaration, import collisions, and qualifying the correct import in the hands of the library instead of the user, again pushing convention over configuration.

Motivation

This is a simplified way of adding a logger to existing functions. It also helps the user adhere to EMF and makes it straightforward.
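
For context, a minimal sketch of what users declare today and what this proposal would remove the need for; shown with the current @Logging annotation, which may still be @PowertoolsLogging depending on the outcome of #68:

import com.amazonaws.services.lambda.runtime.Context;
import com.amazonaws.services.lambda.runtime.RequestHandler;
import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.Logger;
import software.amazon.lambda.powertools.logging.Logging;

public class OrderHandler implements RequestHandler<String, String> {

    // Today: an explicit logger declaration in every handler class.
    // Under this RFC, the annotation would provide this instance instead.
    private static final Logger log = LogManager.getLogger(OrderHandler.class);

    @Logging
    @Override
    public String handleRequest(String input, Context context) {
        log.info("Processing {}", input);
        return input;
    }
}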

Proposal

This is the bulk of the RFC.

  • Introduce a logger instance when added to a class
  • Introduce a logger local variable when adding to a method
  • Update the documentation to remove existing logging declarations

This should get into specifics and corner-cases, and include examples of how the feature is used. Any new terminology should be defined here.

None, only hiding implementation details.

Drawbacks

None so far.

Why should we not do this?
Convention over configuration has to draw a line when it becomes too much. I don't believe this will cross that line, as we're already requiring users to adopt our Log4j2 instance anyway.

Do we need additional dependencies? Impact performance/package size?
We don't. This will be an implementation of a current ask from users, which is to create the local logger instance in their classes.

Rationale and alternatives

  • What other designs have been considered? Why not them?
  • What is the impact of not doing this?
    Users are still required to declare the logger instance themselves

Unresolved questions

  1. What if the user is already using another logging library? How do we address that in this new RFC? How does that work today?

Optional, stash area for topics that need further development e.g. TBD

Unit Tests are missing from the project

The README has the following text:

Unit tests

Tests are defined in the HelloWorldFunction/src/test folder in this project.

example$ cd HelloWorldFunction
HelloWorldFunction$ gradle test

But there is no such folder HelloWorldFunction/src/test.

What were you trying to accomplish?

Expected Behavior

There should be a HelloWorldFunction/src/test folder with unit tests

Current Behavior

There is no such folder that I can find

Possible Solution

Add the unit tests to the repository, if they exist, or remove the Unit Test section from the README if they do not

RFC: Support for lambda extensions

Is your feature request related to a problem? Please describe.
I'm currently using Lambda extensions to start an external process and tried to use Powertools' logging and metrics modules in it; however, they did not work.

Describe the solution you'd like
Powertools should recognize that it is inside a Lambda environment in order to emit metrics and format logs.

Describe alternatives you've considered
Using the CloudWatch SDK and a direct Log4j implementation; however, that involves a lot of plumbing that is not easy to handle.

Additional context
I'm using the Java 8 Corretto runtime.

RFC: Parameter Injection

Key information

  • RFC PR:
  • Related issue(s), if known:
  • Area: Parameter
  • Meet tenets: Yes

Summary

Provide annotation-based injection of parameters: a concept similar to the @Value annotation in Spring, backed by the existing providers for SSM and Secrets Manager.

Motivation

This feature will make it easier and more concise to retrieve and use parameters in a declarative way.

Proposal

Annotating a String field in the Lambda handler would result in the field being populated with the value of the referenced parameter from one of the available providers (SSM and Secrets Manager for the time being).

It could look like this:

@Parameter(name="/powertools-java/sample/simplekey")
private String myParam;

The annotation can provide options to specify name, provider, transformation, caching

@Parameter(name="/sample/simplekey", provider="SSM", maxAge=30)
private String myParam;
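
For comparison, roughly how the same value is retrieved imperatively today with the existing providers; a sketch, with the withMaxAge signature assumed from the current parameters module:

import java.time.temporal.ChronoUnit;

import software.amazon.lambda.powertools.parameters.ParamManager;
import software.amazon.lambda.powertools.parameters.SSMProvider;

// Imperative retrieval that the proposed @Parameter annotation would replace.
SSMProvider ssmProvider = ParamManager.getSsmProvider();
String myParam = ssmProvider
        .withMaxAge(30, ChronoUnit.SECONDS)
        .get("/powertools-java/sample/simplekey");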

Drawbacks

Why should we not do this?
Less transparency in the code

Do we need additional dependencies? Impact performance/package size?
No

Rationale and alternatives

  • What other designs have been considered? Why not them?
  • What is the impact of not doing this?

Unresolved questions

Optional, stash area for topics that need further development e.g. TBD

[Metrics] Improvements to Powertools Metrics

Is your feature request related to a problem? Please describe.
The following requests are based on recent experience using Powertools Metrics, having used https://github.com/awslabs/aws-embedded-metrics-java in different projects before, and on the workarounds we had to implement. I'm hoping these can be taken care of by Powertools so we can move our older projects to Powertools and have a consistent experience across projects.

As such, I'll mention a few points below

Describe the solution you'd like

  1. Improve documentation/warnings about the fact that Powertools annotations require the Lambda handler method to be called handleRequest

    • I use CodeGuru Profiler with the setup described in https://docs.aws.amazon.com/codeguru/latest/profiler-ug/lambda-custom.html. This implementation overrides/hides handleRequest and instead exposes a requestHandler method that users should override to add their logic
    • It took me a lot of time debugging/investigating this; time that could've been saved if there was better documentation or warnings about where the annotations are.
    • My workaround was to create a new BaseHandler that extends RequestHandlerWithProfiling, override handleRequest (calling super.handleRequest), and annotate it with the Powertools annotations, then implement my concrete handlers as subclasses of BaseHandler
  2. Metrics emitted by MetricsUtils.metricsLogger should have a property for the requestId

    • The Metric EMF entity emitted does not contain a property for the requestId. It has other useful things like traceId
    • I think the LambdaMetricsAspect can be updated to add a requestId property by default
  3. Metrics emitted by MetricsUtils.metricsLogger should allow dimensions to be overridden or unset

  4. MetricsUtils.withSingleMetric should have improved properties and namespace configs

    • withSingleMetric requires the setting of a namespace. In most cases, one would want to use the default namespace which is not easily accessible (beyond grabbing it explicitly from the environment variables). I believe there should be a version of this method which simply reuses the namespace configured for MetricsUtils.metricsLogger
    • withSingleMetric should also add the requestId property
    • In our projects that use EMF, we have an abstraction similar to MetricsUtils. It provides the same general capabilities: a default metricsLogger that gets flushed at the end of a request, plus one-off metrics. The one thing we have on top of that is the ability to set default properties that are written to both types of metrics, e.g. defaultProperties that we can append to at any time. They are added to the MetricsUtils.metricsLogger, but also get applied to the equivalent of withSingleMetric when it is instantiated. This allows us to have a consistent set of properties common across all our EMF metrics.
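
For point 4, the workaround mentioned, grabbing the default namespace from the environment explicitly, looks roughly like this; a sketch that assumes the withSingleMetric overload taking a namespace and the POWERTOOLS_METRICS_NAMESPACE environment variable:

import software.amazon.cloudwatchlogs.emf.model.DimensionSet;
import software.amazon.cloudwatchlogs.emf.model.Unit;
import software.amazon.lambda.powertools.metrics.MetricsUtils;

// Reuse the namespace configured for the default metrics logger by reading the
// same environment variable Powertools is configured with.
String namespace = System.getenv("POWERTOOLS_METRICS_NAMESPACE");
MetricsUtils.withSingleMetric("OneOffMetric", 1, Unit.COUNT, namespace,
        metric -> metric.setDimensions(DimensionSet.of("Service", "booking")));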

Additional context
It's likely that each of the topics above should be addressed in a separate issue, and I'm happy to break it down further if you think that's useful.

I'm also happy to attempt to contribute after discussions if needed.

Feature request: Allow setting a custom S3 client when using LargeMessageHandling

Is your feature request related to a problem? Please describe.
Currently the large message handler uses the default AmazonS3 client to download the message content. This means the Lambda execution IAM role must have S3 read permissions. The problem is when the S3 bucket is owned by another account: they provide us access via an IAM role and ask us to assume that role to read the S3 content. Right now, we can't pass a custom S3 client (with assumed-role credentials) into the LargeMessageHandler.

Describe the solution you'd like
Be able to provide a custom AmazonS3 client for large message handling

Improve clarity and fix Docs for Logging

What were you initially searching for in the docs?

  • Don't use service_undefined in Java, use Null or String.Empty.
  • Use units to represent the percentage, for clarity over the 0.1 (http://jsr-108.sourceforge.net/javadoc/javax/units/NonSI.html#PERCENT). I'll review its javadocs to make it explicit whether we should be using 0.01, 0.1, 10, etc.
  • Message: unserializable JSON values will be cast to string. Question: how? By calling toString()? If so, document that toString() will be called. Also, what about exceptions there? Will they be swallowed?
  • Version is great, what about logging the alias?
  • Please make this a warning/info box; it is too subtle the way it is today. If this precedence is respected elsewhere, create a "know before you go" section: "Configuration on environment variable is given precedence over sampling rate configuration on annotation, provided it's in valid value range."

Is this related to an existing part of the documentation? Please share a link
https://awslabs.github.io/aws-lambda-powertools-java/core/logging/#standard-structured-keys

Describe how we could make it clearer
Address the points listed

If you have a proposed update, please share it here

Warning during build process from aspectj-maven-plugin

While building the lib, we currently see a warning:

[INFO] --- aspectj-maven-plugin:1.12.1:compile (default) @ aws-lambda-powertools-java ---
[INFO] Showing AJC message detail for messages of types: [error, warning, fail]
[WARNING] Processing Log4j annotations
        <unknown source file>:<no line information>

[WARNING] Annotations processed
        <unknown source file>:<no line information>

[WARNING] Processing Log4j annotations
        <unknown source file>:<no line information>

[WARNING] No elements to process
        <unknown source file>:<no line information>

There should be some flag to make it stop logging these warnings. It's always nice to have clean build logs.

Java doc tests cases ?

There is a plugin we could include to test our Javadocs. Should we consider using it?

Create a SAM template with Powertools support bake in

Is your feature request related to a problem? Please describe.

No

Describe the solution you'd like

I'd like to use 'sam init' to create a java project which comes with Powertools included.

Describe alternatives you've considered

Maybe it should live in the SAM project? I think it would also do well here so it can be maintained in the same release cycles.

Additional context

An example would be the Python SAM template - https://github.com/aws-samples/cookiecutter-aws-sam-python

[Enhancement] SSM get parameter taking too long to retrieve a parameter.

Hello, is it normal that simple code to get a single SSM parameter takes over 10 seconds?

  @Override
  public String handleRequest(Object o, Context context) {
    LambdaLogger logger = context.getLogger();

    Instant start = Instant.now();
    SSMProvider ssmProvider = ParamManager.getSsmProvider();
    String ssmParameter = ssmProvider.withDecryption().get("test-parameter");
    logger.log("ssm parameter: " + ssmParameter);
    Instant finish = Instant.now();
    logger.log("Total duration SSM: " + Duration.between(start, finish).toMillis());

    return ssmParameter;
 }

Similar code pulling a parameter from SSM takes less than a second in other languages using the AWS SDK, for instance Node.js.

Is this normal Java behavior?

Notes:

  • This issue is not related to the withDecryption() functionality; even without it the code takes the same amount of time.
  • This is not related to Java cold start (using Lambda), a simple "Hello World" application takes less than a second to execute.
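
For illustration, one pattern that often trims this kind of first-call latency is to build a lightweight SsmClient once during initialization and hand it to ParamManager, as another report further down this page does. A sketch only (package names as in the powertools-parameters module; the region and credentials provider are assumptions to adjust):

    import software.amazon.awssdk.auth.credentials.EnvironmentVariableCredentialsProvider;
    import software.amazon.awssdk.http.urlconnection.UrlConnectionHttpClient;
    import software.amazon.awssdk.regions.Region;
    import software.amazon.awssdk.services.ssm.SsmClient;
    import software.amazon.lambda.powertools.parameters.ParamManager;
    import software.amazon.lambda.powertools.parameters.SSMProvider;

    public class SsmProviderFactory {

        // Built once per container, so the HTTP client and credential chain are initialised
        // during INIT rather than on the first parameter lookup inside the handler.
        private static final SsmClient SSM_CLIENT = SsmClient.builder()
                .region(Region.US_EAST_1) // assumption: use the function's own region
                .credentialsProvider(EnvironmentVariableCredentialsProvider.create())
                .httpClientBuilder(UrlConnectionHttpClient.builder())
                .build();

        public static final SSMProvider SSM_PROVIDER = ParamManager.getSsmProvider(SSM_CLIENT);
    }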

Log events of logging module does not always logs entire event

Using the log event feature of @PowertoolsLogging.

Expected Behavior

When logging an S3EventNotification, it should log the payload.

Current Behavior

Does not log the payload, but rather just the object hash.

Possible Solution

Use an object mapper to serialize the object instead of depending on the toString() implementation of the POJO (see the sketch below).
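
A rough illustration of that suggestion, assuming the handler's event is available as input (S3EventNotification may need extra Jackson modules to serialize cleanly):

    import com.fasterxml.jackson.core.JsonProcessingException;
    import com.fasterxml.jackson.databind.ObjectMapper;

    public class EventSerializer {

        private static final ObjectMapper MAPPER = new ObjectMapper();

        // Serialize the incoming event with Jackson instead of relying on the POJO's toString()
        public static String asJson(Object input) {
            try {
                return MAPPER.writeValueAsString(input);
            } catch (JsonProcessingException e) {
                // Fall back to the current behaviour if the event cannot be serialized
                return String.valueOf(input);
            }
        }
    }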

Steps to Reproduce (for bugs)

Handler like:

   @PowertoolsLogging(logEvent = true)
    public IndexFacesResponse handleRequest(S3EventNotification input, Context context) {

Logs message com.amazonaws.services.lambda.runtime.events.models.s3.S3EventNotification@

Environment

  • Powertools version used: 0.3.1-beta
  • Packaging format (Layers, Maven/Gradle): Both
  • AWS Lambda function runtime: java8
  • Debugging logs


Cold start increased significantly by enabling powertools-logging

What were you trying to accomplish?
While testing my sample (cold start optimized) application I can measure up to a 20%+ increase by enabling powertools-logging instead of lambda-logging (SLF4J + Logback).

The tests are executed by leveraging lumigo-cli measure-lambda-cold-starts.

Expected Behavior

Cold start timings shouldn't get much worse when swapping another logging framework for powertools-logging.

Current Behavior

lambda-logging (baseline)

 {
      "functionName": "client-service-dev-create-client",
      "memorySize": 3008,
      "coldStarts": 40,
      "min": 1142.29,
      "p25": 1192.22,
      "median": 1213.9,
      "p75": 1223.73,
      "p95": 1261.69,
      "max": 1276.78,
      "stddev": 29.0004
}

powertools-logging

{
      "functionName": "client-service-dev-create-client",
      "memorySize": 3008,
      "coldStarts": 71,
      "min": 1787.6,
      "p25": 1952.06,
      "median": 1981.9,
      "p75": 2004.21,
      "p95": 2063.38,
      "max": 2149.03,
      "stddev": 53.575
}

Possible Solution

/

Steps to Reproduce (for bugs)

  1. git clone https://github.com/drissamri/powertools-coldstart-issue
  2. mvn clean package
  3. sls deploy --region eu-central-1
  4. npm install -g lumigo-cli
  5. lumigo-cli measure-lambda-cold-starts -p dev -n client-service-dev-create-client -r eu-central-1 -f src/test/resources/create-client-event.json

This gives you the baseline results, now we enable powertools:

  1. Switch to powertools git branch
  2. mvn clean package
  3. sls deploy --region eu-central-1
  4. lumigo-cli measure-lambda-cold-starts -p dev -n client-service-dev-create-client -r eu-central-1 -f

Environment

  • Powertools version used: 0.4.0-beta
  • Packaging format (Layers, Maven/Gradle): maven (zip)
  • AWS Lambda function runtime: java11

isDebugEnabled() returning false when debug enabled through env var

I have several sections of my debug logging wrapped in logger.isDebugEnabled().
What I noticed is that setting the log level to DEBUG through the env var LOG_LEVEL causes these log lines not to appear. This seems like a bug to me.

Expected Behavior

isDebugEnabled should return true.

Current Behavior

Possible Solution

Steps to Reproduce (for bugs)

  1. Configure logging via config to be INFO
  2. Override to DEBUG via LOG_LEVEL env var

Environment

  • Powertools version used:
  • Packaging format (Layers, Maven/Gradle):
  • AWS Lambda function runtime:
  • Debugging logs


Idempotency utility support similar to python version

Is your feature request related to a problem? Please describe.

Idempotency utility support similar to python version https://awslabs.github.io/aws-lambda-powertools-python/latest/utilities/idempotency/

Describe the solution you'd like
Idempotency utility support similar to python version https://awslabs.github.io/aws-lambda-powertools-python/latest/utilities/idempotency/

Describe alternatives you've considered
Implementing idempotency ourselves
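
For illustration, the do-it-yourself alternative usually boils down to a conditional write keyed on an idempotency token. A minimal sketch with the AWS SDK v2; the table and attribute names are hypothetical:

    import java.util.Map;

    import software.amazon.awssdk.services.dynamodb.DynamoDbClient;
    import software.amazon.awssdk.services.dynamodb.model.AttributeValue;
    import software.amazon.awssdk.services.dynamodb.model.ConditionalCheckFailedException;
    import software.amazon.awssdk.services.dynamodb.model.PutItemRequest;

    public class IdempotencyGuard {

        private final DynamoDbClient dynamoDb = DynamoDbClient.create();

        // Returns true only the first time a given idempotency key is seen; the conditional
        // write fails for duplicates, which is how repeated deliveries are detected.
        public boolean firstTimeSeen(String idempotencyKey) {
            try {
                dynamoDb.putItem(PutItemRequest.builder()
                        .tableName("idempotency") // hypothetical table name
                        .item(Map.of("id", AttributeValue.builder().s(idempotencyKey).build()))
                        .conditionExpression("attribute_not_exists(id)")
                        .build());
                return true;
            } catch (ConditionalCheckFailedException e) {
                return false;
            }
        }
    }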

Additional context

compileTestAspect can't compile Classes with static fields

What were you trying to accomplish?
I have a class that resembles a repository. In this class I have a

public class VatRepository extends DynamoRepository<VatEntity> {
    public static String VATENTITY_TABLE_VAR_NAME = "VAT_TABLE_NAME";

In my tests, I am trying to access this field to set a system property:

    @BeforeAll
    void setup() {

        System.setProperty(VatRepository.VATENTITY_TABLE_VAR_NAME, "vatTable-test");

When I run the tests I get this error from the compileTestAspect task:

0::0 VATENTITY_TABLE_VAR_NAME cannot be resolved or is not a field

This is not happening in the compileAspect task, which produces the production code.
Is there anything I can do so that the AspectJ plugin does not try to compile my test classes?

I am building with Gradle.

Expected Behavior

It should compile

Steps to Reproduce (for bugs)

  1. create a class with a static property and use it in tests.

Environment

  • Powertools version used: 1.1.0
  • Packaging format (Layers, Maven/Gradle): Gradle
  • AWS Lambda function runtime: java11
  • Debugging logs

build.gradle:

plugins {
    id 'java'
    id 'aspectj.AspectjGradlePlugin' version '0.0.6'

}

sourceCompatibility = 1.11
targetCompatibility = 1.11

repositories {
    jcenter()
    maven {
        name "DynamoDB Local Release Repository - EU (Frankfurt) Region"
        url "https://s3.eu-central-1.amazonaws.com/dynamodb-local-frankfurt/release"
    }

}

dependencies {
    implementation platform('software.amazon.awssdk:bom:2.15.40')
    implementation 'software.amazon.awssdk:dynamodb'
    //https://aws.amazon.com/de/blogs/developer/introducing-enhanced-dynamodb-client-in-the-aws-sdk-for-java-v2/
    implementation 'software.amazon.awssdk:dynamodb-enhanced'
    implementation 'com.amazonaws:aws-lambda-java-core:1.2.1'
    implementation 'com.amazonaws:aws-lambda-java-events:2.2.9'
    implementation 'software.amazon.lambda:powertools-sqs:1.1.0'
    aspectpath 'software.amazon.lambda:powertools-sqs:1.1.0'

    implementation 'io.burt:jmespath-jackson:0.5.0'
    implementation 'com.fasterxml.jackson.core:jackson-databind:2.12'
    implementation 'com.fasterxml.jackson.datatype:jackson-datatype-jdk8'
    implementation 'com.fasterxml.jackson.datatype:jackson-datatype-jsr310'

    // Use JUnit Jupiter
    testImplementation 'org.junit.jupiter:junit-jupiter-api:5.6.2'
    testImplementation 'org.hamcrest:hamcrest:2.2'

    testRuntimeOnly 'org.junit.jupiter:junit-jupiter-engine'
    testCompile group: 'com.amazonaws', name: 'DynamoDBLocal', version: '1.13.5'
    testCompile group: 'org.jeasy', name: 'easy-random-core', version: '5.0.0'
    testCompile group: 'org.jeasy', name: 'easy-random-core', version: '5.0.0'
    testCompile group: 'org.jeasy', name: 'easy-random-bean-validation', version: '5.0.0'
    testCompile group: 'org.jeasy', name: 'easy-random-randomizers', version: '5.0.0'

}
task copyNativeDeps(type: Copy) {
    from(configurations.compile + configurations.testCompile) {
        include '*.dll'
        include '*.dylib'
        include '*.so'
    }
    into 'build/libs'
}


test {
    // Use junit platform for unit tests.
    dependsOn copyNativeDeps
    systemProperty "java.library.path", 'build/libs'
    useJUnitPlatform()
}

gradle.properties:

aspectjVersion=1.9.6

gradle-wrapper.properties:

distributionBase=GRADLE_USER_HOME
distributionPath=wrapper/dists
distributionUrl=https\://services.gradle.org/distributions/gradle-6.7-bin.zip
zipStoreBase=GRADLE_USER_HOME
zipStorePath=wrapper/dists

[Feature Request][Bug?] Request-scope LoggingUtils setting of keys

Is your feature request related to a problem? Please describe.
My team uses Powertools core utilities (Logging, Metrics and Tracing) in one of our Lambda functions.
In that function we rely on LoggingUtils.appendKeys and metricsLogger().putProperty to append request-related key/value pairs that improve our ability to debug issues and run ad hoc queries using CloudWatch Logs Insights.

We've discovered an issue with our logging: values set via LoggingUtils.appendKeys are not cleared after the request is done. Instead they get carried over to the following requests unless those key/value pairs are overridden, which led to wrong key/value pairs being associated with a given request's log lines.

This is an unexpected behaviour, our assumption was that logging properties set in a given request, would just be scoped to that request.

Describe the solution you'd like
Setting logging properties via LoggingUtils.appendKeys should be scoped to a request i.e. they should be automatically cleared after the request is completed.

The behaviour where properties are carried over from one request to the next opens the door to logging bugs, since Lambda reuses execution environments across requests.

Describe alternatives you've considered
The main alternative, and current mitigation, is to explicitly clear the logging properties using LoggingUtils.removeKeys. While it achieves the same result, it is cumbersome, as it requires tracking the keys to clear (see the sketch below).
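
For reference, the mitigation looks roughly like this in a handler. A sketch only; key names are illustrative, and it assumes the Map overload of appendKeys and a varargs removeKeys:

    import java.util.HashMap;
    import java.util.Map;

    import software.amazon.lambda.powertools.logging.LoggingUtils;

    public class OrderHandlerSketch {

        public void handle(String orderId) {
            // Track the keys added for this request so they can be cleared explicitly
            Map<String, String> requestKeys = new HashMap<>();
            requestKeys.put("orderId", orderId);
            requestKeys.put("channel", "web");

            LoggingUtils.appendKeys(requestKeys);
            try {
                // ... handle the request ...
            } finally {
                // Without this, the keys leak into the next invocation of the same container
                LoggingUtils.removeKeys(requestKeys.keySet().toArray(new String[0]));
            }
        }
    }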

Gradle plug-in for compile-time weave (CTW)

Is your feature request related to a problem? Please describe.

Describe the solution you'd like

Please include a Gradle plugin for setting up compile-time weaving (CTW). I think this would be a far more popular option than Maven.

Describe alternatives you've considered

Additional context

Build failure when including aspectj-maven-plugin with runtime version java 11,12,13

I followed the setup instructions of the library and included the snippet below under the <build> section. The Lambda runtime is Java 11; Java 10, 12 and 13 were tried as well.

        <plugin>
             <groupId>org.codehaus.mojo</groupId>
             <artifactId>aspectj-maven-plugin</artifactId>
             <version>1.11</version>
             <configuration>
                 <source>1.8</source>
                 <target>1.8</target>

Build fails with below logs:

Execution default of goal org.codehaus.mojo:aspectj-maven-plugin:1.11:compile failed: 
Plugin org.codehaus.mojo:aspectj-maven-plugin:1.11 or one of its dependencies could not be resolved: 
Could not find artifact com.sun:tools:jar:13.0.2 at specified path /Library/../lib/tools.jar 

Expected Behavior

Build should succeed and able to use the library utilities

Current Behavior

Build fails with below logs:

Execution default of goal org.codehaus.mojo:aspectj-maven-plugin:1.11:compile failed: 
Plugin org.codehaus.mojo:aspectj-maven-plugin:1.11 or one of its dependencies could not be resolved: 
Could not find artifact com.sun:tools:jar:13.0.2 at specified path /Library/../lib/tools.jar 

Possible Solution

There is a bug report on the official version of aspectj-maven-plugin which makes the plugin incompatible with these Java versions.

The suggested workaround for now is to use the plugin below, as discussed in this issue, until the plugin is patched.

           <plugin>
                <groupId>com.nickwongdev</groupId>
                <artifactId>aspectj-maven-plugin</artifactId>
                <version>1.12.1</version>

Steps to Reproduce (for bugs)

  1. Follow the setup instructions
  2. Set the Lambda runtime to one of Java 10, 11, 12, 13
  3. Build the project
  4. Build fails with the logs below

Environment

  • Powertools version used: Latest
  • Packaging format (Layers, Maven/Gradle): Maven/Gradle
  • AWS Lambda function runtime: Java 10, 11, 12, 13.
  • Debugging logs
Execution default of goal org.codehaus.mojo:aspectj-maven-plugin:1.11:compile failed: 
Plugin org.codehaus.mojo:aspectj-maven-plugin:1.11 or one of its dependencies could not be resolved: 
Could not find artifact com.sun:tools:jar:13.0.2 at specified path /Library/Java//../lib/tools.jar 

Unable to use SQS Large Message Handling with Gradle

I have an AWS Lambda function that receives an SQSEvent. I need to enable support for large messages that are stored in S3.

I'm following the instructions here: https://awslabs.github.io/aws-lambda-powertools-java/utilities/sqs_large_message_handling/
I'm using Gradle, so I add this to my build.gradle as described:

dependencies {
    ...
    implementation 'software.amazon.lambda:powertools-sqs:1.5.0'
    aspectpath 'software.amazon.lambda:powertools-sqs:1.5.0'
}

Error: Could not find method aspectpath() for arguments [software.amazon.lambda:powertools-sqs:1.5.0] on object of type org.gradle.api.internal.artifacts.dsl.dependencies.DefaultDependencyHandler.

Ok so I gather I need to add the AspectJ plugin, like in the Maven example (although not mentioned in the Gradle example).
In the Maven example you use the one by org.codehaus.mojo, but there doesn't seem to be an equivalent for Gradle. This one seems like the official plugin for Gradle - https://plugins.gradle.org/plugin/aspectj.gradle - so I use that.

plugins {
    id "aspectj.gradle" version "0.1.6"
}

Error: You must set the property 'aspectjVersion' before applying the aspectj plugin

Stack Overflow (https://stackoverflow.com/questions/55753896/gradle-error-you-must-set-the-property-aspectjversion-before-applying-the-as) tells me to define the version in gradle.properties, so I do that.

Error: java.lang.IllegalStateException: Unnecessarily replacing a task that does not exist is not supported. Use create() or register() directly instead.

Google leads me to this issue #146

I'm using Gradle 6, and I can't drop to Gradle 5 in my workplace. So like the issue says I use a different AspectJ plugin: https://plugins.gradle.org/plugin/aspectj.AspectjGradlePlugin.

It now builds. I run my tests...

Error: Task :compileAspect warning You aren't using a compiler supported by lombok, so lombok will not work and has been disabled.

That's this issue: projectlombok/lombok#2592. So I add the workaround to my build process options:

-Djps.track.ap.dependencies=false

I run the tests again.

Error: Unable to find method 'org.gradle.internal.deprecation.DeprecationMessageBuilder$DeprecateProperty$WithDeprecationTimeline org.gradle.internal.deprecation.DeprecationMessageBuilder$DeprecateProperty.willBeRemovedInGradle8()'
Your project may be using a third-party plugin which is not compatible with the other plugins in the project or the version of Gradle requested by the project.

In the case of corrupt Gradle processes, you can also try closing the IDE and then killing all Java processes

At this point I'm seriously considering my career choices. I restart my IDE and kill all Java processes. I reimport my dependencies. Same error. This time I click on:

Re-download dependencies and sync project (requires network)

Same error. I click on

Stop Gradle build processes (requires restart)

Same error.

I'm out of options. I've lost a whole day to this.

Please can you tell me how I can use SQS Large Message Handling with Gradle, and provide working instructions in your documentation? Or is there an alternative way I can receive large SQS messages (offloaded to S3) in my Lambda function?

Changing log level using env var

We have a logger that extends org.apache.logging.log4j.core.Logger, and are using Powertools Logging. The behaviour of the logger is inconsistent when you modify the Powertools environment variable.

What were you trying to accomplish?

When we modify the environment variable to change the log level, it does not completely work. When we call log methods such as log(), info(), debug(), trace(), the change to the environment variable does take effect for those log methods.

But when we call the isEnabled log methods like isTraceEnabled() or isDebugEnabled(), they return a boolean based on the log4j2.xml configuration, not the env var.

So this would work:

    logDebug(myMsg)

but not this:

    if (isDebugEnabled()) { log(myMsg) }

Expected Behavior

The isDebugEnabled method should return true if the lambda environment variable is set to DEBUG, and log4j2.xml has log level set to INFO

Current Behavior

The isDebugEnabled method returns false

Possible Solution

I am not sure if this is a bug in Powertools or if it is caused by our wrapper, but my guess is that it is in the method starting on line 117 here:
https://github.com/awslabs/aws-lambda-powertools-java/blob/master/powertools-loggi[…]zon/lambda/powertools/logging/internal/LambdaLoggingAspect.java

If you pass false to the getContext method, aren't you getting a new context object and not the current one?

Steps to Reproduce (for bugs)

  1. Set log level in log4j2.xml to INFO
  2. Once lambda is deployed change the POWERTOOLS_LOG_LEVEL to DEBUG
  3. in the lambda run: logger.log("I should be visible")
  4. in the lambda run: if ( isDebugEnabled()) { logger.log("I should also be visible") }

Environment

  • Powertools version used: 1.7.1
  • Packaging format (Layers, Maven/Gradle): Gradle
  • AWS Lambda function runtime:
  • Debugging logs

How to enable debug mode
This link doesn't actually say how to enable debug mode? Is something missing in the docs?


  2021-09-22T09:20:37.544-04:00 {"instant":{"epochSecond":1632316837,"nanoOfSecond":544000000},"thread":"main","level":"INFO","loggerName":"com.amazon.pharmacycontrolsystem.machineproxymockservice.lambda.MachineProxyMockService","message":"isDebugEnabled : 'false'","endOfBatch":false,"loggerFqcn":"org.apache.logging.log4j.spi.AbstractLogger","threadId":1,"threadPriority":5,"timestamp":"2021-09-22T13:20:37.544Z[UTC]","coldStart":"false","functionArn":"arn:aws:lambda:us-east-1:314485189890:function:PCS-personal-MacProxyMockS-MacProxyMockSvcD1773E09-rAWyTsUVhgYl","functionMemorySize":"512","functionName":"PCS-personal-MacProxyMockS-MacProxyMockSvcD1773E09-rAWyTsUVhgYl","functionVersion":"$LATEST","function_request_id":"34f2d390-6ef0-4130-b6d8-bfc2d50fcdb5","samplingRate":"0.0","service":"PCSv2 Machine Proxy Mock Service","xray_trace_id":"1-614b2da2-66dcce3b297641326f82ea15"}
 

Build failed - Could not find artifact com.sun:tools:jar

I added the Maven plugin and dependencies as documented here.

https://awslabs.github.io/aws-lambda-powertools-java/utilities/sqs_large_message_handling/

I am using AdoptOpenJDK (build 11.0.8+10)

I am getting this error when I run mvn package

Execution default of goal org.codehaus.mojo:aspectj-maven-plugin:1.11:compile failed: Plugin org.codehaus.mojo:aspectj-maven-plugin:1.11 or one of its dependencies could not be resolved: Could not find artifact com.sun:tools:jar:11.0.8 at specified path

Can you please let me know if there is anything else I need to do?

Java-based cfn-response module

Is your feature request related to a problem? Please describe.
There's a cfn-response module to support Lambda-backed AWS CloudFormation custom resources, but it's only available for Node and Python runtimes.

Describe the solution you'd like
A cfn-response module equivalent for Java.

Describe alternatives you've considered
Lambda authors must create their own Java equivalent or use a supported runtime.
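
For context, the Node/Python cfn-response module essentially sends an HTTP PUT with a small JSON document to the pre-signed ResponseURL included in the custom-resource event. A hand-rolled Java sketch of that protocol (not a published module; the Reason/NoEcho fields and error handling are omitted for brevity):

    import java.io.OutputStream;
    import java.net.HttpURLConnection;
    import java.net.URL;
    import java.nio.charset.StandardCharsets;

    public class CfnResponse {

        // Sends SUCCESS/FAILED back to CloudFormation so the stack operation can continue.
        public static void send(String responseUrl, String status, String stackId,
                                String requestId, String logicalResourceId,
                                String physicalResourceId) throws Exception {
            String body = "{"
                    + "\"Status\":\"" + status + "\","
                    + "\"PhysicalResourceId\":\"" + physicalResourceId + "\","
                    + "\"StackId\":\"" + stackId + "\","
                    + "\"RequestId\":\"" + requestId + "\","
                    + "\"LogicalResourceId\":\"" + logicalResourceId + "\","
                    + "\"Data\":{}"
                    + "}";

            HttpURLConnection connection = (HttpURLConnection) new URL(responseUrl).openConnection();
            connection.setDoOutput(true);
            connection.setRequestMethod("PUT");
            // The pre-signed URL is typically signed with an empty Content-Type
            connection.setRequestProperty("Content-Type", "");
            try (OutputStream out = connection.getOutputStream()) {
                out.write(body.getBytes(StandardCharsets.UTF_8));
            }
            connection.getResponseCode(); // force the request to complete
        }
    }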

Set default values for tracing utilty parameters

Is your feature request related to a problem? Please describe.
Changing the default captureResult param for the tracing utility to "false" would make my life easier, given that most of my methods require having it turned off because it causes errors while interacting with other services.

Describe the solution you'd like
It would be nice to be able to set the captureResult property for the tracing annotation to false and let all methods called inherit that default instead of having to put capture=false on every single annotation

Describe alternatives you've considered
right now I override the params on every annotation

Additional context
I'm working with a library that fails when tracing's capture response is true

NoClassDefFoundError in the example using the SDK v2 and tracing (x-ray)

Adding the SDK v2 to the pom of the example and using it in App.java generates a NoClassDefFoundError at runtime.

What were you trying to accomplish?
Adding the SDK v2 and simply using it in the example generates this exception (see error logs below).

Expected Behavior

Using the SDK v2 in the lambda function should not generate this kind of exception.

Current Behavior

java.lang.NoClassDefFoundError "com/amazonaws/xray/handlers/config/AWSServiceHandlerManifest"

Possible Solution

Likely a problem with the X-Ray dependencies: the missing class belongs to an X-Ray recorder artifact that is not packaged with the function.

Steps to Reproduce (for bugs)

  1. Add the following dependencies (sdk v2) in the pom.xml:
        <dependency>
            <groupId>software.amazon.awssdk</groupId>
            <artifactId>ssm</artifactId>
            <version>2.14.10</version>
        </dependency>
        <dependency>
            <groupId>software.amazon.awssdk</groupId>
            <artifactId>apache-client</artifactId>
            <version>2.14.10</version>
        </dependency>
  2. Import the following classes in the App.java:
import software.amazon.awssdk.services.ssm.SsmClient;
import software.amazon.awssdk.services.ssm.model.GetParameterRequest;
  3. Use the SDK v2 in the App.java within the handleRequest method:
        GetParameterRequest request = GetParameterRequest.builder()
                .name("/powertools-java/sample/simplekey")
                .build();
        SsmClient.create().getParameter(request);

Environment

  • Powertools version used: 0.2.0-beta
  • Packaging format (Layers, Maven/Gradle): maven
  • AWS Lambda function runtime: java 8
  • Error logs
  "errorMessage": "com/amazonaws/xray/handlers/config/AWSServiceHandlerManifest",
  "errorType": "java.lang.NoClassDefFoundError",
  "stackTrace": [
    "com.amazonaws.xray.interceptors.TracingInterceptor.initInterceptorManifest(TracingInterceptor.java:110)",
    "com.amazonaws.xray.interceptors.TracingInterceptor.<init>(TracingInterceptor.java:92)",
    "com.amazonaws.xray.interceptors.TracingInterceptor.<init>(TracingInterceptor.java:86)",
    "helloworld.App.handleRequest_aroundBody0(App.java:73)",
    "helloworld.App$AjcClosure1.run(App.java:1)",
    "org.aspectj.runtime.reflect.JoinPointImpl.proceed(JoinPointImpl.java:257)",
    "software.amazon.lambda.powertools.tracing.internal.LambdaTracingAspect.around(LambdaTracingAspect.java:52)",
    "helloworld.App.handleRequest_aroundBody2(App.java:65)",
    "helloworld.App$AjcClosure3.run(App.java:1)",
    "org.aspectj.runtime.reflect.JoinPointImpl.proceed(JoinPointImpl.java:257)",
    "software.amazon.lambda.powertools.logging.internal.LambdaLoggingAspect.around(LambdaLoggingAspect.java:94)",
    "helloworld.App.handleRequest(App.java:65)",
    "sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)",
    "sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)",
    "sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)",
    "java.lang.reflect.Method.invoke(Method.java:498)"
  ],
  "cause": {
    "errorMessage": "com.amazonaws.xray.handlers.config.AWSServiceHandlerManifest",
    "errorType": "java.lang.ClassNotFoundException",
    "stackTrace": [
      "java.net.URLClassLoader.findClass(URLClassLoader.java:382)",
      "java.lang.ClassLoader.loadClass(ClassLoader.java:424)",
      "java.lang.ClassLoader.loadClass(ClassLoader.java:357)",
      "com.amazonaws.xray.interceptors.TracingInterceptor.initInterceptorManifest(TracingInterceptor.java:110)",
      "com.amazonaws.xray.interceptors.TracingInterceptor.<init>(TracingInterceptor.java:92)",
      "com.amazonaws.xray.interceptors.TracingInterceptor.<init>(TracingInterceptor.java:86)",
      "helloworld.App.handleRequest_aroundBody0(App.java:73)",
      "helloworld.App$AjcClosure1.run(App.java:1)",
      "org.aspectj.runtime.reflect.JoinPointImpl.proceed(JoinPointImpl.java:257)",
      "software.amazon.lambda.powertools.tracing.internal.LambdaTracingAspect.around(LambdaTracingAspect.java:52)",
      "helloworld.App.handleRequest_aroundBody2(App.java:65)",
      "helloworld.App$AjcClosure3.run(App.java:1)",
      "org.aspectj.runtime.reflect.JoinPointImpl.proceed(JoinPointImpl.java:257)",
      "software.amazon.lambda.powertools.logging.internal.LambdaLoggingAspect.around(LambdaLoggingAspect.java:94)",
      "helloworld.App.handleRequest(App.java:65)",
      "sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)",
      "sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)",
      "sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)",
      "java.lang.reflect.Method.invoke(Method.java:498)"
    ]
  }
}

Lambda powertool SSMProvider is either increasing cold start time or unable to refresh parameters

We may need to update the values of some Parameter Store parameters that are used in a Lambda function.
If a parameter is updated while the Lambda is warm, the function should be able to read the latest value.
I used the Powertools SSMProvider to achieve this, both inside the Lambda function constructor and inside the Lambda handler.

The issue is: if parameters are read inside the constructor, they do not get refreshed after 1 minute (withMaxAge(60, SECONDS)). And if I read these parameters inside the Lambda handler method, the cold and warm start times increase roughly 3x, even though the SSMProvider is initialized only once, inside the constructor.

What were you trying to accomplish?
Trying to read the latest values of the parameters while the Lambda is warm. The reason for reading parameters inside the constructor is to reduce warm start time.

Expected Behavior

The Lambda should be able to read the latest parameter values, and doing so should not impact performance.

Code Snippet

public class CustomHandler implements RequestHandler<Object, Object> {
	private static SSMProvider ssmProvider = null;
	private static Map<String, String> params = null;
	
	public CustomHandler() {
		ssmProvider = DependencyFactory.ssmProvider();
		
		// Parameters are not refreshed after 1 min as configured.
		params = ssmProvider.withMaxAge(60, SECONDS).withDecryption().recursive().getMultiple("/lab/workflow");
	}
	
	@Override
	public Object handleRequest(final Object input, final Context context) {
		// Parameters are refreshed after 1 min as configured, but:
		// - cold start time is 3-4 times higher,
		// - warm start time after every minute is 2-3 times higher.
		params = ssmProvider.withMaxAge(60, SECONDS).withDecryption().recursive().getMultiple("/lab/workflow");
		return params;
	}

}
public class DependencyFactory {
	public static SSMProvider ssmProvider() {
		return ParamManager.getSsmProvider(ssmClient());
	}

	public static SsmClient ssmClient() {
		return SsmClient.builder().credentialsProvider(EnvironmentVariableCredentialsProvider.create())
			.region(region).httpClientBuilder(UrlConnectionHttpClient.builder()).build();
	}
}

Environment

  • Powertools version used: 1.1.0
  • Packaging format (Layers, Maven/Gradle): Maven
  • AWS Lambda function runtime: Java8
{
    "Timestamp": "2020-12-07 22:11:40",
    "RequestId": "7a4c8861-1c4e-4c33-becf-f990404eb3b5",
    "Severity": "DEBUG",
    "Message": "request - Sending Request: DefaultSdkHttpFullRequest(httpMethod=POST, protocol=https, host=ssm.us-east-1.amazonaws.com, encodedPath=/, headers=[amz-sdk-invocation-id, Content-Length, Content-Type, User-Agent, X-Amz-Target], queryParameters=[])\n"
}
{
    "Timestamp": "2020-12-07 22:11:40",
    "RequestId": "7a4c8861-1c4e-4c33-becf-f990404eb3b5",
    "Severity": "DEBUG",
    "Message": "request - Received successful response: 200\n"
}

Documentation mentions metrics, but no code is available just yet

What were you initially searching for in the docs?

https://awslabs.github.io/aws-lambda-powertools-java/ mentions: "Powertools is a suite of utilities for AWS Lambda Functions that makes tracing with AWS X-Ray, structured logging and creating custom metrics asynchronously easier."

But no metrics module has been published yet.

Is this related to an existing part of the documentation? Please share a link
https://awslabs.github.io/aws-lambda-powertools-java/

Describe how we could make it clearer
State that the metrics module is coming.

If you have a proposed update, please share it here

RFC: Allow filtering logEvent output

Is your feature request related to a problem? Please describe.

It's not, it's an enhancement

Describe the solution you'd like

Considering that the logEvent param exists on @PowertoolsLogging, we should provide an option to include only or exclude fields from the event log record itself, for example:

  • AllowedFields: Set<...> Only log items contained on the List/Set
  • PreventLogFields: Set<...>: Only log items not matching the List/Set
    ....

Describe alternatives you've considered

There are no alternatives when using LogEvent

Additional context

Feature request: Better failures handling while using both BatchProcessing and LargeMessageHandling

Is your feature request related to a problem? Please describe.

While using both BatchProcessing and LargeMessageHandling on a lambda, there could be a case when the lambda failed to download the large payload from S3, hence it would fail the whole batch before it could even reach the batch processor. The result is that we have to reprocess/redownload the whole batch, even though most of the messages in the batch could have been processed in the previous try.

Describe the solution you'd like
The failures when downloading payload are handled individually as it were an exception inside the batch processor.

Additional context

My use case is that I would like to batch process large messages. I would prefer to process any messages downloaded successfully without relying on other messages' downloading status. Only the failed-to-load messages should be retried.

SQS partial batch failure

Is your feature request related to a problem? Please describe.

A Lambda processing a batch of messages from SQS is a very common approach and it works smoothly for most use cases. Now, for the sake of example, suppose we're processing a batch and one of the messages fails: Lambda is going to redrive the whole batch to the queue, including the successful ones! Re-running successful messages is not acceptable for all use cases.

Describe the solution you'd like

A very common execution pattern is running a Lambda connected to SQS, in most cases with a batch size greater than one. In such cases, an error for one of the processed messages will cause the whole batch to return to the queue. For some use cases, non-idempotent actions in particular, it's impossible to rely on such behavior. A solution for this problem would improve the experience of using a Lambda with SQS; a hand-rolled sketch of the desired behaviour is shown below.
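
Today this has to be done by hand, roughly like the sketch below: delete the messages that succeeded so that only the failed ones become visible again. The queue URL and process() call are placeholders; this illustrates the requested behaviour, not an existing Powertools API.

    import java.util.ArrayList;
    import java.util.List;

    import com.amazonaws.services.lambda.runtime.events.SQSEvent;
    import com.amazonaws.services.sqs.AmazonSQS;
    import com.amazonaws.services.sqs.AmazonSQSClientBuilder;

    public class PartialBatchHandler {

        private static final String QUEUE_URL =
                "https://sqs.eu-central-1.amazonaws.com/123456789012/orders"; // placeholder

        private final AmazonSQS sqs = AmazonSQSClientBuilder.defaultClient();

        public void handle(SQSEvent event) {
            List<Exception> failures = new ArrayList<>();
            for (SQSEvent.SQSMessage message : event.getRecords()) {
                try {
                    process(message);                                          // placeholder business logic
                    sqs.deleteMessage(QUEUE_URL, message.getReceiptHandle());  // don't redrive successes
                } catch (Exception e) {
                    failures.add(e);
                }
            }
            if (!failures.isEmpty()) {
                // Failing the invocation makes only the undeleted (failed) messages visible again
                throw new RuntimeException(failures.size() + " message(s) failed processing");
            }
        }

        private void process(SQSEvent.SQSMessage message) {
            // ... business logic ...
        }
    }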

Describe alternatives you've considered

Additional context
aws-powertools/powertools-lambda-python#92
