NettyServer and gzip request support

Hi! I migrated an existing project from Play to Tapir + Netty server (since performance matters), and during testing it turned out that gzip-compressed requests are not supported out of the box (i.e. they reach the underlying JSON codec still compressed).

I tried adding Netty's native gzip decoder to the pipeline, but the server seems to hang instead of responding when a gzip request arrives (until it returns 503 after hitting the default timeout). Here's the custom pipeline code I use:

  def pipeline(cfg: NettyConfig)(pipeline: ChannelPipeline, handler: ChannelHandler): Unit = {
    cfg.sslContext.foreach(s => pipeline.addLast(s.newHandler(pipeline.channel().alloc())))
    pipeline.addLast(ServerCodecHandlerName, new HttpServerCodec(8192, DEFAULT_MAX_HEADER_SIZE, DEFAULT_MAX_CHUNK_SIZE))
    pipeline.addLast(new HttpContentDecompressor()) // added netty decompressor between http decoder and server handler
    pipeline.addLast(new HttpStreamsServerHandler())
    pipeline.addLast(handler)
    if (cfg.addLoggingHandler) pipeline.addLast(new LoggingHandler())
    ()
  }
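For context, this is how I wire the custom pipeline into the server config. This is a sketch assuming `NettyConfig` exposes the `initPipeline` hook (present in recent tapir versions); the exact way the config is then passed to the server builder may differ across versions.

```scala
import sttp.tapir.server.netty.NettyConfig

// Assumed wiring: initPipeline has the shape
// NettyConfig => (ChannelPipeline, ChannelHandler) => Unit,
// which the curried `pipeline` method above matches after eta-expansion.
val cfg: NettyConfig = NettyConfig.default.copy(initPipeline = pipeline _)
```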

Request examples

An uncompressed request works fine:

curl -v --location 'http://0.0.0.0:9000/endpoint' --header 'Content-Type: application/json' --data '{}'
*   Trying 0.0.0.0:9000...
* Connected to 0.0.0.0 (127.0.0.1) port 9000
> POST /endpoint HTTP/1.1
> Host: 0.0.0.0:9000
> User-Agent: curl/8.5.0
> Accept: */*
> Content-Type: application/json
> Content-Length: 2
> 
< HTTP/1.1 400 Bad Request
< server: tapir/1.10.15
< Error-Message: Error decoding request: JsonDecodeException(List(JsonError(Missing required field,List(FieldName(id,id))), JsonError(Missing required field,List(FieldName(imp,imp)))),io.circe.Errors))
< content-length: 0
< 
* Connection #0 to host 0.0.0.0 left intact

A compressed request hits the timeout:

curl -v --location 'http://0.0.0.0:9000/endpoint' --header 'Content-Type: application/json' -H "Content-Encoding: gzip" --data-binary @exmpl.gz
*   Trying 0.0.0.0:9000...
* Connected to 0.0.0.0 (127.0.0.1) port 9000
> POST /endpoint HTTP/1.1
> Host: 0.0.0.0:9000
> User-Agent: curl/8.5.0
> Accept: */*
> Content-Type: application/json
> Content-Encoding: gzip
> Content-Length: 2097
> 
< HTTP/1.1 503 Service Unavailable
< content-length: 0
< connection: close
< 
* Closing connection
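For reference, the compressed body can be reproduced like this (the actual contents of my `exmpl.gz` are a larger JSON payload; here a minimal `{}` stands in for it):

```shell
# Hypothetical stand-in payload: gzip a small JSON body into exmpl.gz.
printf '{}' | gzip > exmpl.gz
# Sanity check: the file round-trips back to the original JSON.
gunzip -c exmpl.gz
```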

I was able to trace with a debugger that the proper branch of io.netty.handler.codec.http.HttpContentDecompressor#newContentDecoder is hit, but further processing gets lost inside the Netty event loop (due to my lack of knowledge of Netty internals).

So the question is: is it possible to use a Netty pipeline decoder for compressed requests, or should I rather write an interceptor to decode them?
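If the interceptor route were needed, its core would just be byte-level gunzipping of the request body before the JSON codec sees it. A minimal sketch of that step, using only the JDK (the tapir interceptor wiring itself is omitted, and `gunzip` is a hypothetical helper name):

```scala
import java.io.ByteArrayInputStream
import java.util.zip.GZIPInputStream

// Decompress a gzip-encoded request body; this is the transformation
// an interceptor would apply when Content-Encoding: gzip is present.
def gunzip(body: Array[Byte]): Array[Byte] = {
  val in = new GZIPInputStream(new ByteArrayInputStream(body))
  try in.readAllBytes() finally in.close()
}
```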

:sweat_smile: The hang was actually happening in my own endpoint logic due to the local deployment. I confirm the pipeline approach above works well. :sparkling_heart: Tapir