All convolutions inside a dense block are ReLU-activated and use batch normalization. Channel-wise concatenation is only possible if the height and width dimensions of the feature maps stay unchanged, so every convolution within a dense block has stride 1. Pooling layers are instead inserted between dense blocks, so downsampling happens where no concatenation is required.
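To make this concrete, here is a minimal sketch in PyTorch. The names `DenseLayer`, `DenseBlock`, and `growth_rate` are illustrative, not from a specific library; the point is that stride-1, padded convolutions preserve spatial dimensions so their outputs can be concatenated along the channel axis, while pooling is applied only between blocks.

```python
import torch
import torch.nn as nn

class DenseLayer(nn.Module):
    """One BN -> ReLU -> 3x3 conv (stride 1) step inside a dense block."""
    def __init__(self, in_channels, growth_rate):
        super().__init__()
        self.bn = nn.BatchNorm2d(in_channels)
        self.conv = nn.Conv2d(in_channels, growth_rate,
                              kernel_size=3, stride=1, padding=1, bias=False)

    def forward(self, x):
        # Stride 1 with padding 1 keeps H and W unchanged, so the output
        # can be concatenated with the input along the channel dimension.
        out = self.conv(torch.relu(self.bn(x)))
        return torch.cat([x, out], dim=1)

class DenseBlock(nn.Module):
    """A stack of dense layers; channel count grows by growth_rate per layer."""
    def __init__(self, num_layers, in_channels, growth_rate):
        super().__init__()
        self.block = nn.Sequential(*[
            DenseLayer(in_channels + i * growth_rate, growth_rate)
            for i in range(num_layers)
        ])

    def forward(self, x):
        return self.block(x)

# Pooling sits between dense blocks, where it is free to halve H and W.
block = DenseBlock(num_layers=4, in_channels=16, growth_rate=12)
pool = nn.AvgPool2d(kernel_size=2, stride=2)

x = torch.randn(1, 16, 32, 32)
y = pool(block(x))
print(y.shape)  # torch.Size([1, 64, 16, 16]): channels 16 + 4*12, spatial halved
```

Note how the spatial size is untouched inside the block and only changes at the pooling step between blocks, which is exactly why stride-1 convolutions are required within a dense block.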