All convolutions within a dense block are ReLU-activated and use batch normalization. Channel-wise concatenation is only possible if the height and width dimensions of the data remain unchanged, so convolutions within a dense block all use stride 1. Pooling layers are inserted between dense blocks to reduce the spatial dimensions.
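
The sketch below illustrates these three points in PyTorch: a BN-ReLU-conv unit whose stride-1, padding-1 convolution preserves height and width so its output can be concatenated channel-wise with its input, and a pooling stage between blocks that halves the spatial dimensions. The class names, the growth rate of 12, and the plain average-pooling transition are illustrative assumptions, not a specific published configuration (DenseNet's transition layers also include a 1x1 convolution, omitted here for brevity):

```python
import torch
import torch.nn as nn

class DenseLayer(nn.Module):
    """One BN -> ReLU -> 3x3 conv unit; stride 1 and padding 1 keep H and W fixed."""
    def __init__(self, in_channels, growth_rate):
        super().__init__()
        self.bn = nn.BatchNorm2d(in_channels)
        self.conv = nn.Conv2d(in_channels, growth_rate,
                              kernel_size=3, stride=1, padding=1)

    def forward(self, x):
        out = self.conv(torch.relu(self.bn(x)))
        # Channel-wise concatenation: valid only because H and W are unchanged.
        return torch.cat([x, out], dim=1)

class DenseBlock(nn.Module):
    """Stack of dense layers; each layer sees all preceding feature maps."""
    def __init__(self, num_layers, in_channels, growth_rate):
        super().__init__()
        self.block = nn.Sequential(*[
            DenseLayer(in_channels + i * growth_rate, growth_rate)
            for i in range(num_layers)
        ])

    def forward(self, x):
        return self.block(x)

# Pooling between dense blocks halves the spatial dimensions.
transition = nn.AvgPool2d(kernel_size=2, stride=2)

x = torch.randn(1, 64, 32, 32)
block = DenseBlock(num_layers=4, in_channels=64, growth_rate=12)
y = block(x)       # channels grow to 64 + 4 * 12 = 112; H and W stay 32 x 32
z = transition(y)  # H and W halved to 16 x 16 between blocks
```

Note how the channel count grows linearly with depth inside the block while the spatial resolution never changes; only the pooling stage between blocks shrinks the feature maps.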