Logging one value per epoch?

After reading the documentation and following the examples, I can't find a way to log just one value per epoch. This is maddening, because when you're iterating on a model architecture the batch size changes all the time, so the number of steps per epoch differs between runs and per-step curves aren't comparable.
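To be concrete, this is roughly what I'm trying to achieve: accumulate a metric over the epoch and emit exactly one scalar when the epoch ends. The hook and attribute names below are just my reading of the docs, not code I'm claiming works:

```python
import torch
import pytorch_lightning as pl


class LitModel(pl.LightningModule):
    """Toy module: I want exactly one logged value per training epoch."""

    def __init__(self):
        super().__init__()
        self.layer = torch.nn.Linear(32, 1)
        self._epoch_losses = []  # accumulate per-step losses myself

    def training_step(self, batch, batch_idx):
        x, y = batch
        loss = torch.nn.functional.mse_loss(self.layer(x), y)
        self._epoch_losses.append(loss.detach())
        return loss

    def on_train_epoch_end(self):
        # Reduce the step losses to a single number for the epoch.
        epoch_loss = torch.stack(self._epoch_losses).mean()
        self._epoch_losses.clear()
        # This is the one scalar per epoch I'd like to see in TensorBoard,
        # indexed by epoch rather than by global step.
        self.log("train_loss_epoch", epoch_loss)

    def configure_optimizers(self):
        return torch.optim.SGD(self.parameters(), lr=1e-2)
```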

It appears that many people have been complaining about this since 2020. So is there a sensible way to handle epoch-level logging with LightningModule?

I've also tried using TensorBoard's SummaryWriter manually, but nothing gets written to the event files. I suspect something is going on with scope or threading, because the same calls work perfectly every other time I've used them outside of Lightning.
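Here is a stripped-down version of the manual SummaryWriter approach that produces no output for me (the log dir, tag, and dummy value are placeholders, not my actual code):

```python
import torch
import pytorch_lightning as pl
from torch.utils.tensorboard import SummaryWriter


class ManualLoggingModel(pl.LightningModule):
    def __init__(self):
        super().__init__()
        self.layer = torch.nn.Linear(32, 1)
        # Bypass Lightning's logger entirely and hold a plain SummaryWriter.
        self.writer = SummaryWriter(log_dir="runs/manual")  # placeholder path

    def training_step(self, batch, batch_idx):
        x, y = batch
        return torch.nn.functional.mse_loss(self.layer(x), y)

    def on_train_epoch_end(self):
        # Write one scalar per epoch directly to the event file.
        # The dummy value is just to check whether anything gets written at all.
        self.writer.add_scalar("train/epoch_metric", 0.123, self.current_epoch)
        self.writer.flush()

    def configure_optimizers(self):
        return torch.optim.SGD(self.parameters(), lr=1e-2)
```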

So is there any way to use TensorBoard from a LightningModule without going through the broken built-in logging?