I suppose that would depend entirely on how you define "optimal." Optimal randomness is 128 bits from the best random source available and 0 bits from anything else, like time. Optimal in the sense of "just random enough for my use case, but no more, so I can fit other information into the value" depends entirely on the requirements of that use case (more specifically, not just "for databases" but "for my database to... on the hardware... in which the access is... on the presumed growth..." and so on). If all you want is a "good enough" generic value, 12 bits is probably as reasonable a choice as any.
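For concreteness, here is a minimal sketch (my own illustration, not anything from the original discussion) of that bit-budget tradeoff when building a 128-bit identifier: every bit spent on a timestamp is a bit taken away from the random source. The `make_id` helper and its parameters are hypothetical, and it ignores UUID version/variant fields entirely.

```python
import os
import time

def make_id(random_bits: int = 80, total_bits: int = 128) -> int:
    """Build a total_bits-wide ID: millisecond timestamp in the high bits,
    random_bits of randomness in the low bits.

    random_bits == total_bits is the fully random case; smaller values
    trade randomness for time-ordering and other embedded information.
    """
    time_bits = total_bits - random_bits
    ts = int(time.time() * 1000)          # Unix time in milliseconds
    ts &= (1 << time_bits) - 1            # keep only the bits that fit
    rand = int.from_bytes(os.urandom((random_bits + 7) // 8), "big")
    rand &= (1 << random_bits) - 1
    return (ts << random_bits) | rand

# "Optimal randomness": all 128 bits from the random source, none from time.
print(f"{make_id(random_bits=128):032x}")
# A time-prefixed split: 48 bits of timestamp, 80 bits of randomness.
print(f"{make_id(random_bits=80):032x}")
```

The point is only that `random_bits` is a dial, and where to set it depends on the specifics listed above, not on "databases" in general.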