Sorry for nitpicking, but that's actually a misconception: the GPU doesn't generate intermediate vertices as a result of normal maps! All of that happens in the fragment shader, which only changes how light behaves at the fragments being rendered. Look closely (especially at silhouettes) and you'll see the flat geometry give the illusion away.
A vertex shader could displace existing geometry using a displacement map, but it can't emit new vertices; my understanding is that in practice this effect is also usually faked via fragment shader tricks!
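To make the point above concrete, here's a minimal numeric sketch (plain Python standing in for shader code, with made-up sample values) of what a fragment shader does with a normal map: it replaces the geometric normal with one decoded from a texture before computing lighting, while the vertex positions are never touched.

```python
import math

def normalize(v):
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

# Geometric surface normal of a perfectly flat fragment (straight up).
geometric_normal = (0.0, 0.0, 1.0)

# A texel sampled from a normal map, decoded from [0, 1] to [-1, 1].
# (Illustrative value: this texel "tilts" the normal sideways.)
texel = (0.7, 0.5, 0.9)
perturbed_normal = normalize(tuple(2.0 * c - 1.0 for c in texel))

light_dir = normalize((0.3, 0.4, 1.0))

# Lambert diffuse term with and without the normal map.
flat_shading = max(dot(geometric_normal, light_dir), 0.0)
bumped_shading = max(dot(perturbed_normal, light_dir), 0.0)

# Only the shading changed -- the geometry is still flat.
print(flat_shading, bumped_shading)
```

The shading results differ even though no vertex moved, which is exactly why the trick falls apart at silhouettes: the lighting suggests relief that the geometry doesn't have.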
Perhaps you mean the artist's highly detailed mesh used to bake the normal maps, but those aren't typically shipped to the user's machine, to save space; they're just an artifact of the artist's production process.
As the sibling comment points out, GP is clearly talking about the source mesh from which the normal map is generated. But the claim that there is never a higher-detail intermediate mesh during rendering is also wrong, since the engine could use tessellation or geometry shaders to do just that. Doing it all in the fragment shader means you either need to limit yourself to small displacements, accept visibly wrong rendering at oblique viewing angles, or add a lot of complexity to the fragment shader to simulate self-occlusion via raymarching, and even then still end up with visibly wrong rendering in some cases.
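The oblique-angle failure mentioned above is easy to see in the classic single-step parallax formula, where the texture lookup is shifted along the tangent-space view direction. A small numeric sketch (plain Python, made-up values):

```python
def parallax_offset(uv, view_dir, height, scale=0.05):
    """Classic single-step parallax mapping: shift the texture coordinate
    along the tangent-space view direction, proportional to the sampled
    height. view_dir is (x, y, z) in tangent space, z pointing off-surface."""
    vx, vy, vz = view_dir
    return (uv[0] - vx / vz * height * scale,
            uv[1] - vy / vz * height * scale)

# Near head-on view: vz is close to 1, so the shift stays tiny and believable.
near = parallax_offset((0.5, 0.5), (0.1, 0.0, 0.99), height=1.0)

# Grazing angle: vz is tiny, so dividing by it makes the offset explode --
# the visibly wrong rendering at oblique viewing angles described above.
grazing = parallax_offset((0.5, 0.5), (0.9, 0.0, 0.05), height=1.0)

print(near, grazing)
```

This is why real implementations either clamp the effect to small displacements or step along the view ray in a loop (raymarching the height field), trading shader complexity for correctness.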
The CSG approaches are somewhat rarer; the tools for working with meshes are just so much better. When I’ve seen people use CSG these days, it’s often as a tool to create a mesh.