This is a bit of a stretch of how you are defining sub-pages. It is a single page with content calculated from the URL. I could just echo URL parameters back to the screen and claim infinite sub-pages, if that is how we define things. So no - what you have is dynamic content.
Which is why I'd answer your question by recommending that you focus on the bots, not your content. What are they? How often do they hit the page? How deep do they crawl? Which ones respect robots.txt, and which do not?
Go create some bot-focused data. See if there is anything interesting in there.
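For instance, a short stdlib-only sketch along these lines would get you started. (The log path and combined-log format are assumptions; adjust the regex to whatever your server actually writes.)

```python
# A minimal sketch, assuming a combined-format access log at a
# hypothetical path (access.log). Per user agent, it counts hits,
# distinct URLs crawled (a rough depth proxy), and whether the bot
# ever fetched /robots.txt.
import re
from collections import defaultdict

# Combined Log Format:
# ip - - [time] "METHOD path HTTP/x" status size "referer" "user-agent"
LINE = re.compile(
    r'\S+ \S+ \S+ \[[^\]]+\] "(?:GET|POST|HEAD) (\S+)[^"]*" '
    r'\d+ \S+ "[^"]*" "([^"]*)"'
)

stats = defaultdict(lambda: {"hits": 0, "paths": set(), "robots": False})

with open("access.log") as log:  # hypothetical log path
    for line in log:
        m = LINE.match(line)
        if not m:
            continue
        path, agent = m.groups()
        if "bot" not in agent.lower():  # crude bot filter; refine as needed
            continue
        s = stats[agent]
        s["hits"] += 1
        s["paths"].add(path)
        if path == "/robots.txt":
            s["robots"] = True

for agent, s in sorted(stats.items(), key=lambda kv: -kv[1]["hits"]):
    print(f'{s["hits"]:>8} hits, {len(s["paths"]):>7} distinct URLs, '
          f'robots.txt={"yes" if s["robots"] else "no"}  {agent}')
```

A bot with thousands of hits, thousands of distinct URLs, and no robots.txt fetch is the kind of thing worth knowing about.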
FWIW, a billion static pages and a single script with a URL rewrite that makes it look like a billion static pages are effectively equivalent once a cache gets involved.
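To make that concrete, here is a minimal stdlib-only sketch (the port and cache lifetime are arbitrary): one handler synthesizes a page for any path and marks it cacheable, so a CDN or reverse proxy sitting in front of it sees something indistinguishable from a pile of static files.

```python
import html
from wsgiref.simple_server import make_server

def app(environ, start_response):
    # Every path maps to the same handler; the "page" is computed
    # from the URL, not read from disk.
    path = environ.get("PATH_INFO", "/")
    body = f"<h1>Page for {html.escape(path)}</h1>".encode()
    start_response("200 OK", [
        ("Content-Type", "text/html; charset=utf-8"),
        ("Content-Length", str(len(body))),
        # The Cache-Control header is what lets a downstream cache
        # serve repeat requests without ever hitting this script.
        ("Cache-Control", "public, max-age=86400"),
    ])
    return [body]

if __name__ == "__main__":
    make_server("", 8000, app).serve_forever()  # port 8000 is arbitrary
```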