I think a lot of the exponential phases like Moore's law have more to do with a positive feedback loop of people seeing a particular field become lucrative and in turn directing more money and effort toward it, than with the tech directly bootstrapping its own advances.

Like, AI has gone exponential or even hyperexponential on various metrics (resolution of generated images or LLM context lengths, for example), but that's largely because we went from models you could train on a single GPU in a week to investing millions of dollars in a single experiment. There have been some attempts at bootstrapping, using AI to improve training, hyperparameter search, or architecture design, but they haven't really caught on.

So I suppose the ending-or-not question is less about whether we're hitting the top of a particular sigmoid, and more about what determines the rate at which new sigmoids get started...
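
For what it's worth, here's a toy sketch of that stacked-sigmoids picture (all constants made up for illustration): each new S-curve starts a fixed interval after the previous one and saturates at a multiple of the previous ceiling. The envelope looks exponential for exactly as long as new curves keep being started, and flattens once they stop.

```python
import math

def logistic(x):
    # Standard logistic function, saturating at 1.
    return 1.0 / (1.0 + math.exp(-x))

def capability(t, n_sigmoids, ratio=2.0, spacing=5.0, tau=1.0):
    """Sum of n_sigmoids S-curves: the k-th starts at k*spacing and
    contributes ratio**k at saturation. While new curves keep starting,
    the total grows roughly like ratio**(t/spacing), i.e. exponentially;
    once no new curves start, it plateaus at the sum of the ceilings."""
    return sum(ratio**k * logistic((t - k * spacing) / tau)
               for k in range(n_sigmoids))

for t in range(0, 61, 10):
    # With 8 S-curves, growth looks exponential until roughly t=40,
    # then flattens near 2**8 - 1 = 255 once the last curve saturates.
    print(f"t={t:2d}  capability={capability(t, n_sigmoids=8):8.1f}")
```

The point of the sketch is that "exponential" here is a property of the spacing and sizing of successive sigmoids, not of any single one, which is why the interesting question is what sets the rate of new curves appearing.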