The central issue is the high cost of training the models, it seems:
> "Once it has finally released, it usually remains stagnant in terms of having its knowledge updated. This creates an AI knowledge gap. A period between the present and AI’s training cutoff... The cutoff means that models are strictly limited in knowledge up to a certain point. For instance, Anthropic’s latest models have a cutoff of April 2024, and OpenAI’s latest models have cutoffs of late 2023."
Hasn't DeepSeek's novel training methodology changed all that? If the energy and financial cost of training a model really has dropped drastically, then frequent retraining that incorporates new data should become the norm.
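For a rough sense of the stakes, here is a back-of-envelope sketch using the common ~6 × N × D FLOPs heuristic for dense-transformer training compute. Every number in it (parameter count, token count, effective FLOPs per dollar) is an assumption picked for illustration, not a published figure for any particular model:

```python
def training_cost_usd(params: float, tokens: float, flops_per_dollar: float) -> float:
    """Estimate training cost via the heuristic: compute ~= 6 * N * D FLOPs."""
    total_flops = 6 * params * tokens
    return total_flops / flops_per_dollar

# Hypothetical inputs: a 70B-parameter model, 15T training tokens, and an
# effective ~1e17 FLOPs per dollar of GPU time (price and utilization folded in).
cost = training_cost_usd(params=70e9, tokens=15e12, flops_per_dollar=1e17)
print(f"~${cost:,.0f}")  # ~$63,000,000 under these assumed numbers
```

Under those assumed numbers a full retrain lands in the tens of millions of dollars, which is why any genuine drop in training cost would directly change how often a lab can afford to refresh a model's knowledge.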
> Hasn't DeepSeek's novel training methodology changed all that? If the energy and financial cost of training a model really has dropped drastically, then frequent retraining that incorporates new data should become the norm.
Even if training gets way cheaper, or even if it stays as expensive but more money gets thrown at it, won't you still run into the problem of having little or no new data to train on?
> "Once it has finally released, it usually remains stagnant in terms of having its knowledge updated. This creates an AI knowledge gap. A period between the present and AI’s training cutoff... The cutoff means that models are strictly limited in knowledge up to a certain point. For instance, Anthropic’s latest models have a cutoff of April 2024, and OpenAI’s latest models have cutoffs of late 2023."
Hasn't DeepSeek's novel training methodology changed all that? If the energy and financial cost for training a model really has drastically dropped, then frequent retraining including new data should become the norm.