Let's rename "Deep Learning" to "Statistical Compression"
The previous name for what is now called “machine learning” was predictive statistics. But a better name for deep learning would be “machine intuition,” since it seems to work pretty well but can’t be readily explained.
It works "pretty well" in well-defined, complete-information domains like Go.
The training in that case is really about iteratively improving the dataset from which higher-quality patterns are extracted.
1. We have a NN that doesn't know how to play and makes random moves.
2. A blank-slate dataset is filled from random games.
3. The NN learns to produce better games by weighting wins vs. losses, making losing moves less likely and winning moves more likely.
4. The NN produces a slightly better dataset.
5. Repeat from #3 until you have covered a large chunk of Go openings and have a pretty good estimate of move quality.
6. The NN becomes the condensed statistical pattern data extracted from billions of games, able to estimate the value of a move more accurately than humans.
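The loop above can be sketched in a few lines. This is a deliberately toy stand-in, not AlphaGo: the "game" is five binary moves (the player wins by playing move 1 a majority of the time), and `policy` is a single probability standing in for the NN. All names and the game itself are invented for illustration; only the shape of the loop (random start, self-play dataset, reweight toward winning moves, repeat) mirrors the steps above.

```python
import random

GAME_LEN = 5  # toy "game": five binary moves; win = majority of 1s

def play_game(policy, rng):
    """Sample one game from the current stochastic policy."""
    moves = [1 if rng.random() < policy else 0 for _ in range(GAME_LEN)]
    return moves, sum(moves) > GAME_LEN // 2  # (moves, won?)

def train(generations=30, games_per_gen=200, lr=0.2, seed=0):
    rng = random.Random(seed)
    policy = 0.5                                 # step 1: blank slate, random moves
    for _ in range(generations):                 # step 5: repeat
        # steps 2/4: build a dataset from the current policy's games
        games = [play_game(policy, rng) for _ in range(games_per_gen)]
        win_moves = [m for moves, won in games if won for m in moves]
        loss_moves = [m for moves, won in games if not won for m in moves]
        # step 3: pull the policy toward moves from winning games
        # and away from moves from losing games
        if win_moves:
            policy += lr * (sum(win_moves) / len(win_moves) - policy)
        if loss_moves:
            policy -= lr * (sum(loss_moves) / len(loss_moves) - policy)
        policy = min(max(policy, 0.01), 0.99)    # keep it a valid probability
    return policy
```

Run it and the policy climbs from 0.5 toward playing the winning move almost always, which is the whole point of step 5: each generation's dataset is slightly better than the last, so the patterns extracted from it are too.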
Still, in some rare cases this "condensed pattern database" will make mistakes, due to gaps in the data or spurious correlations the NN hallucinated from its connections.
works "pretty well" in well-defined complete-information domains like Go
But no one can definitively say “my AI credit-scoring system didn’t decline his mortgage because he is black,” and that’s the explainability bar that needs to be passed.
Right now the state of the art is “gut feel” with an unknown amount of unconscious bias.
We are a long, long way from cracking open a neural net cat classifier and pointing at the exact equation for the whisker coefficient or the nose cuteness factor.