Very interesting point that while we've figured out how to digitize images, text, and sound, we haven't digitized touch. At best we can describe in words what a touch sensation was like. Smell is in a similar situation: we haven't digitized it at all.
Touch is a 2D field of 3D force vectors: each patch of skin reports shear in two directions plus normal pressure. That's easily stored and transmitted as an image, and easily processed by neural nets. You could add temperature and pain/damage channels if you want, though they don't seem essential for most manipulation tasks. (Actually, I don't believe touch is as essential as he argues anyway. Of course someone who learned a task with touch will struggle without it, but they can still do it, and would quickly change strategies and improve.)
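To make "it's just an image" concrete, here's a minimal sketch of that representation. Everything specific in it is an assumption for illustration (the 16x16 grid, the channel layout, the fake fingertip press), not any real sensor's spec:

```python
import numpy as np

# A tactile "frame": a 2D grid of taxels, each reporting a 3D force
# vector (x/y shear plus normal pressure). That's an H x W x 3 array,
# the same layout as an RGB image, so any image pipeline or conv net
# can ingest it as-is.
H, W = 16, 16                      # taxel grid resolution (assumed)
frame = np.zeros((H, W, 3), dtype=np.float32)

# Fake a fingertip press: normal pressure peaks at the contact center,
# with shear pointing radially outward from it.
cy, cx = H // 2, W // 2
ys, xs = np.mgrid[0:H, 0:W]
r2 = (ys - cy) ** 2 + (xs - cx) ** 2
frame[..., 2] = np.exp(-r2 / 8.0)                  # normal pressure
frame[..., 0] = (xs - cx) * frame[..., 2] * 0.1    # x-shear
frame[..., 1] = (ys - cy) * frame[..., 2] * 0.1    # y-shear

# The optional extra channels (temperature, damage) just stack on:
temp = np.full((H, W, 1), 22.0, dtype=np.float32)
damage = np.zeros((H, W, 1), dtype=np.float32)
frame5 = np.concatenate([frame, temp, damage], axis=-1)

print(frame.shape, frame5.shape)   # (16, 16, 3) (16, 16, 5)
```

A time series of touch is then just a video with different channel semantics, which is why the representation side seems basically solved.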
The problem with touch is making sensors that are cheap, durable, light, thin, repairable, sensitive, and shape-conforming. Representation is trivial in comparison.
I'm not sure describing it in words is very helpful, and there's probably a good amount of such data available already.
I would think the way to do it is build the touch sensors first (and it seems they're getting pretty close), then tele-operate some robots and collect a ton of data. Either that, or put gloves on humans that can record: pay people to live their normal lives, but with the gloves on.
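The logging side of that glove idea could be as simple as the sketch below. `read_glove_frame` is a hypothetical stand-in for whatever driver real hardware would ship with, and the rates and shapes are made up:

```python
import time
import numpy as np

def read_glove_frame():
    """Hypothetical stand-in for the glove's driver; a real sensor
    SDK would provide the equivalent call."""
    return np.random.rand(16, 16, 3).astype(np.float32)

# Record timestamped tactile frames at ~30 Hz into one file per
# session. Pairing these streams with video/pose from the same rig is
# what would turn "people living their lives in gloves" into a
# training dataset.
frames, stamps = [], []
t_end = time.time() + 2.0          # 2-second demo session
while time.time() < t_end:
    frames.append(read_glove_frame())
    stamps.append(time.time())
    time.sleep(1 / 30)

np.savez_compressed("session_000.npz",
                    frames=np.stack(frames),   # (N, 16, 16, 3)
                    stamps=np.array(stamps))
print(f"recorded {len(frames)} frames")
```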