

Think about where AI was 10 years ago. Cutting edge AI could accept a video and put bounding boxes around a predefined set of objects it was able to recognize (see the You Only Look Once paper, 2015). That was about it.
10 years before that, cutting edge AI was maybe digit recognition. I’m not sure.
Your own understanding of history is incorrect: https://en.wikipedia.org/wiki/Timeline_of_artificial_intelligence?wprov=sfla1
I have no comment on this vid or the scishow, but your comment is ignorant of the reality of progress in AI-related fields. Having computers “learn” has been a thing for much longer than 20 years.

Zuck is at least a close runner-up.