A few items of interest that stood out to me:
- The rise of transfer learning, where a model trained on one dataset for one task can be adapted to a new and different task with far less training data and compute.
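To make the transfer-learning idea concrete, here is a minimal sketch (all names, weights, and data below are made up for illustration): a "pretrained" feature extractor is kept frozen, and only a small head is trained on the new task.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for a pretrained backbone: a fixed projection + ReLU.
# In a real setting these weights would come from a model trained
# on the original task; here they are random and frozen.
W_backbone = rng.normal(size=(20, 32))  # frozen, never updated

def extract_features(X):
    """Frozen forward pass through the 'pretrained' backbone."""
    return np.maximum(X @ W_backbone, 0.0)

# A small labeled dataset for the new, different task.
X_new = rng.normal(size=(200, 20))
y_new = (X_new[:, 0] + X_new[:, 1] > 0).astype(float)

# Only this lightweight head is trained -- far fewer parameters
# (and far less training) than fitting the whole model from scratch.
w_head = np.zeros(32)
b_head = 0.0
lr = 0.1
F = extract_features(X_new)
for _ in range(300):
    p = 1.0 / (1.0 + np.exp(-(F @ w_head + b_head)))  # sigmoid output
    grad = p - y_new                                  # logistic-loss gradient
    w_head -= lr * (F.T @ grad) / len(y_new)
    b_head -= lr * grad.mean()

accuracy = float(((p > 0.5) == y_new).mean())
```

The key point the slide makes is visible in the sketch: the expensive part (the backbone) is reused as-is, and only a tiny head needs to be fit to the new data.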
- That AI progress right now seems to be driven as much by newer and better hardware (GPUs and TPUs) as by anything else. Hardware is the limiting component for deep learning today, with GPUs best suited for offline training of AI models, necessitating large CapEx investments in expensive hardware over the coming years for tech companies that want to remain competitive.
- AI has become a service offered by the big cloud providers.
- Some great case studies of AI being used today to disrupt industries including pharma, enterprise automation, transportation, and cybersecurity.
- One of their last slides lists their big AI predictions for the next 12 months - I will be genuinely curious to see how many they get right.