The Future of Data Science – Mining GTC 2021 for Trends

Deep learning enthusiasts are increasingly putting NVIDIA’s GTC at the top of their gotta-be-there conference list. I enjoyed mining this year’s talks for trends that foreshadow where our industry is headed. Three of them were particularly compelling and inspired a new point of view on transfer learning that I feel is important for analytical practitioners and leaders to understand. Let’s start with the themes.

Theme 1 summary: APIs will get better at transferring model components from one application to another and at transferring pipelines into production.

Theme 2: A deep net architecture race is underway that may make deep nets look quite different 10 years from now

The two novel deep net architectures reviewed below both borrow heavily from the concepts behind transformers and BERT-like models, which lend themselves so well to transfer learning; however, they do so in a way that generalizes to other applications such as computer vision. I see a future of deep learning in which transfer learning is king, in large part because mega-model architectures are going to get really good at capturing large amounts of information about domains like language and vision.

If we can crack the nut of enabling a wider workforce to build AI solutions, we can start to realize the promise of data science. Transferring knowledge between data scientists and data experts (in both directions) is critical and may soon lend itself to a new view of citizen data science.
