I attended ODSC (Open Data Science Conference) West 2020 at the end of last month, where I also presented a 3-hour Keras tutorial, Keras from Soup to Nuts -- an example driven tutorial. Like other conferences this year, the event was all-virtual. Having attended one other all-virtual conference this year (Knowledge Discovery and Data Mining (KDD) 2020) and been part of organizing another (an in-house conference), I can appreciate how much work it takes to pull one off. As with those conferences, I continue to be impressed at how effortless it all appears from the point of view of both speaker and attendee, so kudos to the ODSC organizers and volunteers for a job well done!
In this post, I want to cover my general impressions of the conference for readers of this blog. The content seems similar to PyData, except that not all the talks here are Python (or Julia or R) related. As with PyData, the content is mostly targeted at data scientists in industry, with a few more academic talks based on the presenter's own research. There is also more coverage of the career-related aspects of Data Science than at PyData. I also thought there was more content here than at a typical PyData conference -- the conference was 4 days long and multi-track, with both workshops and presentations. The variety of content feels a bit like KDD, but with less academic rigor. Overall, the content is high quality, and if you enjoy attending PyData conferences, you will find more than enough talks and workshops here to hold your interest for the duration of the conference.
Pricing is also a bit steep compared to KDD and PyData, although deep discounts seem to be available if you qualify; you have to contact the organizers for details. Fortunately, I didn't have to worry about that, since my ticket was complimentary as a presenter.
Like KDD, and unlike PyData, ODSC does not share talk recordings with the public after the conference. Speakers sometimes share their slides and github repositories, so hopefully you will find these resources for the talks I list below. Because my internal conference (the one I helped organize) was scheduled for the very next week, I could not spend as much time at ODSC as I would have liked, so there were many talks I wanted to attend but missed. Here is the full schedule (until the link is repurposed for the 2021 conference).
- Evaluating and Testing Natural Language Processing Models -- interesting talk about generating adversarial examples to test Natural Language Processing (NLP) models and gain insight into their inner workings. The presenter is a co-author of the paper Generating Natural Adversarial Examples.
- Natural Language Processing with Pytorch -- NLP tutorial aimed mostly at beginners, covers a lot of traditional techniques to establish good baselines, followed by some Pytorch examples on Sentiment Analysis and Summarization. This was a 3 hour tutorial (like the one I did), and had labs. Here is the github repository for the labs.
- Learning with Limited Labels -- I was interested in this from an NLP point of view, but the techniques covered were almost exclusively for Computer Vision (CV) problems. I enjoyed the talk though; the techniques covered seemed genuinely useful for CV problems.
- Transfer Learning in NLP -- fairly high level survey of transfer learning in NLP, before and after BERT (and other transformers). If you haven't been following the field closely, this is a good refresher.
- Accelerating NLP Model Training and Deployment with Pytorch -- covers, among other things, how to export and deploy trained transformer based models using the ONNX format. I thought this was quite interesting (see the ONNX export sketch after this list).
- Building Content Embeddings with Self Supervised Learning -- the presenters describe building vectors representing tweets based on random walks through the Twitter follower graph. These vectors can then be used to recommend content from one Twitter user to another (see the random walk embedding sketch after this list).
- A Comparison of Topic Modeling Methods in Python -- good notebook-based talk on applying different topic modeling techniques (K-Means, LDA, and NMF) to the same dataset, and using different evaluation metrics to compare their quality (see the topic modeling sketch after this list).
- Machine Learning in Biology and Medicine -- good presentation with examples of applying ML techniques to genomics and Electronic Medical Records.
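I have not seen the ONNX talk's actual code, but the general export-and-deploy workflow it describes looks roughly like the sketch below: export a (here, not fine-tuned) Huggingface classifier with torch.onnx.export and run it with ONNX Runtime. The model name, file name, and tensor names are my own assumptions for illustration, not taken from the talk.

```python
# Sketch only: export a Huggingface sequence classifier to ONNX and run it
# with ONNX Runtime. Assumes torch, transformers, and onnxruntime are
# installed; model and tensor names are illustrative, not from the ODSC talk.
import numpy as np
import onnxruntime as ort
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_name = "distilbert-base-uncased"   # stand-in for a fine-tuned model
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(
    model_name, return_dict=False)       # tuple outputs export more cleanly
model.eval()

# dummy input that fixes the graph's input signature
enc = tokenizer("an example sentence", return_tensors="pt")
torch.onnx.export(
    model,
    (enc["input_ids"], enc["attention_mask"]),
    "classifier.onnx",
    input_names=["input_ids", "attention_mask"],
    output_names=["logits"],
    dynamic_axes={"input_ids": {0: "batch", 1: "seq"},
                  "attention_mask": {0: "batch", 1: "seq"},
                  "logits": {0: "batch"}},
    opset_version=12)

# inference with ONNX Runtime, feeding numpy arrays keyed by input name
session = ort.InferenceSession("classifier.onnx")
feeds = {k: v.numpy() for k, v in enc.items()}
logits = session.run(["logits"], feeds)[0]
print(np.argmax(logits, axis=-1))
```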
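The follower-graph embedding idea from the Twitter talk is, as far as I can tell, in the spirit of DeepWalk-style approaches: sample random walks from the graph and feed them to a skip-gram model, so that nodes that co-occur on walks end up with similar vectors. Here is a minimal sketch of that general idea on a toy networkx graph, using gensim 4's Word2Vec -- the graph, walk parameters, and library choices are my assumptions, not the presenters' actual pipeline.

```python
# Minimal DeepWalk-style sketch: random walks on a toy graph plus skip-gram
# embeddings via gensim 4. Not the presenters' pipeline, just the general idea.
import random
import networkx as nx
from gensim.models import Word2Vec

# toy graph standing in for the Twitter follower graph
G = nx.karate_club_graph()

def random_walk(graph, start, length=10):
    walk = [start]
    for _ in range(length - 1):
        neighbors = list(graph.neighbors(walk[-1]))
        if not neighbors:
            break
        walk.append(random.choice(neighbors))
    return [str(node) for node in walk]    # gensim expects string tokens

# sample several walks starting from every node
walks = [random_walk(G, node) for node in G.nodes() for _ in range(20)]

# skip-gram over the walks: nodes that co-occur on walks get similar vectors
model = Word2Vec(sentences=walks, vector_size=64, window=5,
                 min_count=0, sg=1, epochs=5)

print(model.wv["0"][:5])                   # embedding for node 0
print(model.wv.most_similar("0", topn=5))  # its nearest neighbors
```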
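For the topic modeling talk, the scikit-learn versions of the three techniques are easy to try side by side. Below is a small sketch along those lines; the dataset and parameters are mine for illustration, and the talk used its own data, notebooks, and evaluation metrics.

```python
# Sketch: the three topic modeling approaches from the talk (K-Means, LDA, NMF)
# applied to the same corpus with scikit-learn (1.x API). Dataset and
# parameter choices here are illustrative, not from the ODSC notebooks.
from sklearn.datasets import fetch_20newsgroups
from sklearn.feature_extraction.text import CountVectorizer, TfidfVectorizer
from sklearn.decomposition import LatentDirichletAllocation, NMF
from sklearn.cluster import KMeans

docs = fetch_20newsgroups(remove=("headers", "footers", "quotes")).data[:2000]
n_topics, n_top_words = 10, 8

def top_words(components, feature_names):
    # print the highest-weighted words for each topic / cluster
    for idx, topic in enumerate(components):
        words = [feature_names[i] for i in topic.argsort()[-n_top_words:][::-1]]
        print(f"topic {idx}: {' '.join(words)}")

# LDA works on raw term counts
count_vec = CountVectorizer(max_df=0.95, min_df=5, stop_words="english")
counts = count_vec.fit_transform(docs)
lda = LatentDirichletAllocation(n_components=n_topics, random_state=0).fit(counts)
top_words(lda.components_, count_vec.get_feature_names_out())

# NMF and K-Means are usually run on TF-IDF weights
tfidf_vec = TfidfVectorizer(max_df=0.95, min_df=5, stop_words="english")
tfidf = tfidf_vec.fit_transform(docs)

nmf = NMF(n_components=n_topics, random_state=0).fit(tfidf)
top_words(nmf.components_, tfidf_vec.get_feature_names_out())

km = KMeans(n_clusters=n_topics, random_state=0).fit(tfidf)
top_words(km.cluster_centers_, tfidf_vec.get_feature_names_out())
```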
As I mentioned earlier, I also presented a 3-hour tutorial on Keras, so I wanted to cover that in slightly greater detail for readers here as well. As implied by the name and the talk abstract, the tutorial tries to teach participants enough Keras to become advanced Keras programmers, and assumes only some Python programming experience as a pre-requisite. Clearly 3 hours is not enough time for that, so the notebooks are deliberately short on theory and heavy on examples. I organized the tutorial into three 45-minute sessions, with exercises at the end of the first two, although because of time constraints we ended up just walking through the exercise solutions instead.
The tutorial materials are a collection of Colab notebooks available in my sujitpal/keras-tutorial-odsc2020 github repository. The project README provides additional information about what each notebook contains, and each notebook is numbered by session and by sequence within the session. There are two exercise notebooks, exercise 1 and 2, with corresponding solution notebooks titled exercise_1_solved and exercise_2_solved.
Keras started life as an easy-to-use, high-level API over Theano and Tensorflow, but has since been subsumed into Tensorflow 2.x as its default API. I was among those who learned Keras in its first incarnation, when certain things were just impossible to do in Keras, and the only option was to drop down to Tensorflow 1.x's two-step model (create the compute graph, then run it with data). In many cases, Pytorch provided simpler ways to do the same thing, so for complex models I found myself increasingly gravitating towards Pytorch. I did briefly look at Keras (now tf.keras) and Tensorflow 2.0-alpha while co-authoring the Deep Learning with Tensorflow 2 and Keras book, but the software was new and there was not a whole lot of information available at the time.
My point in mentioning all this is to acknowledge that I ended up learning a fair bit of advanced Keras myself while building the last few notebooks. Depending on where you are with Keras, you might find them interesting as well. Some of the more interesting examples covered (in my opinion) are Sequence to Sequence models with and without attention, using transformers from the Huggingface Transformers library inside Keras models, using Cyclic Learning Rates and the LR Finder, and distributed training across multiple GPUs and TPUs. I am actually quite pleasantly surprised at how much more you can do with tf.keras relative to the underlying Tensorflow framework, and I think you will be too (if you aren't already).
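As one small example of the kind of thing I mean -- and this is just a generic sketch of the tf.distribute API rather than a verbatim excerpt from the notebooks -- distributing Keras training across the available GPUs is essentially a matter of building and compiling the model inside a strategy scope:

```python
# Sketch: multi-GPU data-parallel training with tf.keras and MirroredStrategy.
# Generic tf.distribute example, not copied from the tutorial notebooks.
import tensorflow as tf

strategy = tf.distribute.MirroredStrategy()   # one replica per visible GPU
print("replicas:", strategy.num_replicas_in_sync)

with strategy.scope():
    # model and optimizer must be created inside the strategy scope
    model = tf.keras.Sequential([
        tf.keras.layers.Dense(128, activation="relu", input_shape=(784,)),
        tf.keras.layers.Dense(10, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])

# fit() is called as usual; Keras splits each batch across the replicas
# and aggregates the gradients
(x_train, y_train), _ = tf.keras.datasets.mnist.load_data()
x_train = x_train.reshape(-1, 784).astype("float32") / 255.0
model.fit(x_train, y_train, batch_size=256, epochs=2)
```

The same pattern applies to TPUs, with a TPUStrategy swapped in for the MirroredStrategy.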