Find your zen, with newsletter number ten!

On Community

After a calm and productive few weeks at home, conference season is back in full swing for the fall. This week, I’m at the O’Reilly/Intel Nervana Artificial Intelligence Conference in San Francisco. As much as travel can be a grind, I really get a kick out of engaging with the TWIML community in person—both interviewing guests and meeting up with listeners.

Two of the listeners I met up with this time were Xinyu Hong and Richard Shen, the winners of our recent AI Conference ticket giveaway. Xinyu is a student at Yale; Richard just finished up at Dartmouth and is about to start grad school at Cambridge. While we didn’t specifically target students with this contest, it’s great to know the pod was able to give back in this way, and their enthusiasm for machine learning, the conference, and the podcast was amazing! Best of luck, Xinyu and Richard!

Now, as much as I love in-person meetups, our virtual meetups have been great too! For this month’s meetup last Tuesday, meetup member Nikola Kučerová presented Yoshua Bengio’s Learning Long-Term Dependencies with Gradient Descent is Difficult. This is an important paper that explores one of the key challenges in training recurrent neural nets: the vanishing gradient problem, in which the error signal shrinks exponentially as it is backpropagated through time, making long-range dependencies very hard to learn. The paper examines several alternative approaches and ultimately sets the stage for later advances like LSTM and GRU networks. Thanks for presenting, Nikola!
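If you’d like to see the problem in action before watching the replay, here’s a minimal NumPy sketch (my own illustration, not code from the paper): it backpropagates through a small tanh recurrence whose weight matrix is scaled to have largest singular value below one, and prints the gradient norm collapsing toward zero as the number of time steps grows.

```python
# Minimal sketch of the vanishing gradient problem in a simple RNN.
# Illustrative only; sizes and the 0.9 scaling are arbitrary choices.
import numpy as np

rng = np.random.default_rng(0)
T = 50   # time steps to backpropagate through
n = 20   # hidden state size

# Scale the recurrent weights so the largest singular value is 0.9,
# the contractive regime in which gradients decay exponentially.
W = rng.standard_normal((n, n))
W *= 0.9 / np.linalg.svd(W, compute_uv=False)[0]

h = rng.standard_normal(n)
jac = np.eye(n)  # Jacobian of the current hidden state w.r.t. h_0

for t in range(T):
    h = np.tanh(W @ h)
    # One step of backprop through time: dh_{t+1}/dh_t = diag(1 - h^2) @ W
    jac = np.diag(1.0 - h**2) @ W @ jac
    if (t + 1) % 10 == 0:
        print(f"after {t + 1:2d} steps, gradient norm ~ {np.linalg.norm(jac):.2e}")
```

Scale W above 1.0 instead and the same loop demonstrates the mirror-image exploding gradient case the paper also analyzes.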

The replay is posted. Check it out, and join the meetup, at twimlai.com/meetup.

That’s it for now. Be sure to follow me on Twitter (@samcharrington & @twimlai) and Instagram (@twimlai) to keep up with me on my travels.