Happy New Year! I spent this week at CES in Las Vegas, checking out a bunch of exciting new technology. (And a bunch of not-so-exciting technology as well.) I'll be writing a bit more about my experiences at CES on the TWIML blog, but for now I'll simply state the obvious: AI was front and center at this year's show, with many interesting applications spanning smart homes and cities, autonomous vehicles (using the term "vehicle" very broadly), and health and fitness tech.

I focused on making videos this time around, and we'll be adding a bunch from the show to our CES 2019 playlist over on YouTube, so be sure to check that out and subscribe to our channel while you're there.

In other news, we just wrapped up our AI Rewind 2018 series, in which I discussed key trends from 2018 and predictions for 2019 with some of your favorite TWIML guests. This series was a bit of an experiment for us, and we're excited to have received a lot of great feedback on it. If you've had a chance to check it out, I'd love to hear your thoughts.

Cheers,
Sam

P.S. We're always looking for sponsors to support our work with the podcast. If you think your company might benefit from TWIML sponsorship, I'd love your help getting connected to the right people.

P.P.S. I'm planning a visit to the Bay Area for the week of January 21st. I've got a few open slots for briefings and meeting up with listeners. If you're interested in connecting, give me a holler.

To receive this in your inbox, subscribe to our Newsletter.
Happy New Year! I hope you had a great one and that you're as pumped about 2018 as I am. I've enjoyed the opportunity to relax a bit with family over the past few weeks, but it's also been great to jump back into the podcast, my research, and other projects! Before we turn to what's next, a brief reflection on 2017...

Last year was an exciting one for Team TWIML. It's been a pleasure to bring you 75 interviews, meet you at 16 events, learn with you at 4 online meetups, send you tons of TWIML stickers, and field your feedback, questions, and comments on Twitter, Facebook, our website, and our meetup's Slack channel. You helped us become a Top 40 Technology podcast on Apple Podcasts (we've seen it as high as number 33!) and break 1 million plays, two milestones we're really proud of! We couldn't ask for a better community, and our team is committed to bringing you even better content and programs in 2018.

What's Ahead for AI in 2018

Speaking of 2018... I have mixed feelings about predictions posts. They can be fun, but so often they're self-serving. So this isn't one. Rather, it's more of a what's-on-my-radar-in-2018 post. Here are a few of the things you should expect to hear more about from me and on the podcast next year.

AI in the Cloud. Cloud service providers are continuing to build out strong AI offerings. In the process, they make ML/AI more accessible with each passing month. This is why I expect cloud-based ML/AI to continue to grow, and why I'm planning to explore this area in my research and on the podcast in 2018.

Reinforcement Learning. Amassing labeled training data for supervised learning is very expensive and time consuming. This is one of the reasons why reinforcement learning is so interesting: it allows intelligent agents to learn from reward signals in simulated environments, without hand-labeled training data. We've tackled RL quite a bit on the show already, and I expect to dig in even more in 2018.

Meta-Learning. Humans, even toddlers, can adapt intelligently to a wide variety of new, unseen situations. By teaching systems how to learn, meta-learning and related ideas like few-shot, one-shot, and zero-shot learning seek to achieve the same for intelligent agents. This will be important for the same reason as RL.

Capsule Networks. Deep learning luminary Geoffrey Hinton believes CNNs are dead and that capsule networks are the next big thing in AI. CapsNets use what Hinton calls "inverse graphics" and capsules to overcome some of the key challenges of convolutional networks, and they have proven more efficient at identifying overlapping and distorted images. I'm looking forward to digging into CapsNets a bit more this year.

AI Acceleration. While deep learning training is currently dominated by GPUs, a number of new hardware architectures are expected to come online in 2018 and beyond. I'm planning to survey these at some point this year and will be sure to share what I discover.

Algorithmic Fairness & Ethics. This is another topic we've touched on a bit, but not quite enough. As we shift more power to algorithms to make decisions that affect lives, it's incumbent on us as a field to really understand the implications of what we're building and how we're building it.

There's certainly more ground to explore in 2018, but these are a few of the topics that are top of mind for me as we head into the new year. What's on your mind? What do you want to learn more about? What should be on my radar as I plan my research and content calendar for the year?

Sign up for our Newsletter to receive this weekly in your inbox.
Get ready to delve into newsletter number twelve!

De-mist-ifying AI and the Cloud

The other day my friends Aaron Delp and Brian Gracely interviewed me for their cloud computing podcast, The CloudCast. I've been spending a lot of time lately thinking about the multi-faceted relationship between machine learning, AI, and the cloud for some upcoming research, and our discussion provided a nice opportunity to summarize some of my thoughts on the topic. (The podcast should post to their site in a couple of weeks; I'll keep you posted here.)

Most media coverage of cloud computing and artificial intelligence treats these two topics, for the most part, as distinct and independent technology trends. However, the two are, and will remain, inextricably tied to one another. The leading cloud vendors are all investing heavily in AI. Google, Microsoft, Salesforce, and Amazon represent two-thirds of the top six companies with the most AI acquisitions, according to Quid data published by Recode. A number of reasons motivate these firms' investments, not the least of which are projections that AI training and inference workloads are huge potential growth markets for cloud vendors. In a recent investor presentation, NVIDIA projected that deep learning training and inference represent $11 billion and $15 billion markets respectively, both with steep growth curves.

Three types of cloud-based AI offerings

When I think of AI in the cloud, three distinct delivery models come to mind:

AI-optimized infrastructure. These are traditional cloud Infrastructure-as-a-Service (IaaS) offerings that have been optimized for machine learning and AI workloads. One obvious optimization comes in the form of GPU support, but network and memory architectures play a role here as well, as do tooling, developer experience, and pricing model. Users get a great deal of control and customizability, but they must ultimately manage the infrastructure themselves.
Cloud-based data science platforms. These offerings represent the next step up in terms of abstraction. Users of these platforms, typically data scientists, don't have to think about the underlying infrastructure supporting their workloads. Rather, they get to work exclusively in the realm of higher-level concepts such as code notebooks, ML models, and data flows. They trade away some control for speed and agility, and in exchange they're not responsible for managing the underlying infrastructure.

Cloud-based AI (e.g. cognitive) APIs. Finally, these offerings represent the tip of the AI-in-the-cloud pyramid. They are simply cloud-hosted APIs backed by pre-trained models. A common example is image object detection: a developer calls the object detection API, passing in an image, and the API returns an array of the items detected in the image. These APIs make AI extremely accessible for developers who just want to build smarter apps, but they give up a lot of control. In many cases, you can't even train the models on your own data, so if you stray too far from the basics you may be out of luck.

I'll be elaborating on each of these areas over the coming weeks, and I'm planning to publish a three-part "AI in the Cloud" research series that covers each in depth. Stay tuned.

Join Team TWIML!

TWIML is growing and we're looking for an energetic and passionate Community Manager to help expand our programs. This full-time position can be remote, but if you happen to be in St. Louis, all the better. If you're interested, please reach out for additional details.

Sign up for our Newsletter to receive this weekly in your inbox.
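To make the object detection example above concrete, here's a minimal Python sketch of the developer's side of that exchange. Everything here is hypothetical — the JSON response shape, the field names, and the `detected_labels` helper are illustrative stand-ins, not any particular vendor's API — but the call-then-parse pattern is the one these cognitive APIs follow.

```python
import json

# Hypothetical JSON response from a cloud object-detection API.
# Real providers return similar structures, but with their own field names.
sample_response = json.dumps({
    "objects": [
        {"label": "dog", "confidence": 0.97},
        {"label": "frisbee", "confidence": 0.83},
        {"label": "tree", "confidence": 0.41},
    ]
})

def detected_labels(response_json, min_confidence=0.5):
    """Return the labels the API detected above a confidence threshold."""
    detections = json.loads(response_json)["objects"]
    return [d["label"] for d in detections if d["confidence"] >= min_confidence]

print(detected_labels(sample_response))  # ['dog', 'frisbee']
```

Note that all the intelligence lives behind the API: the developer never sees a model, only labels and confidence scores, which is exactly the accessibility-for-control trade-off described above.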