Machine learning (ML) and deep learning (DL) content from the past 24 hours.
— Spark Summit East will be held in Boston, February 7-9, 2017.
— Adrian Colyer summarizes a report from Stanford University covering key topics in artificial intelligence, including large-scale machine learning, deep learning, reinforcement learning, robotics, computer vision, natural language processing, algorithmic game theory, IoT, and neuromorphic computing.
— On the Moor Insights & Strategy blog, highlights from last week’s Supercomputing conference.
— In MIT Technology Review, Will Knight describes some experiments recently released by Google that demonstrate how neural networks work.
— Zachary Chase Lipton writes a long thumb-sucker about algorithmic bias.
Methods and Techniques
— Don Hillborn explains oil and gas asset optimization with Amazon Kinesis, Amazon RDS, and Databricks.
— Engineers develop a new machine learning algorithm that learns from human instruction rather than from data. That strikes me as an oxymoron.
— Jelor Gallego reveals that NVIDIA just built the most energy-efficient supercomputer ever.
— Timothy Prickett Morgan dissects the “Summit” supercomputer on order from the U.S. Department of Energy for its Oak Ridge National Laboratory.
— Sophie Curtis asks: how can machine learning create a smarter energy grid?
— On the WeWork blog, Nicole Phelan describes how WeWork uses machine learning to design offices. A neural network significantly outperforms human designers at predicting utilization.
— A blogger on Seeking Alpha argues for a lot of upside in NVIDIA shares.
— Serkan Piantino, co-founder of Facebook’s AI research lab, quits to start a new venture, Top 1 Networks, which will provide GPU-accelerated computing as a service.
— In Investopedia, Richard Saintvilus chronicles the cloud machine learning wars.
Bottom Story of the Day
— Pinterest uses machine learning to determine what’s trending.