Machine-Learning Algorithm Predicts Laboratory Earthquakes
MIT Technology Review reported a breakthrough suggesting that real earthquake prediction could be on the horizon. The team is cautious about the new technique’s utility for real earthquakes, but the work opens new avenues of research in the area.

Bertrand Rouet-Leduc at Los Alamos National Laboratory led a team that trained a machine-learning algorithm to spot the tell-tale signs of an impending quake in a laboratory earthquake simulator. Recorded acoustic emissions from an experimental system that follows the Gutenberg-Richter distribution were fed into the machine-learning algorithm.
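For context, the Gutenberg-Richter relation says that event counts fall off log-linearly with magnitude: log10 N(M) = a - bM. The quick illustration below shows the shape of that scaling; the a and b values are arbitrary assumptions chosen only for the example.

```python
# Gutenberg-Richter relation: log10 N(M) = a - b * M, i.e. the expected
# number of events N at or above magnitude M falls off log-linearly.
# The a and b values below are arbitrary, for illustration only.
a, b = 4.0, 1.0

for M in range(1, 6):
    N = 10 ** (a - b * M)
    print(f"magnitude >= {M}: roughly {N:.0f} events expected")
```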

To their astonishment, the algorithm gave accurate predictions even when existing models suggested that failure was not imminent. “We show that by listening to the acoustic signal emitted by a laboratory fault, machine learning can predict the time remaining before it fails with great accuracy,” the team wrote.
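The headline result is a regression from the acoustic signal to the time remaining before failure. As a rough sketch of what such a pipeline can look like, here is a toy version in Python: windowed summary statistics of a synthetic “acoustic” signal are fed to a random-forest regressor that predicts time-to-failure. The synthetic data, feature choices, and model settings are our illustrative assumptions, not the Los Alamos team’s exact setup.

```python
# Hypothetical sketch: predict time-to-failure from windowed acoustic features.
# The synthetic signal and feature choices are illustrative assumptions,
# not the Los Alamos team's exact setup.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic stand-in for a continuous acoustic-emission recording:
# noise whose variance grows as the simulated fault approaches failure.
n_windows, window_len = 2000, 1024
time_to_failure = np.linspace(10.0, 0.0, n_windows)  # seconds remaining
signal = [rng.normal(0, 1 + 5 * np.exp(-t), window_len) for t in time_to_failure]

def features(w):
    """Summary statistics of one acoustic window."""
    return [w.var(), w.std(), np.abs(w).max(),
            ((w - w.mean()) ** 4).mean() / w.var() ** 2]  # kurtosis

X = np.array([features(w) for w in signal])
y = time_to_failure

X_train, X_test, y_train, y_test = train_test_split(X, y, shuffle=False)
model = RandomForestRegressor(n_estimators=100, random_state=0)
model.fit(X_train, y_train)
print("R^2 on held-out windows:", model.score(X_test, y_test))
```

Note that `shuffle=False` keeps the train/test split temporal, which is the honest way to evaluate a time-to-failure predictor.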

Google is acquiring data science community Kaggle

From TechCrunch: sources are reporting that Google is acquiring Kaggle, a platform that hosts data science and machine learning competitions. Details about the transaction remain somewhat vague, but given that Google is hosting its Cloud Next conference in San Francisco this week, the official announcement could come as early as tomorrow.

Article: https://techcrunch.com/2017/03/07/google-is-acquiring-data-science-community-kaggle/

Paper: On the Origin of Deep Learning

From the March 2nd, 2017 Data Science Weekly – Issue 171 Editor’s Pick

On the Origin of Deep Learning
This paper reviews the evolutionary history of deep learning models. It traces the genesis of neural networks in early associationist models of the brain, through the models that dominated the last decade of research, and on to recently popular models such as variational autoencoders and generative adversarial nets.

Buzzword Importance

Why is it important to pay attention to industry buzzwords? Often, they appear to simply describe existing concepts: “The Cloud” versus 1970s TSO, “Thin Client” versus “Web Application”, “Thick Client” versus “Client-Server Architecture”, or “Monitoring” versus “Observability”.

On superficial examination, a buzzword has the feel of “marketing spin” applied to recapture drifting attention: a simple application of the “New and Improved” marketing strategy.

Buzzwords do have importance. They often signal new approaches to implementing a technology, and they can represent inflection points in a technology or an industry.

A cloud system isn’t simply “someone else’s hardware, not on your premises”. It is that, but it is implemented in a way that shifts much of the responsibility and maintenance burden from you to a third party.

Observability isn’t monitoring. “Observability” is a new buzzword in systems-operations architecture. Generally speaking, observability systems execute analytics in memory on incoming data streams, which are also stored for further analysis later. Observability products are organized around the use cases of data exploration, visualization, and prediction.

Monitoring, in contrast to observability, is oriented toward alerting engineers that there is a current problem to investigate. Monitoring clearly has aspects in common with observability, but it is not necessarily about deep-dive analysis of individual components.
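To make the contrast concrete, here is a toy sketch in Python. The event schema, threshold, and names are illustrative assumptions; real stacks are far more elaborate, but the division of labor is the same: monitoring checks a predefined condition and raises an alert, while observability computes analytics over the stream in memory and retains the raw events for later exploration.

```python
# Toy contrast between monitoring and observability-style processing.
# The event schema, threshold, and names here are illustrative assumptions.
from statistics import mean

events = [{"service": "api", "latency_ms": v} for v in (120, 135, 980, 140, 150)]

# Monitoring: evaluate a predefined condition and alert on a current problem.
THRESHOLD_MS = 500
for e in events:
    if e["latency_ms"] > THRESHOLD_MS:
        print(f"ALERT: {e['service']} latency {e['latency_ms']}ms "
              f"exceeds {THRESHOLD_MS}ms")

# Observability: run analytics in memory over the incoming stream...
window = [e["latency_ms"] for e in events]
print(f"mean latency over window: {mean(window):.1f}ms")

# ...and store the raw events so they can be explored, visualized,
# and re-queried later with questions no one anticipated up front.
stored_for_later_analysis = list(events)
```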

Buzzwords should not be written off at first glance. Even if the quote was taken out of context, you don’t want to be remembered like DEC’s Ken Olsen for his pronouncement on personal computers; he is infamous for saying, in 1977, “There is no reason for any individual to have a computer in his home.” Buzzwords need to be examined in depth to make sure you’re not missing an important evolutionary shift in a component critical to your industry.