A new paper shows how AI can accelerate scientific discovery through analyzing millions of scientific abstracts. From the MIT Technology Review:
Natural-language processing has seen major advancements in recent years, thanks to the development of unsupervised machine-learning techniques that excel at capturing the relationships between words. They count how often and how closely words are used in relation to one another, and map those relationships in a high-dimensional vector space. The patterns can then be used to predict basic analogies like “man is to king as woman is to queen,” or to construct sentences and power things like autocomplete and other predictive text systems.
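The analogy trick described above reduces to simple vector arithmetic: the offset between “man” and “king” is added to “woman,” and the nearest remaining word vector is the answer. A minimal sketch, using hypothetical hand-made 2-D vectors rather than embeddings actually learned from text (real models use hundreds of dimensions):

```python
import numpy as np

# Toy 2-D embeddings, purely illustrative: one axis loosely encodes
# gender, the other royalty. These are hand-picked, not learned.
vectors = {
    "man":   np.array([ 1.0,  0.0]),
    "woman": np.array([-1.0,  0.0]),
    "king":  np.array([ 1.0,  1.0]),
    "queen": np.array([-1.0,  1.0]),
    "child": np.array([ 0.0, -1.0]),  # distractor word
}

def cosine(a, b):
    # Cosine similarity: 1.0 means the vectors point the same way.
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def analogy(a, b, c):
    """Solve 'a is to b as c is to ?' via the offset b - a + c."""
    target = vectors[b] - vectors[a] + vectors[c]
    # Exclude the query words themselves, then rank by similarity.
    candidates = {w: v for w, v in vectors.items() if w not in (a, b, c)}
    return max(candidates, key=lambda w: cosine(candidates[w], target))

print(analogy("man", "king", "woman"))  # queen
```

Here `king − man + woman` lands exactly on the “queen” vector; in real embeddings the match is approximate, which is why the nearest-neighbor search is needed.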
A group of researchers has now used this technique to munch through 3.3 million scientific abstracts published between 1922 and 2018 in journals that would likely contain materials science research. The resulting word relationships captured fundamental knowledge within the field, including the structure of the periodic table and the way chemicals’ structures relate to their properties. The paper was published in Nature last week.