PMC5785775
Generating Focused Molecule Libraries for Drug Discovery with Recurrent Neural Networks.
Virtual screening is a commonly used strategy to search for promising molecules among millions of existing or billions of virtual molecules.
Searching can be carried out using similarity-based metrics, which provides a quantifiable numerical indicator of closeness between molecules.
In contrast, in de novo drug design, one aims to directly create novel molecules that are active toward the desired biological target.
Here, like in any molecular design task, the computer has to (i) create molecules, (ii) score and filter them, and (iii) search for better molecules, building on the knowledge gained in the previous steps.
Task i, the generation of novel molecules, is usually solved with one of two different protocols.
One strategy is to build molecules from predefined groups of atoms or fragments.
Unfortunately, these approaches often lead to molecules that are very hard to synthesize.
Therefore, another established approach is to conduct virtual chemical reactions based on expert coded rules, with the hope that these reactions could then also be applied in practice to make the molecules in the laboratory.
These systems give reasonable drug-like molecules and are considered as “the solution” to the structure generation problem.
We generally share this view.
However, we have recently shown that the predicted reactions from these rule-based expert systems can sometimes fail.
Also, focusing on a small set of robust reactions can unnecessarily restrict the possibly accessible chemical space.
Task ii, scoring molecules and filtering out undesired structures, can be solved with substructure filters for undesirable reactive groups in conjunction with established approaches such as docking or machine learning (ML) approaches.
The ML approaches are split into two branches: Target prediction classifies molecules into active and inactive, and quantitative structure–activity relationships (QSAR) seek to quantitatively predict a real-valued measure for the effectiveness of a substance (as a regression problem).
As molecular descriptors, signature fingerprints, extended-connectivity (ECFP), and atom pair (APFP) fingerprints and their fuzzy variants are the de facto standard today.
Convolutional networks on graphs are a more recent addition to the field of molecular descriptors.
Jastrzebski et al. proposed to use convolutional neural networks to learn descriptors directly from SMILES strings.
Random forests, support vector machines, and neural networks are currently the most widely used machine learning models for target prediction.
This leads to task iii, the search for molecules with the right binding affinity combined with optimal molecular properties.
In earlier work, this was performed (among others) with classical global optimization techniques, for example genetic algorithms or ant-colony optimization.
Furthermore, de novo design is related to inverse QSAR.
While in de novo design a regular QSAR mapping X → y from molecular descriptor space X to properties y is used as the scoring function for the global optimizer, in inverse QSAR one aims to find an explicit inverse mapping y → X, and then maps back from optimal points in descriptor space X to valid molecules.
However, this is not well-defined, because molecules are inherently discrete (the space is not continuously populated), and the mapping from a target property value y to possible structures X is one-to-many, as usually several different structures with very similar properties can be found.
Several protocols have been developed to address this, for example enumerating all structures within the constraints of hyper-rectangles in the descriptor space.
Gómez-Bombarelli et al. proposed to learn continuous representations of molecules with variational autoencoders, based on the model by Bowman et al., and to perform Bayesian optimization in this vector space to optimize molecular properties.
While promising, this approach was not applied to create active drug molecules and often produced syntactically invalid molecules and highly strained or reactive structures, for example cyclobutadienes.
In this work, we suggest a complementary, completely data-driven de novo drug design approach.
It relies only on a generative model for molecular structures, based on a recurrent neural network, that is trained on large sets of molecules.
Generative models learn a probability distribution over the training examples; sampling from this distribution generates new examples similar to the training data.
Intuitively, a generative model for molecules trained on drug molecules would “know” how valid and reasonable drug-like molecules look and could be used to generate more drug-like molecules.
However, for molecules, these models have been studied rarely, and rigorously only with traditional models such as Gaussian mixture models (GMM).
Recently, recurrent neural networks (RNNs) have emerged as powerful generative models in very different domains, such as natural language processing, speech, images, video, formal languages, computer code generation, and music scores.
In this work, we highlight the analogy of language and chemistry, and show that RNNs can also generate reasonable molecules.
Furthermore, we demonstrate that RNNs can also transfer their learned knowledge from large molecule sets to directly produce novel molecules that are biologically active by retraining the models on small sets of already known actives.
We test our models by reproducing hold-out test sets of known biologically active molecules.
To connect chemistry with language, it is important to understand how molecules are represented.
Usually, they are modeled by molecular graphs, also called Lewis structures in chemistry.
In molecular graphs, atoms are labeled nodes.
The edges are the bonds between atoms, which are labeled with the bond order (e.g., single, double, or triple).
One could therefore envision having a model that reads and outputs graphs.
Several common chemistry formats store molecules in such a manner.
However, in models for natural language processing, the input and output of the model are usually sequences of single letters, strings or words.
We therefore employ the SMILES format, which encodes molecular graphs compactly as human-readable strings.
SMILES is a formal grammar which describes molecules with an alphabet of characters, for example c and C for aromatic and aliphatic carbon atoms, O for oxygen, and −, =, and # for single, double, and triple bonds (see Figure 1).
To indicate rings, a number is introduced at the two atoms where the ring is closed.
For example, benzene in aromatic SMILES notation would be c1ccccc1.
Side chains are denoted by round brackets.
To generate valid SMILES, the generative model would have to learn the SMILES grammar, which includes keeping track of rings and brackets to eventually close them.
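Two of these bookkeeping constraints can be sketched in a few lines of Python (an illustrative check of our own, not a full SMILES parser; it ignores, for example, bracket atoms and two-digit %nn ring closures):

```python
def balanced_smiles(smiles):
    """Rough sanity check for two constraints a SMILES generator must
    learn: parentheses must balance, and each ring-closure digit must
    appear an even number of times. (Illustrative only; ignores
    bracket atoms and two-digit %nn ring closures.)"""
    depth = 0
    ring_counts = {}
    for ch in smiles:
        if ch == '(':
            depth += 1
        elif ch == ')':
            depth -= 1
            if depth < 0:          # closing parenthesis without an opener
                return False
        elif ch.isdigit():
            ring_counts[ch] = ring_counts.get(ch, 0) + 1
    return depth == 0 and all(n % 2 == 0 for n in ring_counts.values())
```

For example, `balanced_smiles('c1ccccc1')` accepts benzene, while the truncated string `'c1ccccc'` is rejected because ring bond 1 is never closed.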
In morphine, a complex natural product, the number of steps between the first 1 and the second 1, indicating a ring, is 32.
Having established a link between molecules and (formal) language, we can now discuss language models.
Figure 1. Examples of molecules and their SMILES representation. To correctly create SMILES, the model has to learn long-term dependencies, for example, to close rings (indicated by numbers) and brackets.
Given a sequence of words (w1, ..., wi), language models predict the distribution of the (i+1)th word wi+1.
For example, if a language model received the sequence “Chemistry is”, it would assign different probabilities to possible next words: “fascinating”, “important”, or “challenging” would receive high probabilities, while “runs” or “potato” would receive very low probabilities.
Language models can both capture the grammatical correctness (“runs” in this sentence is wrong) and the meaning (“potato” does not make sense).
Language models are implemented, for example, in message autocorrection in many modern smartphones.
Interestingly, language models do not have to use words.
They can also be based on characters or letters.
In that case, when receiving the sequence of characters chemistr, it would assign a high probability to y, but a low probability to q. To model molecules instead of language, we simply swap words or letters with atoms, or, more practically, characters in the SMILES alphabet, which form a (formal) language.
For example, if the model receives the sequence c1ccccc, there is a high probability that the next symbol would be a “1”, which closes the ring, and yields benzene.
More formally, to a sequence S of symbols st at steps t ∈ T, the language model assigns a probability

Pθ(S) = Pθ(s1) · ∏t=2..T Pθ(st | st−1, ..., s1)  (1)

where the parameters θ are learned from the training set.
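As a toy illustration of this factorization, the conditional probabilities can be estimated from character-bigram counts (a first-order Markov sketch of our own; the actual model conditions on the full history via a recurrent network):

```python
from collections import defaultdict

def train_char_lm(corpus):
    # Estimate P(s_t | s_{t-1}) from bigram counts; '^' marks the start.
    counts = defaultdict(lambda: defaultdict(int))
    for seq in corpus:
        for prev, nxt in zip('^' + seq, seq):
            counts[prev][nxt] += 1
    return {prev: {sym: n / sum(freqs.values()) for sym, n in freqs.items()}
            for prev, freqs in counts.items()}

def sequence_prob(model, seq):
    # Chain rule as in eq 1: P(S) = prod_t P(s_t | s_{t-1}), here with a
    # first-order approximation in place of the full-history conditional.
    p = 1.0
    for prev, nxt in zip('^' + seq, seq):
        p *= model.get(prev, {}).get(nxt, 0.0)
    return p
```

Training on a handful of strings and scoring a new one shows how probability mass concentrates on sequences resembling the training data.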
In this work, we use a recurrent neural network (RNN) to estimate the probabilities of eq 1.
In contrast to regular feedforward neural networks, RNNs maintain state, which is needed to keep track of the symbols seen earlier in the sequence.
In abstract terms, an RNN takes a sequence of input vectors x1:n = (x1, ..., xn) and an initial state vector h0, and returns a sequence of state vectors h1:n = (h1, ..., hn) and a sequence of output vectors y1:n = (y1, ..., yn).
The RNN consists of a recursively defined function R, which takes a state vector hi and input vector xi+1 and returns a new state vector hi+1.
Another function O maps a state vector hi to an output vector yi:

RNN(h0, x1:n) = h1:n, y1:n  (2)
hi = R(hi−1, xi)  (3)
yi = O(hi)  (4)

The state vector hi stores a representation of the information about all symbols seen in the sequence so far.
As an alternative to the recursive definition, the recurrent network can also be unrolled for finite sequences (see Figure 2).
An unrolled RNN can be seen as a very deep neural network, in which the parameters θ are shared among the layers, and the hidden state ht is passed as an additional input to the next layer.
Training the unrolled RNN to fit the parameters θ can then simply be done by using backpropagation to compute the gradients with respect to the loss function, which is categorical cross-entropy in this work.
Figure 2. (a) Recursively defined RNN. (b) The same RNN, unrolled.
The parameters θ (the weight matrices of the neural network) are shared over all time steps.
As the specific RNN function, in this work, we use the long short-term memory (LSTM), which was introduced by Hochreiter and Schmidhuber.
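A single LSTM step can be sketched in NumPy to make the gating mechanism concrete (a minimal illustration of our own of a standard LSTM cell; variable names and the stacked parameter layout are our choices, and refinements such as peephole connections are omitted):

```python
import numpy as np

def lstm_cell(x, h_prev, c_prev, W, U, b):
    """One step of a standard LSTM. W, U, b stack the parameters of the
    input (i), forget (f), output (o), and candidate (g) gates."""
    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))
    n = h_prev.size
    z = W @ x + U @ h_prev + b       # all four gate pre-activations at once
    i = sigmoid(z[:n])               # input gate
    f = sigmoid(z[n:2 * n])          # forget gate
    o = sigmoid(z[2 * n:3 * n])      # output gate
    g = np.tanh(z[3 * n:])           # candidate cell state
    c = f * c_prev + i * g           # cell state: gated memory update
    h = o * np.tanh(c)               # hidden state passed to the next step
    return h, c
```

The additive cell-state update `c = f * c_prev + i * g` is what lets gradients flow over many time steps, which is why the LSTM handles the long-range ring and bracket dependencies discussed above.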
It has been used successfully in many natural language processing tasks, for example in Google’s neural machine translation system.
For excellent in-depth discussions of the LSTM, we refer to the articles by Goldberg, Graves, Olah, and Greff et al. To encode the SMILES symbols as input vectors xt, we employ the “one-hot” representation.
This means that if there are K symbols, and k is the symbol to be input at step t, then we can construct an input vector xt with length K, whose entries are all zero except the kth entry, which is one.
If we assume a very restricted set of symbols {c, 1, \n}, input c would correspond to xt = (1, 0, 0), 1 to xt = (0, 1, 0), and \n to xt = (0, 0, 1).
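This encoding is a one-liner in practice (a toy helper of our own; real implementations typically vectorize this over whole batches):

```python
def one_hot(symbol, alphabet):
    # Length-K vector with a single one at the symbol's index, matching
    # the "one-hot" representation described in the text.
    vec = [0] * len(alphabet)
    vec[alphabet.index(symbol)] = 1
    return vec
```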
The probability distribution Pθ(st+1|st, ..., s1) of the next symbol given the already seen sequence is thus a multinomial distribution, which is estimated using the output vector yt of the recurrent neural network at time step t by

Pθ(st+1 = k | st, ..., s1) = yt^k  (5)

where yt^k corresponds to the kth element of vector yt.
Sampling from this distribution would then allow generating novel molecules: After sampling a SMILES symbol st+1 for the next time step t + 1, we can construct a new input vector xt+1, which is fed into the model, and via yt+1 and eq 5 yields Pθ(st+2|st+1, ..., s1).
Sampling from the latter generates st+2, which serves again also as the model’s input for the next step (see Figure 3).
This symbol-by-symbol sampling procedure is repeated until the desired number of characters have been generated.
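The sampling loop itself is short (a sketch of our own; `next_dist` stands in for the RNN's output distribution of eq 5, where the real model would run one LSTM step per symbol):

```python
import random

def sample_sequence(next_dist, seed, max_len=100, eol='\n'):
    """Symbol-by-symbol sampling: each sampled symbol is fed back as
    the next input. next_dist(prefix) returns a dict mapping candidate
    next symbols to probabilities."""
    seq = [seed]
    while len(seq) < max_len:
        dist = next_dist(seq)
        symbols, probs = zip(*dist.items())
        nxt = random.choices(symbols, weights=probs)[0]
        seq.append(nxt)
        if nxt == eol:               # EOL ends the current molecule
            break
    return ''.join(seq)
```

With a deterministic toy distribution that always puts all mass on the next character of a target string, the loop reproduces that string exactly.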
Figure 3. Symbol generation and sampling process.
We start with a random seed symbol s1, here c, which gets converted into a one-hot vector x1 and input into the model.
The model then updates its internal state h0 to h1 and outputs y1, which is the probability distribution over the next symbols.
Here, sampling yields s2 = 1.
Converting s2 to x2 and feeding it to the model leads to updated hidden state h2 and output y2, from which we can sample again.
This iterative symbol-by-symbol procedure can be continued as long as desired.
In this example, we stop it after observing an EOL (\n) symbol, and obtain the SMILES for benzene.
The hidden state hi allows the model to keep track of opened brackets and rings, to ensure that they will be closed again later.
To indicate that a molecule is “completed”, each molecule in our training data finishes with an “end of line” (EOL) symbol, in our case the single character \n (which means that the training data is just a simple SMILES file).
Thus, when the system outputs an EOL, a generated molecule is finished.
However, we simply continue sampling, thus generating a regular SMILES file that contains one molecule per line.
In this work, we used a network with three stacked LSTM layers, using the Keras library.
The model was trained with back-propagation through time, using the ADAM optimizer at standard settings.
To mitigate the problem of exploding gradients during training, a gradient norm clipping of 5 is applied.
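Gradient norm clipping itself is simple (a NumPy sketch of our own of the standard global-norm variant; deep learning frameworks provide this as a built-in optimizer option):

```python
import numpy as np

def clip_by_norm(grads, max_norm=5.0):
    # Rescale the whole gradient if its global L2 norm exceeds max_norm,
    # a standard remedy for exploding gradients in RNN training.
    norm = np.sqrt(sum(np.sum(g ** 2) for g in grads))
    if norm > max_norm:
        grads = [g * (max_norm / norm) for g in grads]
    return grads
```

Clipping rescales the update direction without changing it, so training steps stay bounded while still following the gradient.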
For many machine learning tasks, only small data sets are available, which might lead to overfitting with powerful models such as neural networks.
In this situation, transfer learning can help.
Here, a model is first trained on a large data set for a different task.
Then, the model is retrained on the smaller data set, which is also called fine-tuning.
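The pretrain-then-retrain idea can be caricatured with bigram counts (a deliberately simplistic analogy of our own; the actual procedure fine-tunes the RNN's weights on the small set of known actives, not counts):

```python
from collections import Counter

def train_counts(corpus):
    # "Pretraining": collect character-bigram statistics from a large
    # general-purpose corpus of strings.
    counts = Counter()
    for seq in corpus:
        counts.update(zip(seq, seq[1:]))
    return counts

def fine_tune(counts, focused_corpus):
    # "Fine-tuning": continue accumulating statistics on a small focused
    # set, shifting the model toward the new data while retaining the
    # general statistics it already learned.
    for seq in focused_corpus:
        for pair in zip(seq, seq[1:]):
            counts[pair] += 1
    return counts
```

The point of the analogy: fine-tuning starts from the pretrained statistics rather than from scratch, so even a small focused data set can bias generation without erasing general knowledge.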