A restricted Boltzmann machine (RBM) is a generative stochastic artificial neural network that can learn a probability distribution over its set of inputs.

RBMs were initially invented under the name Harmonium by Paul Smolensky in 1986, and rose to prominence after Geoffrey Hinton and collaborators invented fast learning algorithms for them in the mid-2000s. RBMs have found applications in dimensionality reduction, classification, collaborative filtering, feature learning, topic modelling, and even many-body quantum mechanics. They can be trained in either supervised or unsupervised ways, depending on the task.

As their name implies, RBMs are a variant of Boltzmann machines, with the restriction that their neurons must form a bipartite graph: a pair of nodes from each of the two groups of units (commonly referred to as the "visible" and "hidden" units respectively) may have a symmetric connection between them; and there are no connections between nodes within a group. By contrast, "unrestricted" Boltzmann machines may have connections between hidden units. This restriction allows for more efficient training algorithms than are available for the general class of Boltzmann machines, in particular the gradient-based contrastive divergence algorithm.
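
To make the restriction concrete, here is a minimal NumPy sketch of an RBM whose only parameters are a single weight matrix between the visible and hidden groups plus two bias vectors, trained with one step of contrastive divergence (CD-1). The layer sizes, learning rate, and toy data are illustrative assumptions, not details from the text.

```python
import numpy as np

rng = np.random.default_rng(0)
n_visible, n_hidden, lr = 6, 3, 0.1  # illustrative sizes and learning rate

# One symmetric weight matrix links the two groups; the bipartite
# restriction means there are no visible-visible or hidden-hidden
# weights to store at all.
W = rng.normal(0.0, 0.01, size=(n_visible, n_hidden))
b_v = np.zeros(n_visible)  # visible biases
b_h = np.zeros(n_hidden)   # hidden biases

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cd1_step(v0):
    """One contrastive-divergence (CD-1) update on a batch of binary rows."""
    global W, b_v, b_h
    # Positive phase: hidden probabilities driven by the data.
    p_h0 = sigmoid(v0 @ W + b_h)
    h0 = (rng.random(p_h0.shape) < p_h0).astype(float)
    # Negative phase: one Gibbs step back through the visible layer.
    p_v1 = sigmoid(h0 @ W.T + b_v)
    p_h1 = sigmoid(p_v1 @ W + b_h)
    # Update: data-driven statistics minus reconstruction statistics.
    n = v0.shape[0]
    W += lr * (v0.T @ p_h0 - p_v1.T @ p_h1) / n
    b_v += lr * (v0 - p_v1).mean(axis=0)
    b_h += lr * (p_h0 - p_h1).mean(axis=0)

# Toy usage on a few random binary patterns.
data = rng.integers(0, 2, size=(16, n_visible)).astype(float)
for _ in range(200):
    cd1_step(data)
```

The bipartite structure is what makes each phase a single matrix multiplication: given the visible units, all hidden units are conditionally independent, and vice versa.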

Restricted Boltzmann machines can also be used in deep learning networks. In particular, deep belief networks can be formed by "stacking" RBMs and optionally fine-tuning the resulting deep network with gradient descent and backpropagation.
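
As a rough illustration of the stacking idea, the sketch below greedily pre-trains a small stack of RBMs, feeding each layer's hidden activations to the next, as in a deep belief network. The RBM class, layer sizes, and training loop are assumptions made for illustration, and the fine-tuning step with backpropagation is omitted.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class RBM:
    def __init__(self, n_vis, n_hid, lr=0.1):
        self.W = rng.normal(0.0, 0.01, size=(n_vis, n_hid))
        self.b_v = np.zeros(n_vis)
        self.b_h = np.zeros(n_hid)
        self.lr = lr

    def hidden_probs(self, v):
        return sigmoid(v @ self.W + self.b_h)

    def cd1_step(self, v0):
        # Positive phase on the data, one Gibbs step for the negative phase.
        p_h0 = self.hidden_probs(v0)
        h0 = (rng.random(p_h0.shape) < p_h0).astype(float)
        p_v1 = sigmoid(h0 @ self.W.T + self.b_v)
        p_h1 = self.hidden_probs(p_v1)
        n = v0.shape[0]
        self.W += self.lr * (v0.T @ p_h0 - p_v1.T @ p_h1) / n
        self.b_v += self.lr * (v0 - p_v1).mean(axis=0)
        self.b_h += self.lr * (p_h0 - p_h1).mean(axis=0)

# Greedy layer-wise pre-training: 6 -> 4 -> 2 units, one RBM at a time.
data = rng.integers(0, 2, size=(32, 6)).astype(float)
layers, inputs = [], data
for n_vis, n_hid in [(6, 4), (4, 2)]:
    rbm = RBM(n_vis, n_hid)
    for _ in range(200):
        rbm.cd1_step(inputs)
    layers.append(rbm)
    inputs = rbm.hidden_probs(inputs)  # features become the next layer's data
```

Each RBM only ever sees the layer below it, which is what makes this pre-training "greedy"; a full deep belief network would then fine-tune the whole stack end to end.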
Without quoting directly from the text, give me a summary of what a restricted Boltzmann machine is.
A restricted Boltzmann machine (RBM) is a type of generative artificial neural network, invented by Paul Smolensky and later popularized by Geoffrey Hinton. RBMs have proven useful in a range of machine learning applications and can be trained in either supervised or unsupervised ways.

RBMs are a restricted variant of Boltzmann machines in which the neurons form a bipartite graph with two groups of units, the visible and the hidden. Connections may run only between the two groups, never within a group. This constraint permits more efficient training algorithms, notably contrastive divergence, than are possible for general Boltzmann machines.

RBMs can also serve as building blocks in deep learning architectures: stacking them yields deep belief networks, which can optionally be fine-tuned using gradient descent and backpropagation.