They were introduced by Hochreiter & Schmidhuber (1997), and were refined and popularized by many people in following work.
Long Short Term Memory networks work tremendously well on a large variety of problems, and are now widely used.
Long Short Term Memory networks are explicitly designed to avoid the long-term dependency problem. Remembering information for long periods of time is practically their default behavior, not something they struggle to learn!
All recurrent neural networks have the form of a chain of repeating modules of neural network. In standard RNNs, this repeating module will have a very simple structure, such as a single tanh layer.
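To make this concrete, here is a minimal sketch of that simple repeating module: one step of a standard RNN whose entire body is a single tanh layer. The weight names (`W_xh`, `W_hh`, `b_h`) and sizes are illustrative assumptions, not from the original text.

```python
import numpy as np

def rnn_step(x, h_prev, W_xh, W_hh, b_h):
    # The whole repeating module of a standard RNN:
    # combine the current input and previous hidden state,
    # then squash through a single tanh layer.
    return np.tanh(x @ W_xh + h_prev @ W_hh + b_h)

# Illustrative sizes and randomly initialized weights.
rng = np.random.default_rng(0)
input_size, hidden_size = 4, 8
W_xh = rng.normal(scale=0.1, size=(input_size, hidden_size))
W_hh = rng.normal(scale=0.1, size=(hidden_size, hidden_size))
b_h = np.zeros(hidden_size)

# Unroll the chain: the same module is applied at every time step,
# passing the hidden state forward.
h = np.zeros(hidden_size)
for t in range(5):
    x_t = rng.normal(size=input_size)
    h = rnn_step(x_t, h, W_xh, W_hh, b_h)
```

An LSTM has the same chain structure, but replaces this single tanh layer with a more elaborate module of interacting gates.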