Nuclear fusion occurs when the nuclei of two light atoms (such as hydrogen) combine to form a single, heavier nucleus (such as helium). Fusing elements lighter than iron releases energy, while fusing elements heavier than iron always consumes energy. Efficient energy production by fusion therefore requires light isotopes (such as deuterium) that fuse relatively easily and liberate a relatively large amount of energy per reaction.
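The energy released by a single fusion reaction can be computed from the mass defect, E = Δm·c². As a sketch, the following uses the standard deuterium–tritium reaction (D + T → He-4 + n) with published atomic masses in unified atomic mass units; the specific reaction and mass values are this example's choice, not part of the text above:

```python
# Energy released by D-T fusion, via the mass defect E = delta_m * c^2.
# Masses are in unified atomic mass units (u); 1 u is equivalent to ~931.494 MeV.
M_DEUTERIUM = 2.014102  # u
M_TRITIUM   = 3.016049  # u
M_HELIUM4   = 4.002602  # u
M_NEUTRON   = 1.008665  # u
U_TO_MEV    = 931.494   # MeV released per u of mass defect

# Mass of reactants minus mass of products: the "missing" mass becomes energy.
mass_defect = (M_DEUTERIUM + M_TRITIUM) - (M_HELIUM4 + M_NEUTRON)
energy_mev = mass_defect * U_TO_MEV

print(f"Mass defect: {mass_defect:.6f} u")
print(f"Energy released: {energy_mev:.1f} MeV")  # ~17.6 MeV per reaction
```

Roughly 17.6 MeV per reaction is millions of times more energy per unit mass than any chemical fuel, which is why fusion is so attractive despite the engineering difficulty.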
Achieving nuclear fusion requires extremely high temperatures and pressures of the kind normally found only deep inside stars. These extraordinary conditions make fusion non-viable as a power source with current technology: creating and sustaining the conditions for fusion consumes more energy than can be extracted from the reaction.
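One common way to express "the conditions required" is the Lawson-style triple product of plasma density, temperature, and confinement time; net energy gain for deuterium–tritium plasmas requires roughly n·T·τ ≳ 3×10²¹ keV·s/m³. A minimal sketch, assuming that approximate textbook threshold and purely illustrative plasma parameters (not measurements from any real reactor):

```python
# Order-of-magnitude check of the Lawson triple product for a D-T plasma.
# Approximate ignition threshold (textbook value, assumed here): ~3e21 keV*s/m^3.
IGNITION_THRESHOLD = 3e21  # keV * s / m^3

def triple_product(density_m3: float, temperature_kev: float,
                   confinement_s: float) -> float:
    """n * T * tau: the product a plasma must exceed for net energy gain."""
    return density_m3 * temperature_kev * confinement_s

# Hypothetical, roughly tokamak-scale parameters (illustrative only):
tp = triple_product(density_m3=1e20, temperature_kev=15, confinement_s=1.0)
status = "above" if tp >= IGNITION_THRESHOLD else "below"
print(f"Triple product: {tp:.2e} keV*s/m^3 ({status} ignition threshold)")
```

With these illustrative numbers the plasma falls short of the threshold, which mirrors the situation described above: current devices cannot yet sustain conditions good enough for net energy output.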
Nuclear fusion reactors are prone to exploding in science fiction stories, but in real life, the extreme temperatures and pressures needed to initiate and sustain a fusion reaction mean that almost any fault in a fusion reactor will cause the reaction to fizzle out rather than run out of control.
Breaching the reactor vessel would, however, release whatever heat and pressure it contained at the time. This would be bad, but not nearly as catastrophic as the usual depiction of fusion reactor explosions in sci-fi.
Cold fusion is the hypothetical achievement of nuclear fusion at far more easily attainable temperatures and pressures. Many researchers have attempted to achieve fusion at relatively low temperatures, and some have even claimed success, but to date there is no viable, repeatable method of achieving fusion without recreating the conditions found in the interior of the Sun.