To comment on what you said and what CanadaRAM said: I believe that uranium or plutonium bombs release their energy when the uranium nucleus splits into smaller fragments. This releases energy because the binding energy needed to hold a large nucleus together grows much faster than linearly with its size. So, for example, a nucleus with 80 protons requires more than twice as much energy to hold together as a nucleus with 40 protons (say around 2.5 times as much). That extra cohesion is supplied by the neutrons in the nucleus, whose number also does not increase linearly with size: a 40-proton nucleus may have 40 neutrons, but an 80-proton nucleus may have 120.
When the bomb goes off, the uranium nucleus is essentially split into two smaller fragments, and the extra binding energy associated with the excess neutrons is released, which is why the masses of the fragments do not add up to the original mass after the explosion. So when my 80-proton nucleus breaks into two 40-proton nuclei, the binding energy of each fragment is less than half that of the original 80-proton nucleus. Since the 80-proton nucleus required about 2.5x the energy of a single 40-proton nucleus to hold together, the extra 0.5x not needed by the two 40-proton nuclei appears as the "explosion". Again, this energy difference comes from the extra neutrons required by the 80-proton nucleus but not by the two 40-proton nuclei. That neutron energy is essentially the "explosion", which is why the mass of the fission fragments doesn't add up to the mass of the uranium.
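The mass-energy bookkeeping described above can be sketched numerically. This is a toy calculation: the 0.19 amu mass defect and the 236.05 amu starting mass are assumed, order-of-magnitude figures (the real U-235 numbers are of that order), and only the conversion factor 1 amu ≈ 931.5 MeV is standard.

```python
# Toy illustration: the "missing" mass after fission shows up as released energy.
# The starting mass and mass defect below are assumed, illustrative numbers.
AMU_TO_MEV = 931.5  # standard conversion: 1 amu of rest mass ~ 931.5 MeV

mass_before = 236.05   # assumed: U-235 plus one incident neutron (amu)
mass_defect = 0.19     # assumed: mass "lost" in the fission (amu)
mass_after = mass_before - mass_defect

energy_released = mass_defect * AMU_TO_MEV
print(f"fragments sum to {mass_after:.2f} amu, not {mass_before:.2f} amu")
print(f"energy released: {energy_released:.0f} MeV")
```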
Basically, I think obeygiant is correct, but CanadaRAM brings up an important point that obeygiant may actually have mentioned without enough detail for it to be obvious on a first reading.
The concept of binding energy per nucleon is much more important in understanding nuclear reactions than binding energy per nucleus.
Also, I think you have misunderstood what binding energy is. Binding energy should not be thought of as energy that a given nucleus has, but rather as energy that a given nucleus lacks. The "higher" (read: more negative) the binding energy of a given nucleus, the more stable it will be. In the link above, notice that the graph of Relative Stability of Nucleus (i.e., binding energy per nucleon in MeV) vs. Atomic Mass peaks around 56 amu, which corresponds to iron. This is why iron has the most stable nucleus of all known elements (and, judging by the trend in the graph, of any elements likely to be discovered).
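That peak near iron can be reproduced with the semi-empirical (Bethe-Weizsacker) mass formula. The coefficients below are one commonly quoted set (in MeV); different sources vary slightly, so this only reproduces the shape of the curve, not exact measured values.

```python
# Semi-empirical mass formula: binding energy B(A, Z) in MeV.
# Coefficients are one commonly quoted set; exact values vary by source.
A_V, A_S, A_C, A_SYM, A_P = 15.75, 17.8, 0.711, 23.7, 11.18

def binding_energy(A, Z):
    # Pairing term: positive for even-even nuclei, negative for odd-odd.
    pairing = 0.0
    if A % 2 == 0:
        pairing = A_P / A**0.5 if Z % 2 == 0 else -A_P / A**0.5
    return (A_V * A                              # volume term
            - A_S * A**(2/3)                     # surface term
            - A_C * Z * (Z - 1) / A**(1/3)       # Coulomb repulsion
            - A_SYM * (A - 2 * Z)**2 / A         # symmetry term
            + pairing)

def stable_Z(A):
    # Z of the most beta-stable isobar for mass number A (standard SEMF result)
    return round(A / (2 + 0.0153 * A**(2/3)))

# Scan mass numbers and find where binding energy per nucleon peaks.
peak_A = max(range(10, 240), key=lambda A: binding_energy(A, stable_Z(A)) / A)
print(peak_A, binding_energy(peak_A, stable_Z(peak_A)) / peak_A)
```

The scan lands in the mid-50s to low-60s range of mass number with roughly 8.8 MeV per nucleon, consistent with the iron-region peak in the graph.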
In a U-235 fission process, we go from a single nucleus with a binding energy per nucleon of about 7.6 MeV/amu to two fragment nuclei (say barium and krypton), with binding energies per nucleon of about 8.2 MeV/amu and 8.5 MeV/amu, respectively. So for the U-235 (neglecting the incident neutron), we had:
binding energy = 235 amu * (-7.6 MeV/amu) = -1786 MeV
For the two products (neglecting expelled neutrons and such):
binding energy = (144 amu * (-8.2 MeV/amu)) + (89 amu * (-8.5 MeV/amu)) = -1937.3 MeV
For the net energy difference, we take the final binding energy and subtract the initial binding energy, which gives:
(-1937.3 MeV) - (-1786 MeV) = -151.3 MeV --> negative indicates energy lost by the system... explosion!
Keep in mind that this is an over-simplified example, so that's not quite the right answer. Nonetheless, this is basically what happens.
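The arithmetic above is easy to check. This snippet just repeats the back-of-the-envelope numbers from the post; the binding energies per nucleon are the approximate quoted values, not precise measured ones.

```python
# Reproduce the back-of-the-envelope U-235 fission energy estimate above.
# Binding energies per nucleon are the approximate values quoted in the post.
be_u235 = 235 * -7.6      # MeV, uranium-235 (neglecting incident neutron)
be_ba144 = 144 * -8.2     # MeV, barium fragment
be_kr89 = 89 * -8.5       # MeV, krypton fragment

delta = (be_ba144 + be_kr89) - be_u235
print(f"net energy change: {delta:.1f} MeV")  # negative: released by the system
```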
The binding energy per nucleon of a nucleus is intimately related to the rest masses of its constituent particles, which can also be used to predict the energy released in a fission (or fusion) reaction.
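For instance, converting the roughly 151.3 MeV from the estimate above back into rest mass (using the standard 1 amu ≈ 931.5 MeV equivalence) shows how small the "missing" mass actually is compared to the whole nucleus:

```python
# Convert the ~151.3 MeV released (estimated above) back into rest mass.
AMU_TO_MEV = 931.5  # standard mass-energy equivalence for one atomic mass unit

energy_released = 151.3                      # MeV, from the estimate above
mass_defect = energy_released / AMU_TO_MEV   # amu
print(f"mass defect: {mass_defect:.3f} amu")
print(f"fraction of U-235 mass: {mass_defect / 235:.5f}")
```

Only a fraction of a percent of the uranium's rest mass is converted, yet that still amounts to a huge energy per atom.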
As a bonus for anyone who's interested: the reason fusion would be such a great energy source is that the binding energy per nucleon changes much more steeply on the fusion (light-nucleus) side of the curve than on the fission side. See the initial steepness of the curve in the graph in the first link.