What your monitor displays is also measured in pixels: a widescreen monitor, for example, might have a native display resolution of 1920 x 1080. "Widescreen" is a ratio, typically 16:9, meaning for every 16 units of measurement horizontally there are 9 units vertically. To illustrate, the screen on my monitor has a physical width of 20 inches and a height of 11.25 inches. If I divide the 20 inches by 16 I get 1.25, and 11.25 divided by 9 also equals 1.25, which gives us a perfect 16:9 ratio on a physical medium.
My monitor's native resolution is 1920 x 1080. If we divide 1920 by 16 we get 120, and if we divide 1080 by 9 we also get 120, which again is a perfect 16:9 ratio. This matters because as long as we're using a resolution with a 16:9 ratio on a display that is physically 16:9, our square pixels remain square. If you use a display resolution with a different ratio than the physical ratio, balls will take on egg shapes.
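The divide-by-16 and divide-by-9 arithmetic above can be sketched in a few lines of Python. The function name and the sample resolutions are my own choices for illustration:

```python
# Check whether a width:height pair reduces exactly to 16:9,
# using the same arithmetic as the text (cross-multiplication
# avoids floating-point division entirely).

def is_16_9(width, height):
    """True if width:height is exactly 16:9."""
    return width * 9 == height * 16

print(is_16_9(20, 11.25))    # physical screen in inches -> True
print(is_16_9(1920, 1080))   # native resolution -> True
print(is_16_9(1280, 1024))   # a 5:4 resolution -> False, pixels would distort
```

Running a non-16:9 resolution such as 1280 x 1024 through the check shows exactly the mismatch that turns balls into eggs on a 16:9 panel.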
On to DPI, and it may help to understand the history, specifically the history of 72 DPI. When computers first came on the market the monitors themselves were actually 72 DPI: a square inch of the display had a resolution of 72px x 72px. At that point in time, whatever you saw on the screen would print at exactly the same size on a piece of paper. Modern displays come in different resolutions and physical sizes, so this no longer applies, but 72 is still used as the default for images and to scale text. On Windows the default DPI is 96. If you go into your computer's display settings you'll find an option to set the DPI, BUT it has no bearing on the real DPI when displayed on your monitor. There is a much better and longer explanation in this article:
Apple's 72 dpi logical inch was close to accurate text size on the early Macintosh screen (1984), which is why Apple selected the 72 dpi number back then. Apple still uses that number today for text logical inches, but that number was only about the size of THAT screen. That 1984 Macintosh graphical screen was something new, it could show various font faces and sizes, italics and bold, even proportional fonts on the screen for the first time. We take this for granted today, but graphic text on the video screen was a new idea then. MS-DOS screens did not show graphic text, and instead used screens dimensioned as 80x25 fixed-width characters of one bit-mapped font.
And back then, Apple did brag about 72 dpi and about WYSIWYG (What You See Is What You Get), meaning text on that screen looked the same, and matched the size of the same text printed on paper. The size accuracy was not because of the number 72 dpi per se, it was because this logical inch size was selected to match this physical screen size. We don't hear about WYSIWYG today - it is not possible now because modern screens vary in size. But that early screen was always the same size then, and this advertising was very effective for Apple back then. However it was unfortunately also detrimental to our understanding how things work, because it let us assume 72 dpi had some significance to the video system. But it doesn't - it can't - there are NO inches and NO dpi in ANY video system - our screens are dimensioned in pixels. It was only about the size of THAT screen. That early 9 inch Mac screen was 512x342 pixels, and each of the 412x312 pixel images above would be nearly full screen then.
This is why most cameras give images a DPI setting of 72 by default; it's a legacy number. DPI is metadata and has no bearing on the pixel size of the image. What DPI does is let us give a pixel a physical dimension where none exists. The following images are examples. These two images are exactly the same image: the same pixel size, the same colors, the same file size, and if printed at exactly the same physical size they will produce identical prints. The only difference between them is the DPI setting, which as you can see makes absolutely no difference in how they are displayed on your monitor. We can change it to anything we want (*see the important warning below) and it will still be the same image.
Example 1 - 249x276 @72 DPI
Example 2 - 249x276 @600 DPI
When we transfer this image to a physical canvas, this is where DPI enters the picture: it sets the default physical size, or scale, at which an image should be printed. If you're still not following, save the two images to your computer and insert them into a Word document or another word processor. The first thing you should notice is that they are now different sizes, because Word has scaled them according to their DPI. If you were to print them, example 1 would scale to roughly 3.5" x 3.8" while example 2 would be only about 0.4" x 0.5".
A Word document has a physical dimension based on the paper size you have selected in the options. Since we now have a physical canvas, Word will scale these images to it. By default, when you print an image, most software like Word determines the printed size, or scale, of the image from its DPI. If we had a 300px x 300px image and set the DPI to 100, the default printed size would be 3" x 3". If we took the same image and set the DPI to 300, the default printed size would be 1" x 1".
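The printed-size rule above is just pixels divided by DPI. A minimal sketch, using the 300px example from the text (the function name is my own):

```python
# Default printed size in inches: pixel dimensions divided by DPI.

def print_size_inches(px_w, px_h, dpi):
    """Return (width, height) in inches for a given pixel size and DPI."""
    return (px_w / dpi, px_h / dpi)

print(print_size_inches(300, 300, 100))  # -> (3.0, 3.0) inches
print(print_size_inches(300, 300, 300))  # -> (1.0, 1.0) inches
```

Note that changing the DPI from 100 to 300 shrinks the default print by a factor of three in each dimension while the pixel data stays identical.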
The other use for DPI is when scanning prints; in this case you're telling the scanner how much information to gather per inch of physical surface. The higher you set the DPI, the more detailed the scan will be. If you have a print with physical dimensions of 6" x 4" and set the DPI to 300, you'll produce an image that is 1800px x 1200px, or a little more than 2 megapixels.
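Scanning is the inverse of printing: physical inches times DPI gives pixels. A quick sketch of the 6" x 4" example above (the function name is my own):

```python
# Scan resolution: physical size in inches multiplied by scanner DPI
# gives the pixel dimensions of the resulting image.

def scan_pixels(in_w, in_h, dpi):
    """Return (width, height) in pixels for a print scanned at a given DPI."""
    return (in_w * dpi, in_h * dpi)

w, h = scan_pixels(6, 4, 300)
print(w, h)                      # 1800 1200
print(w * h / 1_000_000)         # 2.16 megapixels
```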
*On a final note, be careful when changing the DPI: some image editors use the physical size to scale images, and changing the DPI could in fact reduce the pixel size under certain circumstances.