Delta time is the time it takes the computer to do all the processing and rendering for a single frame. It is recalculated every frame, so it fluctuates depending on how much work the last frame required.
Its purpose here is to regulate the speed at which the game runs. Without factoring in delta time, the faster the computer is, the faster the game appears to run (and the slower the computer, the slower the game). For example, if you wanted a player to move at 50 units per second and you had an average frame rate of 30 fps, you could say:
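A minimal sketch of that per-frame update (the `Player` class, `update` function, and `x` field are just placeholder names, not any particular engine's API):

```python
class Player:
    def __init__(self):
        self.x = 0.0  # position along one axis, in game units

def update(player):
    # Move 50/30 units each frame, assuming the game always runs at 30 fps.
    player.x += 50 / 30
```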
This would move the player forward by 50/30 units per frame, which at 30 fps adds up to 50 units per second. However, what happens if the frame rate is 60? The player still moves 50/30 units per frame, but twice as many frames run each second, so the player appears to move at 100 units per second instead.
To solve this, we use delta time in place of the constant 30 (or more accurately, in place of 1/30, since we multiply by delta time rather than divide by the frame rate).
This gets us:
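Roughly like this, with `dt` being the duration of the last frame in seconds (again, the names are placeholders):

```python
def update(player, dt):
    # Move 50 units per second, regardless of frame rate.
    # dt is the time (in seconds) the last frame took to process.
    player.x += 50 * dt
```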
If the frame rate is 30 fps, we get a delta time of 1/30 ≈ 0.0333, and the player moves 50 * 0.0333 ≈ 1.67 units per frame. If the frame rate were instead 75 fps, we get a delta time of 1/75 ≈ 0.0133, and the player moves 50 * 0.0133 ≈ 0.67 units per frame. Of course, the real dt is far more precise than the rounded values used here, so the player moves at a constant rate no matter what the frame rate is.
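A quick sanity check of that arithmetic (a hypothetical standalone script, assuming a perfectly steady frame rate at each setting):

```python
# Per-frame movement times frames per second should always come out
# to the target speed of 50 units per second.
for fps in (30, 60, 75, 144):
    dt = 1.0 / fps        # frame time in seconds
    per_frame = 50 * dt   # units moved each frame
    print(f"{fps} fps: {per_frame:.4f} units/frame -> {per_frame * fps:.1f} units/second")
```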
(Others, please correct me if I'm wrong.)