It means to find the limit of the function as x approaches 0 from the right side of the number line.
That is, x gets closer to zero through positive values: 0.1, then 0.01, then 0.001, then 0.0001, and so on.
limx→0+ x = 0, because x becomes 0.1, 0.01, 0.001, 0.0001, ... → 0
An example: limx→0+(1/x) =
when x = 0.1 ---> 1/x = 1/0.1 = 10
when x = 0.01 ---> 1/x = 1/0.01 = 100
when x = 0.001 ---> 1/x = 1/0.001 = 1000
so 1/x grows without bound, and the limit is ∞.
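The table above can be checked numerically. A quick sketch in plain Python (no libraries needed):

```python
# Evaluate 1/x as x approaches 0 from the right.
for x in [0.1, 0.01, 0.001, 0.0001]:
    print(f"x = {x} -> 1/x = {1 / x}")
# The values 10, 100, 1000, 10000, ... grow without bound,
# which is what "the limit is +infinity" means.
```

Each time x shrinks by a factor of 10, 1/x grows by a factor of 10, so no finite number bounds the values.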
However, limx→0-(1/x) =
when x = -0.1 ---> 1/x = 1/(-0.1) = -10
when x = -0.01 ---> 1/x = 1/(-0.01) = -100
when x = -0.001 ---> 1/x = 1/(-0.001) = -1000
so 1/x decreases without bound, and the limit is -∞.
Thus limx→0 (1/x) doesn't exist: the two-sided limit requires the right-hand limit (∞) and the left-hand limit (-∞) to agree, and here they don't.
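The disagreement between the two sides can be seen directly by evaluating 1/x at matching positive and negative points. A minimal sketch in plain Python:

```python
def f(x):
    """The function whose limit at 0 we are probing."""
    return 1 / x

# Compare the approach from the right (x > 0) and the left (x < 0).
for x in [0.1, 0.01, 0.001]:
    print(f"f({x}) = {f(x)},  f({-x}) = {f(-x)}")
# The right-hand values head to +infinity while the left-hand values
# head to -infinity, so the two one-sided limits disagree and
# lim x->0 of 1/x does not exist.
```

Note that a limit "equal to ∞" is itself a way of saying the function has no finite limit; here the two sides don't even diverge in the same direction.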