You could use the functional built-in reduce (moved to functools.reduce in Python 3). It repeatedly applies a function, here an anonymous lambda, to a list of values, building up an aggregate:
>>> reduce(lambda x, y: x * (y + 1), [1, 2, 3])
12
which would be equivalent to:
>>> (1 * (2 + 1)) * (3 + 1)
12
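To make the folding explicit, here is a minimal sketch of what reduce does under the hood (my_reduce is a hypothetical name, and the sketch ignores the corner case of a legitimately-None initializer):

```python
from functools import reduce  # a builtin on Python 2, functools on Python 3

def my_reduce(func, values, initializer=None):
    """Illustrative re-implementation of reduce: fold func over values
    left to right, starting from the first item (or the initializer)."""
    it = iter(values)
    acc = next(it) if initializer is None else initializer
    for v in it:
        acc = func(acc, v)  # combine the running aggregate with the next value
    return acc

print(my_reduce(lambda x, y: x * (y + 1), [1, 2, 3]))  # 12, same as reduce
```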
If you need a different initial value, you can pass it as the third argument to reduce (the initializer):
>>> reduce(lambda x, y: x * (y + 1), [1, 2, 3], 10)
240
>>> (((10 * (1 + 1)) * (2 + 1)) * (3 + 1))
240
As @DSM points out in the comments, you probably want:
>>> reduce(lambda x, y: x * (y + 1), [1, 2, 3], 1) # initializer is 1
which can be written more succinctly with the operator module and a generator expression as:
>>> from operator import mul
>>> reduce(mul, (v + 1 for v in d.values()))
I would have guessed that the generator variant would be faster, but on Python 2.7 it seems it is not (at least for very small dictionaries):
In [10]: from operator import mul
In [11]: d = {'a' : 1, 'b' : 2, 'c' : 3}
In [12]: %timeit reduce(lambda x, y: x * (y + 1), d.values(), 1)
1000000 loops, best of 3: 1 us per loop
In [13]: %timeit reduce(mul, (v + 1 for v in d.values()))
1000000 loops, best of 3: 1.23 us per loop