I'll use some pseudocode to explain.
There's an outer function that produces b random bytes by XORing two b-byte arrays obtained from two wrapper functions (one that eventually gets bytes from /dev/urandom, and one that gets bytes from your custom entropy mixer):
function get_random_bytes(b):
    return wrapper1_get_bytes_from_dev_urandom(b) ^ wrapper2_get_bytes_from_custom_mixer(b)
This would look fine to me if at least one of those wrapper functions generated uniform output, e.g. if they looked like this:
function wrapper1_get_bytes_from_dev_urandom(b):
    return get_bytes_from_dev_urandom(b)

function wrapper2_get_bytes_from_custom_mixer(b):
    return get_bytes_from_custom_mixer(b)
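The reason the pass-through version would be fine: XORing anything with an independent, uniformly distributed value yields a uniformly distributed result, however biased the other source is. Here's a quick exhaustive check in Python (the skewed distribution is invented purely for the demo):

from collections import Counter

# A deliberately skewed second source: values 0..9 have weight 1,
# except value 7, which gets an extra weight of 90 (total 100).
weights = {v: 1 for v in range(10)}
weights[7] += 90

# XOR every possible uniform byte u with every skewed value m,
# accumulating the weight of each resulting output byte.
out = Counter()
for u in range(256):
    for m, w in weights.items():
        out[u ^ m] += w

# All 256 output bytes accumulate identical weight, i.e. the
# XOR is exactly uniform despite the skewed input.
print(set(out.values()))   # {100}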
But they don't look like those pass-through versions; instead they both look something like this:
function wrapper1_get_bytes_from_dev_urandom(b):
    do
        bytes = get_bytes_from_dev_urandom(b)
    loop while bytes are all 0's
    return bytes

function wrapper2_get_bytes_from_custom_mixer(b):
    do
        bytes = get_bytes_from_custom_mixer(b)
    loop while bytes are all 0's
    return bytes
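For concreteness, here's a minimal runnable Python sketch of the same construction. os.urandom stands in for /dev/urandom, and since I don't have your mixer, secrets.token_bytes is used as a placeholder for it purely so the sketch runs:

import os
import secrets

def _nonzero_bytes(source, b):
    # Redraw until at least one byte is nonzero -- this rejection
    # loop is what removes the all-0's value from each source.
    while True:
        data = source(b)
        if any(data):
            return data

def wrapper1_get_bytes_from_dev_urandom(b):
    return _nonzero_bytes(os.urandom, b)

def wrapper2_get_bytes_from_custom_mixer(b):
    # Stand-in for the custom entropy mixer (any independent
    # source would do; secrets.token_bytes is only a placeholder).
    return _nonzero_bytes(secrets.token_bytes, b)

def get_random_bytes(b):
    x = wrapper1_get_bytes_from_dev_urandom(b)
    y = wrapper2_get_bytes_from_custom_mixer(b)
    # XOR the two b-byte arrays; since neither input can be all
    # 0's, the all-0's output ends up slightly over-represented.
    return bytes(i ^ j for i, j in zip(x, y))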
Neither wrapper ever returns all 0's, and therefore get_random_bytes() will slightly favor returning all 0's[1]. If you only use get_random_bytes() with b=32 to create private keys and k's for signatures, this is a non-issue, presuming you'd discard an all-0's result anyway. However, if you use get_random_bytes() for other purposes, or if keys are produced by calling get_random_bytes(1) 32 times, it might be an issue (especially when b is small). I couldn't say how much of an issue; I'm no cryptographer...
[1] Specifically, assuming both wrappers are independent and would otherwise be uniform: over unrestricted inputs, each output z is produced by the 2^(8b) pairs (x, x XOR z); the rejection loops discard two of those pairs when z is nonzero (x = 0 and x = z) but only one when z is all 0's (x = 0). So get_random_bytes(b) will produce an all-0's output about (2^(8b) - 1) / (2^(8b) - 2) times more frequently than any other particular result, which for b=1 is a perhaps noticeable 0.4% greater number of 0's produced.
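As a sanity check on that ratio (again assuming both sources are independent and uniform over the 255 nonzero byte values), enumerating all input pairs for b=1 reproduces 255/254 exactly:

from collections import Counter

# Enumerate every pair of nonzero bytes -- the only values the
# wrappers can return for b=1 -- and tally the XOR of each pair.
counts = Counter(x ^ y for x in range(1, 256) for y in range(1, 256))

print(counts[0])              # 255 pairs give 0 (every x == y)
print(counts[1])              # 254 pairs give any particular nonzero byte
print(counts[0] / counts[1])  # 1.0039..., i.e. ~0.4% more 0's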