javascript - Random Number, Math.floor(...) vs Math.ceil(...)


I've seen a lot of code where random numbers are generated like this:

// random integers in the interval [1, 10]
Math.floor(Math.random() * 10 + 1);

Anyway, I feel I'm missing something. Why don't people use the more succinct way

Math.ceil(Math.random() * 10);

?

I tried to test its randomness, and it seems fine so far.

In fact, the following code

// generate random integers from 1 to 4
var frequencies = [ 0, 0, 0, 0, 0 ]; // not using the first place
var randomNumber;
for ( var i = 0; i < 1*1000*1000; ++i ) {
   randomNumber = Math.ceil(Math.random()*4);
   frequencies[randomNumber]++;
}
for ( var i = 1; i <= 4; ++i ) {
   console.log(i + ": " + frequencies[i]);
}

prints out

1: 250103
2: 250161
3: 250163
4: 249573

What am I missing?

Quick OT: is there a more succinct way to declare and initialize frequencies? I mean something like frequencies[5] = { 0 }; in C++...

As stated in the MDN reference for Math.random():

Returns a floating-point, pseudo-random number in the range [0, 1); that is, from 0 (inclusive) up to but not including 1 (exclusive), which you can then scale to your desired range.

Since Math.random() can return 0, Math.ceil(Math.random()*10) can return 0, a value outside the [1..10] range. Your frequency test doesn't catch this because Math.random() returns exactly 0 only extremely rarely.
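
To make the edge case concrete, here is a minimal sketch (the randInt helper is just an illustrative name, not part of the original answer):

// If Math.random() happens to return exactly 0:
Math.ceil(0 * 10);       // 0  -> outside [1, 10]
Math.floor(0 * 10 + 1);  // 1  -> still inside [1, 10]

// The floor-based pattern also generalizes to any inclusive range:
function randInt(min, max) {
   return Math.floor(Math.random() * (max - min + 1)) + min;
}
console.log(randInt(1, 10)); // always an integer between 1 and 10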


About the second question, see Most efficient way to create a zero filled JavaScript array?
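
For example, assuming ES2015 support (Array.prototype.fill), a zero-filled array can be declared in one line; a minimal sketch:

var frequencies = new Array(5).fill(0); // [0, 0, 0, 0, 0]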

