
Q1 DEFINITION

For two positive integers a and b, define the function h(a, b) as the greatest common factor (GCF) of a and b. Let A be a set of n positive integers. G(A), the GCF of the elements of set A, is computed by repeatedly applying the function h. The minimum number of times h must be applied to compute G(A) is
a. ½ n
b. (n-1)
c. n
d. None of these

Let p and q be any two elements of the set A.

For the computation of the GCF of the elements of A, we can replace both p and q by the single value GCF(p, q), and the result is unchanged.

So each application of the function h reduces the number of elements of A by 1 (the two numbers p and q are replaced by the one number GCF(p, q)).

Starting with n elements, we must shrink the set down to a single element, and each call to h removes exactly one element. Therefore, the minimum number of times the function h should be called is n - 1. The answer is option (b).
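The argument above can be sketched in code. The helper below (`gcf_of_set` is a name chosen here for illustration) repeatedly replaces two elements by their GCF, counting the calls to h; Python's `math.gcd` plays the role of h(a, b).

```python
from math import gcd

def gcf_of_set(nums):
    """Compute the GCF of a collection of positive integers by
    repeatedly applying h(a, b) = gcd(a, b).
    Returns (gcf, number of times h was applied)."""
    nums = list(nums)
    calls = 0
    while len(nums) > 1:
        # Replace two elements p and q by h(p, q):
        # the set shrinks by exactly 1 per call.
        p = nums.pop()
        q = nums.pop()
        nums.append(gcd(p, q))
        calls += 1
    return nums[0], calls

# For a set of n = 4 elements, h is used n - 1 = 3 times.
print(gcf_of_set([12, 18, 24, 30]))  # -> (6, 3)
```

Whichever pair is combined at each step, the count is always n - 1, since every call reduces the set's size by exactly one.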
