Why is this Python code running so long, and is there any way to get the output quickly?
I have a Python program to find the largest prime factor of a number; my code is below. When I give it a number of up to 8 digits it takes a few minutes, but when I tried running it on the 12-digit number 600851475143 it ran far longer and still didn't produce any output or error. So is there any way to get the output for 12-digit numbers quickly?
def large_prime_fact(num):
    prime_factors=[]
    if num==2 or num==3:
        return(prime_factors.append(num))
    if num%2==0:
        prime_factors.append(2)
    for i in range(3,num,2):
        cnt=0
        if num%i==0:
            for j in range(2,i):
                if i%j==0:
                    cnt+=1
            if cnt==0 and i!=2:
                prime_factors.append(i)
    return prime_factors

if __name__=='__main__':
    number=int(input('Enter the number:'))
    print('The Largest prime factor for',number,'is :',max(large_prime_fact(number)))
Solution 1:[1]
As Karl Knechtel pointed out, your approach is seriously flawed. Here on SO there are lots of questions about prime numbers and factorization, which you may want to read.
That said, here's my solution. It may still benefit from further optimizations, but it solves your 600851475143 example in about 1 ms.
from math import sqrt
from itertools import takewhile

## define a prime numbers generator
def prime_gen():
    primes = []
    n = 2
    while n < 2**64:
        if primes == []:
            primes.append(n)
            yield n
        elif len(primes) == 1:
            n = 3
            primes.append(n)
            yield n
        else:
            n += 2
            limit = int(sqrt(n))+1
            # n is prime if no previously found prime up to sqrt(n) divides it
            for p in takewhile(lambda x: x < limit, primes):
                if n%p==0:
                    break
            else:
                primes.append(n)
                yield n
## factorize and return largest factor
def largest_prime(n):
    target = n
    for x in prime_gen():
        # divide out this prime factor completely
        while target % x == 0:
            target //= x
        if target == 1:
            return x
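For example, calling it on the number from the question (this usage line is added here for illustration and was not part of the original answer):

print(largest_prime(600851475143))   # prints 6857, the largest prime factor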
Solution 2:[2]
You have two algorithms here:
- An algorithm for finding all factors of num, which contains
- An algorithm for checking the primality of each factor you find
Both algorithms are implemented incredibly inefficiently. The primality tester knows, from the moment it finds a single smaller value that divides the candidate, that it is not prime, but you keep counting all of its divisors and then use the count only to check whether it is zero or non-zero. Since the vast majority of composite numbers have a smallish factor, you could simply break out of your checking loop as soon as you find even one divisor, knowing immediately that the number is composite and avoiding a huge amount of work.
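As a rough sketch of that one change (an is_prime helper written for this illustration, not code from the question), the check can stop at the first divisor found and only test candidates up to the square root:

def is_prime(n):
    # trial division up to sqrt(n); stop at the first divisor found
    if n < 2:
        return False
    if n % 2 == 0:
        return n == 2
    i = 3
    while i * i <= n:
        if n % i == 0:
            return False  # found a divisor, so n is composite; no need to count the rest
        i += 2
    return True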
Even better, you could pre-compute the prime numbers up to the square root of the target number (you only need to go up to the square root, because any factor larger than it pairs with a cofactor below it, so it can be found without exhaustive search). Efficiently computing all prime numbers in a fixed range, especially such a tiny range as the square root of your target number, which is under 1M, can be done much more cheaply than repeated trial division; search for the Sieve of Eratosthenes for a decent algorithm for smallish primes.
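For reference, here is a minimal textbook Sieve of Eratosthenes sketch (generic code added for this write-up, not taken from either answer) that returns all primes up to a given limit:

def sieve(limit):
    # Sieve of Eratosthenes: cross out the multiples of each prime
    is_prime = [True] * (limit + 1)
    is_prime[0] = is_prime[1] = False
    for p in range(2, int(limit ** 0.5) + 1):
        if is_prime[p]:
            for multiple in range(p * p, limit + 1, p):
                is_prime[multiple] = False
    return [i for i, flag in enumerate(is_prime) if flag]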
Replacing the slow per-factor primality tests from the second algorithm with a cheap precompute of all possible prime factors below the square root (which also lets you cheaply determine any prime factor above the square root) should solve the problem. There are additional optimizations available, but that gets you 99% of the way there.
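Putting those pieces together, here is a hedged sketch of that approach (using the sieve helper above; it illustrates the idea and is not code from the original answer):

def largest_prime_factor(num):
    # divide out every prime up to sqrt(num); anything left over (> 1)
    # is a single prime factor larger than the square root
    largest = None
    for p in sieve(int(num ** 0.5) + 1):
        while num % p == 0:
            largest = p
            num //= p
    if num > 1:
        largest = num  # the leftover cofactor above the square root is prime
    return largest

For 600851475143 the sieve only needs primes up to about 775146, and the function returns 6857.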
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow
Solution | Source |
---|---|
Solution 1 | gimix |
Solution 2 | |