Problem statement: given n, k, A, and B, you must reduce n to 1 using two operations. You may subtract 1 from n, which costs A coins per subtraction; and, when the current n is a multiple of k, you may instead divide n by k, which costs B coins. Output the minimum total cost of turning n into 1.
Solution: my first attempt was a straightforward greedy simulation. Whenever n is a multiple of k, compare the cost of one division (B) with the cost of subtracting 1 repeatedly from n down to n/k, i.e. (n - n/k)*A, and take the cheaper option; when n is not a multiple of k, just subtract 1. The idea is correct, but it times out: when n is not a multiple of k and the input is large, decrementing one step at a time is far too slow. The optimization is to stop subtracting one by one and instead jump straight down to the nearest multiple of k in a single step with n -= n % k, adding (n % k) * A to the cost. Only then is the divide-versus-subtract comparison needed. This reduces the running time dramatically: the original simulation took 14 seconds on the fourth test case, while the optimized version finishes it in 0 seconds.
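A quick worked example (one I made up myself, not taken from the problem's samples): let n = 9, k = 2, A = 3, B = 1. Since 9 is not a multiple of 2, jump down by 9 % 2 = 1 to 8, paying 1 * 3 = 3. From 8, one division costs 1 while subtracting down to 4 would cost (8 - 4) * 3 = 12, so divide: n = 4, total cost 4. The steps 4 -> 2 and 2 -> 1 likewise each cost 1, giving a final answer of 6.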
#include <bits/stdc++.h>
using namespace std;

int main() {
    long long n, k, a, b, ans = 0;
    cin >> n >> k >> a >> b;
    // double s = clock();
    while (n > 1) {
        ans += n % k * a;           // jump down to the nearest multiple of k in one step
        n -= n % k;
        if ((n - n / k) * a < b)    // dividing is no longer worth it: subtract the rest
            ans += (n - 1) * a, n = 1;
        else                        // otherwise pay b and divide by k
            ans += b, n = n / k;
    }
    // double t = clock();
    // cout << (t - s) / 1000 << " ms" << endl;
    cout << ans << endl;
    return 0;
}
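As a sanity check, feeding the program the same made-up input as in the example above (9 2 3 1 on standard input) should print 6.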