Farmer John has so very many jobs to do! In order to run the farm efficiently, he must make money on the jobs he does, each one of which takes just one time unit.
His work day starts at time 0 and has 1,000,000,000 time units (!). He currently can choose from any of N (1 ≤ N ≤ 100,000) jobs, conveniently numbered 1..N, for work to do. It is possible but extremely unlikely that he has time for all N jobs, since he can only work on one job during any time unit and the deadlines tend to fall so that he cannot perform all the tasks.
Job i has deadline Di (1 ≤ Di ≤ 1,000,000,000). If he finishes job i by then, he makes a profit of Pi (1 ≤ Pi ≤ 1,000,000,000).
What is the maximum total profit that FJ can earn from a given list of jobs and deadlines? The answer might not fit into a 32-bit integer.
Multiple test cases. For each case:
* Line 1: A single integer: N
* Lines 2..N+1: Line i+1 contains two space-separated integers: Di and Pi
For each case, output a single line containing the maximum possible profit FJ can earn.
Sample input:
3
2 10
1 5
1 7

Sample output:
17
Complete job 3 (1, 7) at time 1 and complete job 1 (2, 10) at time 2 to maximize the earnings (7 + 10 → 17).
The greedy strategy for this problem is:
Sort the jobs by deadline and scan them in order, tentatively accepting every job: add its profit to the running total and push the profit onto a min-heap. Whenever the number of accepted jobs exceeds the current job's deadline, one accepted job must be dropped; drop the least profitable one, i.e. subtract the heap's minimum from the total and pop it. The min-heap lets us find and remove the smallest accepted profit in O(log n), for O(n log n) overall.
AC code:
```cpp
#include <stdio.h>
#include <queue>
#include <vector>
#include <algorithm>
using namespace std;
#define ll long long
#define MAX 100008

struct nod {
    ll d;
    ll p;
} a[MAX];

bool cmp(nod a, nod b) {
    if (a.d != b.d) return a.d < b.d;
    return a.p < b.p;
}

int main() {
    int n;
    while (scanf("%d", &n) != EOF) {
        // Min-heap of the profits of currently accepted jobs.
        priority_queue<ll, vector<ll>, greater<ll> > q;
        for (int i = 0; i < n; i++)
            scanf("%lld%lld", &a[i].d, &a[i].p);  // %lld: fields are long long
        sort(a, a + n, cmp);                      // process jobs in deadline order
        ll sum = 0;
        ll now = 0;                               // count of accepted jobs
        for (int i = 0; i < n; i++) {
            now++;
            sum += a[i].p;
            q.push(a[i].p);
            if (now > a[i].d) {                   // too many jobs for this deadline:
                sum -= q.top();                   // drop the least profitable one
                q.pop();
                now--;
            }
        }
        printf("%lld\n", sum);
    }
}
```