Java Scheduled Tasks with Quartz (II): Data Transfer

1 Foreword

In real-world development we often need to pass parameters to a task. With the tasks created so far, all we could hand to the builder was a class, in the form JobBuilder.newJob(DataJob.class). In fact, the JobDetail interface provides a getJobDataMap() method precisely for transferring data.

2 A First Look at Data Transfer

2.1 JobDataMap

Reading the JobDataMap source code, we find that it is a Map implementation keyed by String, and that it also carries an isDirty flag. It has methods such as getCharacterFromString(), whose principle is to fetch the Object from the map and convert it to the target type (Character in this case); the other typed getters work the same way. Also note the generic types declared by the parent class: it is built on a String-keyed map of Objects.
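Since the source screenshots are not reproduced here, the following is a minimal sketch of the idea rather than the actual Quartz source. The hierarchy is roughly JobDataMap -> StringKeyDirtyFlagMap -> DirtyFlagMap<String, Object>: a String-keyed map of Objects with a dirty flag, where each typed getter simply looks the value up and casts it. The class and method names below are illustrative only.

import java.util.HashMap;
import java.util.Map;

// Illustrative stand-in for the JobDataMap idea; not Quartz API.
public class TypedGetterSketch {

    private final Map<String, Object> wrappedMap = new HashMap<>();
    private boolean dirty = false;          // plays the role of the isDirty flag

    public void put(String key, Object value) {
        dirty = true;                       // any mutation marks the map as dirty
        wrappedMap.put(key, value);
    }

    // Same principle as the typed getters: fetch the Object and cast it.
    public Character getCharacter(String key) {
        Object obj = wrappedMap.get(key);
        return (Character) obj;             // ClassCastException if the stored value has another type
    }

    public boolean isDirty() {
        return dirty;
    }
}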

 

 

 

2.2 JobExecutionContext

If a Job needs data while performing its task, it naturally has to obtain it from its only runtime parameter, the JobExecutionContext. The JobExecutionContext interface provides a getJobDetail() method for obtaining the JobDetail, and JobDetail in turn exposes getJobDataMap() for retrieving the JobDataMap. Following this idea, we would naturally expect the data transfer to be completed in JobBuilder (the analysis later on shows that data can also be passed through the Trigger, but that comes later).
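As a quick orientation, here is a minimal Job sketch showing the two access paths used later in this article: the map attached to the JobDetail, and the merged map that also contains entries passed through the Trigger (see section 4). The class name ReadDataJob and the key "data" are placeholders for this example.

import org.quartz.Job;
import org.quartz.JobDataMap;
import org.quartz.JobExecutionContext;

public class ReadDataJob implements Job {

    @Override
    public void execute(JobExecutionContext context) {
        // the map that was attached to the JobDetail via JobBuilder.usingJobData(...)
        JobDataMap detailMap = context.getJobDetail().getJobDataMap();

        // the merged view that also includes entries passed through the Trigger (see section 4)
        JobDataMap mergedMap = context.getMergedJobDataMap();

        System.out.println(detailMap.getString("data"));
        System.out.println(mergedMap.getString("data"));
    }
}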

2.3 The usingJobData methods

usingJobData has several signatures, among them usingJobData(String dataKey, Boolean value), usingJobData(String dataKey, Double value), and usingJobData(JobDataMap newJobDataMap). Let's look at the first kind.

In each of these methods the dataKey parameter is of type String, corresponding to the first generic type of DirtyFlagMap<String, Object>.

 

 

The jobDataMap field is itself a JobDataMap; the usingJobData methods put the passed-in data into this JobDataMap, and the build() method then assigns that map to the jobDataMap of the JobDetail it creates.
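In place of the source screenshots, the following simplified class (my own sketch, not the Quartz source) captures the mechanism just described: the builder keeps an internal map, usingJobData(...) puts entries into it, and build() hands the accumulated map to the object it produces.

import java.util.HashMap;
import java.util.Map;

// Hypothetical, simplified stand-in for JobBuilder; not the real Quartz class.
public class BuilderDataSketch {

    private final Map<String, Object> jobDataMap = new HashMap<>();

    public BuilderDataSketch usingJobData(String dataKey, String value) {
        jobDataMap.put(dataKey, value);     // each overload just stores the pair in the builder's map
        return this;                        // return the builder so calls can be chained fluently
    }

    public Map<String, Object> build() {
        // the real build() copies this map onto the JobDetail it creates;
        // here we simply return a copy to show where the data ends up
        return new HashMap<>(jobDataMap);
    }
}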

2.4 Test

Based on the analysis above, we can write a simple test class:

import org.quartz.Job;
import org.quartz.JobBuilder;
import org.quartz.JobDetail;
import org.quartz.JobExecutionContext;
import org.quartz.Scheduler;
import org.quartz.SchedulerException;
import org.quartz.Trigger;
import org.quartz.TriggerBuilder;
import org.quartz.impl.StdSchedulerFactory;

/**
 * @author pancc
 * @version 1.0
 */
public class DataJobDemo {

    public static void main(String[] args) throws SchedulerException, InterruptedException {
        JobDetail detail = JobBuilder.newJob(DataJob.class)
                .withIdentity("data", "group0")
                .usingJobData("data", "hello")
                .build();

        Trigger trigger = TriggerBuilder.newTrigger()
                .withIdentity("data_trigger")
                .startNow()
                .build();

        Scheduler scheduler = new StdSchedulerFactory().getScheduler();

        scheduler.start();
        scheduler.scheduleJob(detail, trigger);
        /*
         * Shut down after 2 seconds
         */
        Thread.sleep(2_000);
        scheduler.shutdown();
    }

    public static class DataJob implements Job {

        @Override
        public void execute(JobExecutionContext context) {
            String data = context.getJobDetail().getJobDataMap().getString("data");
            System.out.printf("get data {%s} from map\n", data);
        }
    }
}

 

Running this program, the task successfully prints the expected line to the console: get data {hello} from map.

 

3 Data Transfer Revisited

According to the official documentation, values set in the JobDataMap are automatically mapped onto fields of the Job class, with the implicit requirement that the fields and their setter methods follow the JavaBean convention.

This time we cannot grasp the design simply by skimming the source code as we did above, so let's use an example and set a breakpoint to watch the magic of automatic injection.

3.1 Simple test

We give the Job a name field that follows the JavaBean convention, and we pass the JobDetail an entry whose dataKey is name:

import org.quartz.Job;
import org.quartz.JobBuilder;
import org.quartz.JobDetail;
import org.quartz.JobExecutionContext;
import org.quartz.JobExecutionException;
import org.quartz.Scheduler;
import org.quartz.SchedulerException;
import org.quartz.Trigger;
import org.quartz.TriggerBuilder;
import org.quartz.impl.StdSchedulerFactory;

/**
 * @author pancc
 * @version 1.0
 */
public class InjectDataDemo {

    public static void main(String[] args) throws SchedulerException, InterruptedException {
        JobDetail detail = JobBuilder.newJob(InjectData.class)
                .withIdentity("inject", "group0")
                .usingJobData("name", "Alex")
                .build();

        Trigger trigger = TriggerBuilder.newTrigger()
                .withIdentity("inject_trigger")
                .startNow()
                .build();

        Scheduler scheduler = new StdSchedulerFactory().getScheduler();

        scheduler.start();
        scheduler.scheduleJob(detail, trigger);
        /*
         * Shut down after 2 seconds
         */
        Thread.sleep(2_000);
        scheduler.shutdown();
    }

    public static class InjectData implements Job {

        private String name;

        public void setName(String name) {
            this.name = name;
        }

        @Override
        public void execute(JobExecutionContext context) throws JobExecutionException {
            System.out.printf("hello, my name is %s \n", name);
        }
    }
}

 

Looking at the console, we can see that the value was indeed injected into the name field (hello, my name is Alex).

 

3.2 Exploring the injection principle

 

Now set a breakpoint on the line this.name = name; and use the debugger's call stack to see what actually happens.

 

There are four entry points on the call stack, from outermost to innermost:

  • First entry: shell.initialize(qs); this belongs to the run method of the QuartzSchedulerThread, the worker thread started by the QuartzScheduler.
  • Second entry: job = sched.getJobFactory().newJob(firedTriggerBundle, scheduler); this is where the task instance is created.
  • Third entry: setBeanProps(job, jobDataMap); from the method name we can guess that it uses the jobDataMap to set the bean's field values.
  • Fourth entry: setMeth.invoke(obj, new Object[]{parm}); from the method name we can guess that it sets the value by invoking the corresponding field's setter via reflection.

Stepping through these four entries with the debugger's variable view, we can observe Quartz repeatedly taking keys from the JobDataMap, using each dataKey to look up the corresponding setter method on the InjectData class, and invoking that setter on the instance with the value.
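The following is a minimal sketch of that mechanism, not Quartz's actual PropertySettingJobFactory code: for each key in the data map it derives the setter name, looks the method up by reflection, and invokes it on the job instance (all names here are illustrative).

import java.lang.reflect.Method;
import java.util.Map;

public class SetterInjectionSketch {

    /** Calls obj.setXxx(value) for every map entry that has a matching single-argument setter. */
    public static void injectBeanProps(Object obj, Map<String, Object> data) throws Exception {
        for (Map.Entry<String, Object> entry : data.entrySet()) {
            // "name" -> "setName"
            String setterName = "set"
                    + Character.toUpperCase(entry.getKey().charAt(0))
                    + entry.getKey().substring(1);
            Object value = entry.getValue();

            for (Method method : obj.getClass().getMethods()) {
                if (method.getName().equals(setterName)
                        && method.getParameterCount() == 1
                        && method.getParameterTypes()[0].isAssignableFrom(value.getClass())) {
                    // corresponds to the setMeth.invoke(obj, new Object[]{parm}) seen in the call stack
                    method.invoke(obj, value);
                    break;
                }
            }
        }
    }
}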

  

 

4 Passing Data Through the Trigger, and Data Merging

When constructing a Trigger, we notice that TriggerBuilder offers the same usingJobData methods as JobBuilder. What happens if, at this point, we pass a new value through the Trigger for a key the JobDetail already uses? Before writing test code, let's return to the PropertySettingJobFactory class we analyzed at the breakpoint and look carefully at that method again.

 

Also note the Map type wrapped by DirtyFlagMap: a plain HashMap.

 

Therefore, merging the data behaves just like merging two HashMaps: when a duplicate key occurs, the new value overwrites the old one, while keys that do not conflict are kept.
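As a plain-Java illustration of that merge rule, assuming (as the test in 4.1 confirms) that the Trigger's map is applied after the JobDetail's map:

import java.util.HashMap;
import java.util.Map;

public class MergeSketch {
    public static void main(String[] args) {
        Map<String, Object> jobData = new HashMap<>();
        jobData.put("name", "Alex");              // set on the JobDetail

        Map<String, Object> triggerData = new HashMap<>();
        triggerData.put("name", "Alice");         // duplicate key, set on the Trigger
        triggerData.put("age", 50);               // non-conflicting key, only on the Trigger

        Map<String, Object> merged = new HashMap<>(jobData);
        merged.putAll(triggerData);               // later puts overwrite duplicate keys

        System.out.println(merged);               // two entries: name=Alice and age=50
    }
}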

4.1 Verifying the merge

import org.quartz.Job;
import org.quartz.JobBuilder;
import org.quartz.JobDetail;
import org.quartz.JobExecutionContext;
import org.quartz.JobExecutionException;
import org.quartz.Scheduler;
import org.quartz.SchedulerException;
import org.quartz.Trigger;
import org.quartz.TriggerBuilder;
import org.quartz.impl.StdSchedulerFactory;

/**
 * @author pancc
 * @version 1.0
 */
public class DuplicatedDataDemo {

    public static void main(String[] args) throws SchedulerException, InterruptedException {
        JobDetail detail = JobBuilder.newJob(DuplicatedData.class)
                .withIdentity("inject", "group0")
                .usingJobData("name", "Alex")
                .build();

        Trigger trigger = TriggerBuilder.newTrigger()
                .withIdentity("inject_trigger")
                .usingJobData("name", "Alice")
                .usingJobData("age", 50)
                .startNow()
                .build();

        Scheduler scheduler = new StdSchedulerFactory().getScheduler();

        scheduler.start();
        scheduler.scheduleJob(detail, trigger);
        /*
         * Shut down after 2 seconds
         */
        Thread.sleep(2_000);
        scheduler.shutdown();
    }

    public static class DuplicatedData implements Job {

        private String name;

        private Integer age;

        public void setName(String name) {
            this.name = name;
        }

        public void setAge(Integer age) {
            this.age = age;
        }

        @Override
        public void execute(JobExecutionContext context) throws JobExecutionException {
            System.out.printf("hello, my name is %s , my age is %d \n", name, age);
        }
    }
}

 

We can observe that the name we set on the JobDetail has been replaced by the value carried by the Trigger, and that the new age value held by the Trigger was passed to the age field correctly (hello, my name is Alice , my age is 50).

 

4.2 What exactly is my JobExecutionContext?

Continuing from the code above, let's add a couple of statements to execute (shown at the end of this section) and set a breakpoint on them.

 

Let's step through the org.quartz.core.JobRunShell#initialize method. Here, based on the Scheduler, a TriggerFiredBundle instance is obtained from the JobStore (in this case RAMJobStore), and together with the freshly instantiated Job it is used to create a JobExecutionContext.

 

 

So, when we want to detect which of the original data has been overwritten, we can use statements like the following:

        @Override
        public void execute(JobExecutionContext context) throws JobExecutionException {
            // requires org.quartz.JobDataMap, java.util.List, java.util.Map and
            // java.util.stream.Collectors imports in the enclosing class
            JobDataMap map = context.getJobDetail().getJobDataMap();
            JobDataMap mapMerged = context.getMergedJobDataMap();
            // keys present in both the JobDetail map and the merged map, i.e. entries whose
            // values may have been overwritten by the Trigger; inspect this list at the breakpoint
            List<Map.Entry<String, Object>> duplicates = mapMerged.entrySet().stream()
                    .filter(en -> map.getWrappedMap().containsKey(en.getKey()))
                    .collect(Collectors.toList());
            System.out.printf("hello, my name is %s , my age is %d \n", name, age);
        }

 

 

5 Pitfalls in Data Transfer

Type safety: when invoking the setter of the corresponding field, Quartz guarantees type safety by checking the class of the data.

Primitive parameter error: when invoking the setter of the corresponding field, Quartz also checks whether the parameter type of the setter is a primitive type; if it is, an error will be raised, so wrapper types are the safer choice for injected fields.

Data overwriting: JobDataMap is essentially backed by a HashMap, so a value put in later overwrites the value previously stored under the same key.
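A tiny demonstration of that last point, using JobDataMap directly (assuming Quartz 2.x on the classpath; the class OverwriteDemo and the key name are just for this example):

import org.quartz.JobDataMap;

public class OverwriteDemo {
    public static void main(String[] args) {
        JobDataMap map = new JobDataMap();
        map.put("name", "Alex");
        map.put("name", "Alice");                    // same key: the later value replaces the earlier one

        System.out.println(map.getString("name"));   // prints Alice
    }
}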

 


Source: www.cnblogs.com/siweipancc/p/12596035.html