Quartz Official Documentation Translation Series - Lesson 3

Original Address: http://www.quartz-scheduler.org/documentation/2.4.0-SNAPSHOT/tutorials/tutorial-lesson-03.html

Lesson 3: More About Jobs and Job Details

As you saw in Lesson 2, jobs are fairly simple to implement: the interface has only one method, execute. There are just a few more things you need to understand about the nature of jobs, about the execute(..) method of the Job interface, and about JobDetails.

While the job class you implement contains the code that knows how to do the actual work for a particular type of job, Quartz also needs to know about the various attributes you may wish a given job instance to have. This is done via the JobDetail class, which was briefly mentioned in Lesson 1.

JobDetail instances are built using the JobBuilder class. You will usually want to use a static import of all its methods, in order to give your code a DSL-like (domain-specific language) feel:


import static org.quartz.JobBuilder.*;

Let's take a moment now to discuss the nature of jobs and the life-cycle of job instances within Quartz. First let's look back at the snippet from Lesson 1:



  // define the job and tie it to our HelloJob class
  JobDetail job = newJob(HelloJob.class)
      .withIdentity("myJob", "group1") // name "myJob", group "group1"
      .build();

  // trigger the job to run now, and then repeat every 40 seconds
  Trigger trigger = newTrigger()
      .withIdentity("myTrigger", "group1")
      .startNow()
      .withSchedule(simpleSchedule()
          .withIntervalInSeconds(40)
          .repeatForever())
      .build();

  // tell quartz to schedule the job using our trigger
  sched.scheduleJob(job, trigger);

Now consider the job class HelloJob, defined like this:


public class HelloJob implements Job {

    public HelloJob() {
    }

    public void execute(JobExecutionContext context)
      throws JobExecutionException
    {
      System.err.println("Hello!  HelloJob is executing.");
    }
}

Notice that we give the scheduler a JobDetail instance, and that it knows the type of job to execute simply because we provided the job's class when building the JobDetail. Each and every time the scheduler executes the job, it creates a new instance of the class before calling its execute(..) method. When the execution is complete, the reference to the job class instance is dropped, and the instance is then garbage-collected. One ramification of this behavior is that jobs must have a no-argument constructor (when using the default JobFactory implementation). Another ramification is that it does not make sense to define state data fields on the job class - their values would not be preserved between job executions.

You may now want to ask: "How can I provide properties/configuration for a Job instance?" and "How can I keep track of a job's state between executions?" The answer to both questions is the same: the key is the JobDataMap, which is part of the JobDetail object.

JobDataMap

The JobDataMap can be used to hold any amount of (serializable) data objects that you wish to have made available to the job instance when it executes. JobDataMap is an implementation of the Java Map interface, with some added convenience methods for storing and retrieving data of primitive types.

Here is a quick snippet of putting data into the JobDataMap while defining/building the JobDetail, prior to adding the job to the scheduler:


  // define the job and tie it to our DumbJob class
  JobDetail job = newJob(DumbJob.class)
      .withIdentity("myJob", "group1") // name "myJob", group "group1"
      .usingJobData("jobSays", "Hello World!")
      .usingJobData("myFloatValue", 3.141f)
      .build();

Here is a quick example of getting data from the JobDataMap during the job's execution:


public class DumbJob implements Job {

    public DumbJob() {
    }

    public void execute(JobExecutionContext context)
      throws JobExecutionException
    {
      JobKey key = context.getJobDetail().getKey();

      JobDataMap dataMap = context.getJobDetail().getJobDataMap();

      String jobSays = dataMap.getString("jobSays");
      float myFloatValue = dataMap.getFloat("myFloatValue");

      System.err.println("Instance " + key + " of DumbJob says: " + jobSays + ", and val is: " + myFloatValue);
    }
}

If you use a persistent JobStore (discussed in the JobStore section of this tutorial) you should use some care in deciding what you place in the JobDataMap, because the objects in it will be serialized, and they are therefore prone to class-versioning problems. Obviously standard Java types should be very safe, but beyond that, any time someone changes the definition of a class for which you have serialized instances, care has to be taken not to break compatibility. Optionally, you can put JDBC-JobStore and JobDataMap into a mode where only primitives and Strings are allowed to be stored in the map, thus eliminating the possibility of later serialization problems.

If you add setter methods to your job class that correspond to the names of keys in the JobDataMap (such as a setJobSays(String val) method for the data in the example above), then Quartz's default JobFactory implementation will automatically call those setters when the job is instantiated, thus preventing the need to explicitly get the values out of the map within your execute method.

Triggers can also have JobDataMaps associated with them. This can be useful in the case where you have a Job that is stored in the scheduler for regular/repeated use by multiple Triggers, yet with each independent triggering you want to supply the Job with different data inputs.
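As a sketch of how per-trigger data might look (the key name "triggerSays" here is illustrative, not from the lesson), TriggerBuilder offers the same usingJobData(..) methods as JobBuilder:

```java
import static org.quartz.TriggerBuilder.*;
import static org.quartz.SimpleScheduleBuilder.*;

import org.quartz.Trigger;

public class TriggerDataExample {

    public static Trigger buildTrigger() {
        // this trigger carries its own data entry; when the job executes,
        // the merged JobDataMap will contain "triggerSays" as well
        return newTrigger()
            .withIdentity("myTrigger", "group1")
            .usingJobData("triggerSays", "Hello from the trigger!")
            .startNow()
            .withSchedule(simpleSchedule()
                .withIntervalInSeconds(40)
                .repeatForever())
            .build();
    }
}
```

Several triggers built this way can point at the same JobDetail, each feeding it different values.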

The JobDataMap that is found on the JobExecutionContext during job execution serves our convenience best. It is a merge of the JobDataMap found on the JobDetail and the one found on the Trigger, with the values in the latter overriding any same-named values in the former.

Here's a quick example of getting data from the JobExecutionContext's merged JobDataMap during the job's execution:


public class DumbJob implements Job {

    public DumbJob() {
    }

    public void execute(JobExecutionContext context)
      throws JobExecutionException
    {
      JobKey key = context.getJobDetail().getKey();

      JobDataMap dataMap = context.getMergedJobDataMap();  // Note the difference from the previous example

      String jobSays = dataMap.getString("jobSays");
      float myFloatValue = dataMap.getFloat("myFloatValue");
      ArrayList state = (ArrayList)dataMap.get("myStateData");
      state.add(new Date());

      System.err.println("Instance " + key + " of DumbJob says: " + jobSays + ", and val is: " + myFloatValue);
    }
}

Or if you wish to rely on the JobFactory injecting the data map values into your class, it would instead look like this:


public class DumbJob implements Job {


    String jobSays;
    float myFloatValue;
    ArrayList state;

    public DumbJob() {
    }

    public void execute(JobExecutionContext context)
      throws JobExecutionException
    {
      JobKey key = context.getJobDetail().getKey();

      JobDataMap dataMap = context.getMergedJobDataMap();  // Note the difference from the previous example

      state.add(new Date());

      System.err.println("Instance " + key + " of DumbJob says: " + jobSays + ", and val is: " + myFloatValue);
    }

    public void setJobSays(String jobSays) {
      this.jobSays = jobSays;
    }

    public void setMyFloatValue(float myFloatValue) {
      this.myFloatValue = myFloatValue;
    }

    public void setState(ArrayList state) {
      this.state = state;
    }

}

You'll notice that the overall code of the class is longer, but that the code in the execute() method is cleaner. One could also argue that although the code is longer, it actually took less coding, if the programmer's IDE was used to auto-generate the setter methods, rather than hand-coding the individual calls to retrieve the values from the JobDataMap. The choice is yours.

Job "Instances"

Many users spend time being confused about what exactly constitutes a "job instance". We'll try to clear that up here and in the following section about job state and concurrency.

You can create a single job class, and store many "instance definitions" of it within the scheduler by creating multiple instances of JobDetails - each with its own set of properties and JobDataMap - and adding them all to the scheduler.

For example, you can create a class that implements the Job interface called "SalesReportJob". The job might be coded to expect parameters sent to it (via the JobDataMap) to specify the name of the sales person that the sales report should be based on. You may then create multiple definitions (JobDetails) of the job, such as "SalesReportForJoe" and "SalesReportForMike", with "joe" and "mike" specified in the corresponding JobDataMaps as input to the respective jobs.
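A minimal sketch of that idea (the class and job names follow the example in the text; the JobDataMap key "salesPersonName" is an assumption):

```java
import static org.quartz.JobBuilder.*;

import org.quartz.Job;
import org.quartz.JobDetail;
import org.quartz.JobExecutionContext;
import org.quartz.JobExecutionException;

public class SalesReportExample {

    // one job class...
    public static class SalesReportJob implements Job {
        public void execute(JobExecutionContext context) throws JobExecutionException {
            String name = context.getMergedJobDataMap().getString("salesPersonName");
            System.out.println("Generating sales report for: " + name);
        }
    }

    // ...many instance definitions, each with its own JobDataMap
    public static JobDetail forSalesPerson(String jobName, String salesPersonName) {
        return newJob(SalesReportJob.class)
            .withIdentity(jobName, "reports")
            .usingJobData("salesPersonName", salesPersonName)
            .build();
    }
}
```

Calling forSalesPerson("SalesReportForJoe", "joe") and forSalesPerson("SalesReportForMike", "mike") yields two distinct job definitions backed by the same job class.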

When a trigger fires, the JobDetail (instance definition) it is associated with is loaded, and the job class it refers to is instantiated via the JobFactory configured on the scheduler. The default JobFactory simply calls newInstance() on the job class, then attempts to call setter methods on the class that match the names of keys within the JobDataMap. You may want to create your own implementation of JobFactory to accomplish things such as having your application's IoC or DI container produce/initialize the job instance.
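As a rough sketch (not part of the lesson itself), a custom factory implements org.quartz.spi.JobFactory; a DI-aware variant would delegate the instantiation to the container instead of calling newInstance() directly:

```java
import org.quartz.Job;
import org.quartz.Scheduler;
import org.quartz.SchedulerException;
import org.quartz.spi.JobFactory;
import org.quartz.spi.TriggerFiredBundle;

public class SimpleDiJobFactory implements JobFactory {

    public Job newJob(TriggerFiredBundle bundle, Scheduler scheduler)
            throws SchedulerException {
        Class<? extends Job> jobClass = bundle.getJobDetail().getJobClass();
        try {
            // a real DI-aware factory would ask the container for the
            // instance here, e.g. container.getBean(jobClass)
            return jobClass.newInstance();
        } catch (Exception e) {
            throw new SchedulerException("Could not instantiate " + jobClass, e);
        }
    }
}
```

It would be installed with scheduler.setJobFactory(new SimpleDiJobFactory()).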

In "Quartz speak", we refer to each stored JobDetail as a "job definition" or "JobDetail instance", and we refer to each executing job as a "job instance" or "instance of a job definition". Usually if we just use the word "job", we mean a named definition, i.e. a JobDetail. When we are referring to the class implementing the Job interface, we usually use the term "job class".

Job State and Concurrency

Now, some additional notes about a job's state data (a.k.a. the JobDataMap) and concurrency. There is a pair of annotations that can be added to your job class that affect Quartz's behavior with respect to these aspects.

@DisallowConcurrentExecution is an annotation that can be added to the job class to tell Quartz not to execute multiple instances of a given job definition (the job class in question) concurrently. Notice the wording there, as it was chosen very carefully. In the example from the previous section, if "SalesReportJob" carried this annotation, only one instance of "SalesReportForJoe" could execute at a given time, but it could execute concurrently with an instance of "SalesReportForMike". The constraint is based upon an instance definition (JobDetail), not on instances of the job class. However, it was decided (during the design of Quartz) to have the annotation carried on the class itself, because it does often make a difference to how the class is coded.

@PersistJobDataAfterExecution is an annotation that can be added to the job class to tell Quartz to update the stored copy of the JobDetail's JobDataMap after the execute() method completes successfully (without throwing an exception), so that the next execution of the same job (JobDetail) receives the updated values rather than the originally stored values. Like the @DisallowConcurrentExecution annotation, this applies to a job definition instance, not an instance of the job class, though it was decided to have the job class carry the attribute because it does often make a difference to how the class is coded (e.g. the "statefulness" will need to be explicitly "understood" by the code within the execute method).

If you use the @PersistJobDataAfterExecution annotation, you should strongly consider also using the @DisallowConcurrentExecution annotation, in order to avoid possible confusion (race conditions) over what data was left stored when two instances of the same job (JobDetail) execute concurrently.
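Put together, a stateful, serially-executing job class might be sketched like this (the "count" key is an illustrative assumption, not from the lesson):

```java
import org.quartz.DisallowConcurrentExecution;
import org.quartz.Job;
import org.quartz.JobDataMap;
import org.quartz.JobExecutionContext;
import org.quartz.JobExecutionException;
import org.quartz.PersistJobDataAfterExecution;

@PersistJobDataAfterExecution
@DisallowConcurrentExecution   // paired to avoid races on the stored map
public class CountingJob implements Job {

    public void execute(JobExecutionContext context) throws JobExecutionException {
        // read the count stored by the previous execution (0 on the first run),
        // bump it, and write it back; because of @PersistJobDataAfterExecution
        // the updated value is what the next execution will see
        JobDataMap map = context.getJobDetail().getJobDataMap();
        int count = map.containsKey("count") ? map.getInt("count") : 0;
        map.put("count", count + 1);
        System.out.println("Execution number: " + (count + 1));
    }
}
```

Note that the statefulness is explicit in the code: the job knowingly mutates the JobDetail's map rather than the merged one.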

Other Attributes of Jobs

Here's a quick summary of the other properties that can be defined for a job instance via the JobDetail object:

  • Durability - if a job is non-durable, it is automatically deleted from the scheduler once there are no longer any active triggers associated with it. In other words, a non-durable job has a life span bounded by the existence of its triggers.
  • Recoverability - if a job "requests recovery", and it is executing at the time of a "hard shutdown" of the scheduler (i.e. the process it is running within crashes, or the machine is shut off), then it is re-executed when the scheduler is started again. In this case, the JobExecutionContext.isRecovering() method will return true.
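Both properties are set on the JobDetail when it is built; a brief sketch (the job name and the nested HelloJob are illustrative):

```java
import static org.quartz.JobBuilder.*;

import org.quartz.Job;
import org.quartz.JobDetail;
import org.quartz.JobExecutionContext;
import org.quartz.JobExecutionException;

public class DurableJobExample {

    public static class HelloJob implements Job {
        public void execute(JobExecutionContext context) throws JobExecutionException {
            System.out.println("Hello!");
        }
    }

    public static JobDetail build() {
        return newJob(HelloJob.class)
            .withIdentity("durableJob", "group1")
            .storeDurably()      // survive even when no triggers point at the job
            .requestRecovery()   // re-execute after a hard shutdown mid-run
            .build();
    }
}
```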

JobExecutionException

Finally, we need to inform you of a few details of the Job.execute(..) method. The only type of exception (including RuntimeExceptions) that you are allowed to throw from the execute method is the JobExecutionException. Because of this, you should generally wrap the entire contents of the execute method in a "try-catch" block. You should also spend some time looking at the documentation for the JobExecutionException, as your job can use it to provide the scheduler various directives as to how you want the exception to be handled.
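A sketch of that pattern follows; setRefireImmediately(..) is one of the directives the exception supports, and doWork() is a hypothetical placeholder for the job's real logic:

```java
import org.quartz.Job;
import org.quartz.JobExecutionContext;
import org.quartz.JobExecutionException;

public class CarefulJob implements Job {

    public void execute(JobExecutionContext context) throws JobExecutionException {
        try {
            // wrap all of the real work so that nothing but
            // JobExecutionException can escape the method
            doWork();
        } catch (Exception e) {
            JobExecutionException jee = new JobExecutionException(e);
            // one of several available directives: ask the scheduler to
            // fire the job again right away (use with care)
            jee.setRefireImmediately(true);
            throw jee;
        }
    }

    private void doWork() throws Exception {
        // placeholder for the job's actual logic
    }
}
```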


Origin juejin.im/post/5de0dd2d5188254cb43db753