Kafka Worker documentation

Enter the directory where kafkaWorker is located; unless otherwise specified below, relative paths are given relative to this directory.

• Configuration

Create a subdirectory named demo in the conf directory. (In actual use, the naming convention is determined by the business.)


At least three files must be configured in demo. There are no special requirements for file naming, because the paths are passed as parameters when running kafkaWorker; only the file formats are fixed.
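For this walkthrough the layout looks like this (the file names match the ones used in the rest of this document):

kafkaWorker/
└── conf/
    └── demo/
        ├── config.json
        ├── conf.ini
        └── template.ini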

  1. config.json: ZooKeeper information. This file follows the json file format used in courseh5, pk, and gold; its variables are the kafka topic, the storage path in zk, and the zk hosts. (A hypothetical sketch follows this list.)
  2. conf.ini: configuration items required for kafkaWorker to run; reference: the business application framework.
  3. template.ini: the template definition kafkaWorker uses to process messages. This file is special: its full path is configured in conf.ini -> DEFAULT -> tpl, and kafkaWorker reads this variable at runtime to obtain the template and compile it.
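A minimal sketch of what config.json might contain. The field names here are hypothetical (the authoritative format is the json files in courseh5, pk, and gold); it carries only the three variables listed above, and the host value reuses the zookeeper address seen in the startup log at the end of this document:

{
    "topic": "demo_topic",
    "zk_path": "/kafkaWorker/demo",
    "zk_hosts": ["10.17.91.13:2181"]
}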

Here we focus on the template.ini template file (template parsing uses Go's native text/template package).

[demo]
-={{ $arg := . }}{{ comsumKafkaDemo $arg }}
-=@NONE

Element introduction:

  1. [demo]: the command name. When calling kafkaProxy, this identifier is passed in and sent to kafkaWorker together with the data, e.g.:
package demo
 
import (
    "bytes"

    "git.xesv5.com/golib/gotools/utils/kafkautil"
    "git.xesv5.com/golib/logger"
    "github.com/tinylib/msgp/msgp"

    "streamCalculator/service/demo/proto"
)
  
// sendKafka prefixes the payload with the command name (key) so that
// kafkaWorker can match the message against the [key] section of template.ini.
func sendKafka(topic, key string, data []byte) error {
   if topic == "" || key == "" {
      return logger.NewError("kafka topic:%s, key:%s exception", topic, key)
   }
   // getByte/freeByte are byte-pool helpers defined elsewhere in this package.
   poolByte := getByte()
   dataBuf := bytes.NewBuffer(poolByte)
   dataBuf.WriteString(key)
   dataBuf.WriteString(" ")
   dataBuf.Write(data)
   if err := kafkautil.Send2Proxy(topic, dataBuf.Bytes()); err != nil {
      freeByte(poolByte)
      return err
   }
   freeByte(poolByte)
   return nil
}
 
func main() {
   logs := proto.SubmitStuLogs{
      StuId:  123,
      PlanId: 456,
      TestId: 789,
      Info:   "Hello World!",
   }
 
   var recordBuf bytes.Buffer
   if err := msgp.Encode(&recordBuf, &logs); err != nil {
      logger.E("submitStuLogsAction", "msgp.encode err:%v", err)
   }
   // Send an asynchronous message to kafkaProxy; demo_topic is the kafka
   // topic from the business configuration.
   sendKafka(demo_topic, "demo", recordBuf.Bytes())
}
  1. "-=": It is the definition format of the default key of the .ini file, indicating that when parsing the configuration file, the key of each line is a natural sequence: "#1", "#2",... , starting from 1.

  2. **"{ { KaTeX parse error: Expected 'EOF', got '}' at position 10: arg := . }̲}" :** This is a variable placeholder, args is a temporary variable defined, which can be used in this The template is referenced by other placeholders. The "." after the equal sign represents the object passed when the template is replaced, such as: tmpl.Execute(out, data) and "." represents the object data.

In kafkaWorker. Specifically refers to the sequence of bytes: dataBuf.Write(data) (above)

  4. Command boundary markers: @NONE means the pure variables are passed in and executed; @CURL, @RET, and @END are used together in pairs to construct an http request; by default, the body is parsed as a SQL statement.

  5. **Custom function "{{ comsumKafkaDemo $arg }}"**: comsumKafkaDemo is the custom template function registered in kafkaWorker (see "Template function customization" below), and $arg is the variable it receives. A standalone sketch of these mechanics follows this list.
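To make the placeholder and custom-function mechanics concrete, here is a minimal, self-contained sketch using Go's native text/template package, independent of kafkaWorker. The function name consume and the payload string are illustrative only; kafkaWorker's actual registration goes through funcsMap, shown later.

package main

import (
    "os"
    "text/template"
)

func main() {
    // Register a custom function, analogous to comsumKafkaDemo in kafkaWorker.
    funcs := template.FuncMap{
        "consume": func(v interface{}) (string, error) {
            return "consumed: " + v.(string), nil
        },
    }
    // "." is the object passed to Execute; $arg captures it for later reuse.
    tmpl := template.Must(template.New("demo").Funcs(funcs).Parse(
        `{{ $arg := . }}{{ consume $arg }}`))
    if err := tmpl.Execute(os.Stdout, "payload"); err != nil {
        panic(err)
    }
    // Output: consumed: payload
}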

• Template function customization

First of all, kafkaWorker already provides some common template functions for us; refer to this package for their definitions: github.com/Masterminds/sprig.
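For illustration, sprig's function map can be attached to a text/template just like the custom functions above. This is a standalone sketch using sprig's upper helper, not kafkaWorker's actual wiring:

package main

import (
    "os"
    "text/template"

    "github.com/Masterminds/sprig"
)

func main() {
    // sprig.TxtFuncMap exposes sprig's helpers (upper, trim, default, ...).
    tmpl := template.Must(template.New("t").
        Funcs(sprig.TxtFuncMap()).
        Parse(`{{ . | upper }}`))
    if err := tmpl.Execute(os.Stdout, "hello"); err != nil {
        panic(err)
    }
    // Output: HELLO
}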

Enter the src/template directory and create a new Go file:

// +build newstudent

package template
 
import (
    "bytes"

    "git.xesv5.com/golib/logger"
    "github.com/tinylib/msgp/msgp"
 
    "git.xesv5.com/service/streamCalculator/service/demo"
    "git.xesv5.com/service/streamCalculator/service/demo/proto"
)
 
var demoReply = proto.Reply{}
 
func init() {
    // Register the template handler; it corresponds one-to-one with the
    // function named in template.ini. The number and types of its parameters
    // and return values are arbitrary.
    // kafkaWorker should only be used to consume messages; business logic is
    // best written in other services and imported as a package.
    funcsMap["comsumKafkaDemo"] = func(v interface{}) (ret string, err error) {
        // toByteSlice and createCtxFromMsg are helpers provided by this package.
        logs := proto.SubmitStuLogs{}
        if err = msgp.Decode(bytes.NewBuffer(toByteSlice(v, nil)), &logs); err != nil {
            return "", err
        }
 
        logger.I("comsumKafkaDemo", logs)
 
        demoSev := &demo.DemoService{}
        demoSev.SaveToRedis(createCtxFromMsg(logs), &logs, &demoReply)
 
        return "OK", nil
    }
}

At this point, a kafkaWorker is basically configured; the next step is to write the specific business logic in your own project.

• Serialization

Data is transmitted between kafkaWorker and kafkaProxy as raw bytes, so the structs must be encoded; msgp is a simple and easy-to-use option.

After defining the structs, add the directive //go:generate msgp at the top of each file whose structs need encoding (the structs may be split across multiple files, with the directive in each).

//go:generate msgp
package proto

type Reply struct {
    Code int64
    Msg  string
    Data []interface{}
}

type SubmitStuLogs struct {
    StuId  int64  `json:"stu_id"`
    PlanId int64  `json:"plan_id"`
    TestId int64  `json:"test_id"`
    Info   string `json:"info"`
}

Install msgp: go get -u -t github.com/tinylib/msgp

$ cd /directory/containing/the/structs && go generate

Afterwards, generated files appear in the directory containing the struct file (msgp typically writes *_gen.go files), and msgp has automatically implemented the encoding interfaces for us.

• Compile and install

$ make

After a long wait, the executable is installed in the ./bin directory

• Run

$ ./bin/kafkaWorker -h
Usage of ./bin/kafkaWorker:
  -c string
        config path (note: relative paths are resolved against the directory containing the executable, e.g. ...../kafkaWorker/bin in this example)
  -cfg string
        json config path (default "conf/config.json") (note: relative paths are resolved against the working directory, e.g. ...../kafkaWorker in this example)
  -f    foreground
  -m    foreground
  -mode int
        mode
  -p string
        config path prefix with no trailing backslash
  -s string
        start or stop
  -v    foreground
 
 
$ ./bin/kafkaWorker -c ../conf/demo/conf.ini -cfg conf/demo/config.json
  
The following output indicates a successful start:
....
Connected to 10.17.91.13:2181
Authenticated: id=245591183736176647, timeout=10000
Re-submitting `0` credentials after reconnect
