On the Agent delivery mechanism of the OSSIM sensor

In the OSSIM sensor, the conversion of communication protocols and data formats between the agent and the OSSIM server is achieved by the GET framework. Let's take a brief look at the ossim-agent script:
#!/usr/bin/python -OOt
import sys
sys.path.append('/usr/share/ossim-agent/')
sys.path.append('/usr/local/share/ossim-agent/')
from ossim_agent.Agent import Agent
agent = Agent()
agent.main()
It is the GET framework that delivers the data to the OSSIM server. Two tightly integrated operations are needed to achieve this: "mapping" the collected data into OSSIM-compatible events, and "transporting" those events to the OSSIM server. Within the GET framework these two operations are handled by the EventHandler and the Sender Agent components, as shown in Figure 1.
Figure 1: The GET framework inside OSSIM

The main task of the EventHandler is to map the events collected by the data-source plugins into the standardized format used by the OSSIM SIEM instance. To do this, the original message is transformed from the raw log into a set of normalized data fields; in the figure above these mechanisms are labeled "Normalization" and "OSSIM message". Part of the log normalization code is shown below:
from Logger import Logger
from time import mktime, strptime
logger = Logger.logger

class Event:
    EVENT_TYPE = 'event'
    EVENT_ATTRS = [
        "type",
        "date",
        "sensor",
        "interface",
        "plugin_id",
        "plugin_sid",
        "priority",
        "protocol",
        "src_ip",
        "src_port",
        "dst_ip",
        "dst_port",
        "username",
        "password",
        "filename",
        "userdata1",
        "userdata2",
        "userdata3",
        "userdata4",
        "userdata5",
        "userdata6",
        "userdata7",
        "userdata8",
        "userdata9",
        "occurrences",
        "log",
        "data",
        "snort_sid", # snort specific
        "snort_cid", # snort specific
        "fdate",
        "tzone"
    ]

    def __init__(self):
        self.event = {}
        self.event["event_type"] = self.EVENT_TYPE

    def __setitem__(self, key, value):
        if key in self.EVENT_ATTRS:
            self.event[key] = self.sanitize_value(value)
            if key == "date":
                # Keep the date in seconds and fdate as the original string
                self.event["fdate"] = self.event[key]
                try:
                    self.event["date"] = int(mktime(strptime(self.event[key],
                                                             "%Y-%m-%d %H:%M:%S")))
                except:
                    logger.warning("There was an error parsing date (%s)" %\
                        (self.event[key]))
        elif key != 'event_type':
            logger.warning("Bad event attribute: %s" % (key))

    def __getitem__(self, key):
        return self.event.get(key, None)

    # Event representation
    def __repr__(self):
        event = self.EVENT_TYPE
        for attr in self.EVENT_ATTRS:
            if self[attr]:
                event += ' %s="%s"' % (attr, self[attr])
        return event + "\n"

    # Return the internal hash (dictionary)
    def dict(self):
        return self.event

    def sanitize_value(self, string):
        return str(string).strip().replace("\"", "\\\"").replace("'", "")

class EventOS(Event):
    EVENT_TYPE = 'host-os-event'
    EVENT_ATTRS = [
        "host",
        "os",
        "sensor",
        "interface",
        "date",
        "plugin_id",
        "plugin_sid",
        "occurrences",
        "log",
        "fdate",
    ]

class EventMac(Event):
    EVENT_TYPE = 'host-mac-event'
    EVENT_ATTRS = [
        "host",
        "mac",
        "vendor",
        "sensor",
        "interface",
        "date",
        "plugin_id",
        "plugin_sid",
        "occurrences",
        "log",
        "fdate",
    ]

class EventService(Event):
    EVENT_TYPE = 'host-service-event'
    EVENT_ATTRS = [
        "host",
        "sensor",
        "interface",
        "port",
        "protocol",
        "service",
        "application",
        "date",
        "plugin_id",
        "plugin_sid",
        "occurrences",
        "log",
        "fdate",
    ]

class EventHids(Event):
    EVENT_TYPE = 'host-ids-event'
    EVENT_ATTRS = [
        "host",
        "hostname",
        "hids_event_type",
        "target",
        "what",
        "extra_data",
        "sensor",
        "date",
        "plugin_id",
        "plugin_sid",
        "username",
        "password",
        "filename",
        "userdata1",
        "userdata2",
        "userdata3",
        "userdata4",
        "userdata5",
        "userdata6",
        "userdata7",
        "userdata8",
        "userdata9",
        "occurrences",
        "log",
        "fdate",
    ]

class WatchRule(Event):
    EVENT_TYPE = 'event'
    EVENT_ATTRS = [
        "type",
        "date",
        "fdate",
        "sensor",
        "interface",
        "src_ip",
        "dst_ip",
        "protocol",
        "plugin_id",
        "plugin_sid",
        "condition",
        "value",
        "port_from",
        "src_port",
        "port_to",
        "dst_port",
        "interval",
        "from",
        "to",
        "absolute",
        "log",
        "userdata1",
        "userdata2",
        "userdata3",
        "userdata4",
        "userdata5",
        "userdata6",
        "userdata7",
        "userdata8",
        "userdata9",
        "filename",
        "username",
    ]

class Snort(Event):
    EVENT_TYPE = 'snort-event'
    EVENT_ATTRS = [
        "sensor",
        "interface",
        "gzipdata",
        "unziplen",
        "event_type",
        "plugin_id",
        "type",
        "occurrences"
    ]
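
To make the mapping concrete, here is a minimal usage sketch (not part of the ossim-agent source; the field values are invented for illustration) showing how a plugin fills in an Event and what the serialized line sent to the server looks like:

# Hypothetical usage of the Event class above (Python 2 style, like the agent code)
from Event import Event

e = Event()
e["date"] = "2019-08-20 10:15:32"   # stored as epoch seconds; fdate keeps the string
e["sensor"] = "192.168.11.4"
e["interface"] = "eth0"
e["plugin_id"] = "4003"             # identifies the log source, e.g. an SSH plugin
e["plugin_sid"] = "1"               # identifies the event subtype within that source
e["src_ip"] = "192.168.11.160"
e["dst_ip"] = "192.168.11.4"
e["log"] = "Failed password for root from 192.168.11.160 port 40022 ssh2"

# __repr__ produces the key="value" line that the agent sends to the server, e.g.
# event date="..." sensor="192.168.11.4" interface="eth0" plugin_id="4003" plugin_sid="1" ...
print(str(e))
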
The log encoding code:
import threading, time
from Logger import Logger
logger = Logger.logger
from Output import Output
import Config
import Event
from Threshold import EventConsolidation
from Stats import Stats
from ConnPro import ServerConnPro
class Detector(threading.Thread):
    def __init__(self, conf, plugin, conn):
        self._conf = conf
        self._plugin = plugin
        self.os_hash = {}
        self.conn = conn
        self.consolidation = EventConsolidation(self._conf)
        logger.info("Starting detector %s (%s).." % \
                    (self._plugin.get("config", "name"),
                     self._plugin.get("config", "plugin_id")))
        threading.Thread.__init__(self)
    def _event_os_cached(self, event):
        if isinstance(event, Event.EventOS):
            import string
            current_os = string.join(string.split(event["os"]), ' ')
            previous_os = self.os_hash.get(event["host"], '')
            if current_os == previous_os:
                return True
            else:
                # Fallthrough and add to cache
                self.os_hash[event["host"]] = \
                    string.join(string.split(event["os"]), ' ')
        return False

    def _exclude_event(self, event):
        if self._plugin.has_option("config", "exclude_sids"):
            exclude_sids = self._plugin.get("config", "exclude_sids")
            if event["plugin_sid"] in Config.split_sids(exclude_sids):
                logger.debug("Excluding event with " +\
                    "plugin_id=%s and plugin_sid=%s" %\
                    (event["plugin_id"], event["plugin_sid"]))
                return True
        return False

    def _thresholding(self):
        self.consolidation.process()

    def _plugin_defaults(self, event):
        # Fetch default values from the configuration file
        if self._conf.has_section("plugin-defaults"):
            # 1) date
            default_date_format = self._conf.get("plugin-defaults",
                                                 "date_format")
            if event["date"] is None and default_date_format and \
               'date' in event.EVENT_ATTRS:
                event["date"] = time.strftime(default_date_format,
                                              time.localtime(time.time()))
            # 2) sensor
            default_sensor = self._conf.get("plugin-defaults", "sensor")
            if event["sensor"] is None and default_sensor and \
               'sensor' in event.EVENT_ATTRS:
                event["sensor"] = default_sensor
            # 3) network interface
            default_iface = self._conf.get("plugin-defaults", "interface")
            if event["interface"] is None and default_iface and \
               'interface' in event.EVENT_ATTRS:
                event["interface"] = default_iface
            # 4) source IP
            if event["src_ip"] is None and 'src_ip' in event.EVENT_ATTRS:
                event["src_ip"] = event["sensor"]
            # 5) time zone
            default_tzone = self._conf.get("plugin-defaults", "tzone")
            if event["tzone"] is None and 'tzone' in event.EVENT_ATTRS:
                event["tzone"] = default_tzone
            # 6) sensor, source IP and destination IP must not be localhost
            if event["sensor"] in ('127.0.0.1', '127.0.1.1'):
                event["sensor"] = default_sensor
            if event["dst_ip"] in ('127.0.0.1', '127.0.1.1'):
                event["dst_ip"] = default_sensor
            if event["src_ip"] in ('127.0.0.1', '127.0.1.1'):
                event["src_ip"] = default_sensor
        # Detect the type of the log
        if event["type"] is None and 'type' in event.EVENT_ATTRS:
            event["type"] = 'detector'
        return event
    def send_message(self, event):
        if self._event_os_cached(event):
            return

        if self._exclude_event(event):
            return

        # Use default values for any empty attributes
        event = self._plugin_defaults(event)

        # Check the connection before consolidation
        if self.conn is not None:
            try:
                self.conn.send(str(event))
            except:
                id = self._plugin.get("config", "plugin_id")
                c = ServerConnPro(self._conf, id)
                self.conn = c.connect(0, 10)
                try:
                    self.conn.send(str(event))
                except:
                    return
            logger.info(str(event).rstrip())
        elif not self.consolidation.insert(event):
            Output.event(event)
        Stats.new_event(event)

    def stop(self):
        #self.consolidation.clear()
        pass

    # Overridden in subclasses
    def process(self):
        pass

    def run(self):
        self.process()
class ParserSocket(Detector):
    def process(self):
        pass  # socket-specific processing, omitted in this excerpt

class ParserDatabase(Detector):
    def process(self):
        pass  # database-specific processing, omitted in this excerpt
… …
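
The process() method above is meant to be overridden: each concrete parser (socket, database, log file, and so on) supplies its own collection loop. As a purely illustrative sketch, assuming a tail-style plugin that reads a log file (the class name and logic below are invented, not taken from ossim-agent), an override could look like this:

import time
import Event

class ParserTailExample(Detector):
    # Hypothetical detector that follows a log file and emits one event per new line
    def process(self):
        location = self._plugin.get("config", "location")
        f = open(location, "r")
        f.seek(0, 2)                        # start at the end of the file, like tail -f
        while True:
            line = f.readline()
            if not line:
                time.sleep(1)               # wait for new data
                continue
            event = Event.Event()
            event["plugin_id"] = self._plugin.get("config", "plugin_id")
            event["plugin_sid"] = "1"       # normally derived from a matching rule
            event["log"] = line.strip()
            self.send_message(event)        # defaults are filled in and the event is sent
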

As the code above shows, normalization on the sensor is mainly a matter of re-encoding each data field of the raw log so that a new event can be generated and transmitted to the OSSIM server. To this end the GET framework includes a number of helper functions for converting the required fields, for example Base64 encoding. When generating the "OSSIM message" from the original event, GET is also responsible for filling in fields that are missing. The plugin_id and plugin_sid mentioned above indicate the source (type) and subtype of the log message, and they are required fields for generating a SIEM event. To keep the event format complete, when the source or destination IP cannot be determined the system fills the field with the default value 0.0.0.0.
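
As an illustration of this field completion (a simplified, hypothetical helper; these are not the framework's actual function names), the idea is roughly:

import base64

def complete_required_fields(event):
    # Hypothetical sketch of the completion step described above
    if event["src_ip"] is None:
        event["src_ip"] = "0.0.0.0"     # unknown source IP gets the default value
    if event["dst_ip"] is None:
        event["dst_ip"] = "0.0.0.0"     # unknown destination IP gets the default value
    if event["log"] is not None:
        # payloads that are not plain text are carried Base64-encoded
        event["data"] = base64.b64encode(event["log"])
    return event
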

Note: we can use phpMyAdmin to inspect the required fields in the OSSIM MySQL database.
The Sender Agent is responsible for the following two tasks:
1) Sending the events collected and formatted by GET to the OSSIM server. For this task, the messages created by the EventHandler are placed on a message queue and handed to the messaging middleware; the sequence diagram in Figure 2 shows this flow.
Figure 2: Sequence diagram of the conversion from detector logs to OSSIM security-server events

2) Maintaining communication with the OSSIM management server. The communication between the GET framework and the server is implemented over TCP port 40001 with a two-way handshake. Normalizing the raw log is an important part of this process; during normalization OSSIM also retains the original log for archiving, which provides a way to extract the original log from the normalized event.
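
As a rough sketch of the transport side (greatly simplified; the real Sender thread and ServerConnPro also handle the handshake, reconnection, and queuing), delivering an event amounts to writing the serialized line to the server's TCP port 40001:

import socket

def send_event(server_ip, event, port=40001):
    # event is the key="value" line produced by Event.__repr__
    s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    s.connect((server_ip, port))
    s.send(str(event))                  # Python 2 style send, matching the agent code
    s.close()
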
After normalization, the events are stored in a MySQL database, as shown in Figure 3. The correlation engine then analyzes them according to rule parameters such as priority, reliability, and risk value, and raises the corresponding alarms.
Figure 3: Log storage mechanism
Next, let's look at an example. Below are excerpts of raw Apache, Cisco ASA, and SSH logs, shown in Figures 4, 5, and 6.
Figure 4: Apache raw log
Figure 5: Cisco ASA raw log
Figure 6: SSH raw log
After OSSIM normalizes these logs, the Web front end presents them in an easy-to-read format. How to compare a raw log with the event returned after processing is explained in the book "Open Source Security Operation and Maintenance Platform OSSIM: A Beginner's Guide".
Figure 7: Apache access log after processing

In the example shown in Figure 7, only userdata1 and userdata2 are used; the userdata3 through userdata9 extension fields are left empty and are mainly reserved for other devices or services. The target address is marked as an IP address, for example Host 192.168.11.160. This normalization actually takes place after events are collected and stored by the system, and before the data is correlated and analyzed; during acquisition the SIEM tool converts the data into a human-readable format, and the formatted data is much easier to understand.

Origin: blog.51cto.com/chenguang/2439193