Data Flow Diagrams (DFD): A Detailed Introduction

Data Flow Diagram (DFD): a diagram that represents system requirements in terms of processes, external entities, data flows, and data stores
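A DFD is a graphical notation, not code, but it can help to see the four element types as a small data structure. The sketch below is purely illustrative; all names (`Kind`, `Element`, `Flow`, the order-taking example) are invented for this example and are not part of any DFD standard:

```python
from dataclasses import dataclass
from enum import Enum

class Kind(Enum):
    PROCESS = "process"            # transforms data
    EXTERNAL = "external entity"   # source or sink outside the system
    STORE = "data store"           # data at rest inside the system

@dataclass(frozen=True)
class Element:
    name: str
    kind: Kind

@dataclass(frozen=True)
class Flow:
    label: str       # name of the data travelling along this arrow
    source: Element
    target: Element

# A tiny order-handling example: one external entity, one process, one store.
customer = Element("Customer", Kind.EXTERNAL)
take_order = Element("Take order", Kind.PROCESS)
orders = Element("Orders", Kind.STORE)

flows = [
    Flow("order details", customer, take_order),
    Flow("new order record", take_order, orders),
    Flow("order confirmation", take_order, customer),
]
```

Note that every flow touches a process: data never moves directly between an external entity and a data store in a well-formed DFD.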

Features of DFD:

  • Few graphic elements and easy-to-understand symbols
  • Fully expresses the main requirements of a system: input, output, processing, and data storage
  • End users, administrators, and system developers need only a little training to understand a DFD, which facilitates communication

DFD symbol descriptions

[Figures: DFD symbol notation]
Data flow diagram example:
[Figure: example data flow diagram]

A DFD can describe high-level system processing at a high degree of generalization, and it can also describe low-level processing at more detailed levels of abstraction.
Decomposition: a modeling technique that breaks the system down into a progressively refined hierarchy of diagrams.
[Figures: hierarchical DFD decomposition]

Context diagram
Context diagram: a DFD that summarizes all of the system's processing in a single process symbol. It is
very useful for expressing the system boundary: the scope of the system is defined by the single process and the external entities. Data stores are not drawn in the context diagram because they are considered internal to the system. When a system responds to many events, it is often divided into subsystems, and a context diagram is created for each subsystem.
[Figure: context diagram example]
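The rule above (collapse every process into a single process symbol, keep the external entities, drop the data stores) can be sketched as a small transformation. The flow representation below, as tuples of `(source, source_kind, target, target_kind, label)`, is invented for this illustration:

```python
# Each flow: (source_name, source_kind, target_name, target_kind, label),
# where kind is "process", "external", or "store".
detailed_flows = [
    ("Customer", "external", "Take order", "process", "order details"),
    ("Take order", "process", "Orders", "store", "new order record"),
    ("Orders", "store", "Ship order", "process", "pending orders"),
    ("Ship order", "process", "Customer", "external", "shipment notice"),
]

def context_diagram(flows, system_name):
    """Collapse all processes into one symbol named system_name; keep only
    flows that cross the system boundary (touch an external entity).
    Flows to or from data stores are internal, so they are dropped."""
    result = []
    for src, src_kind, dst, dst_kind, label in flows:
        if src_kind == "external" and dst_kind == "process":
            result.append((src, system_name, label))
        elif src_kind == "process" and dst_kind == "external":
            result.append((system_name, dst, label))
        # store-related and process-to-process flows are internal: dropped
    return result

print(context_diagram(detailed_flows, "Order system"))
# only the two boundary flows remain: order details in, shipment notice out
```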

DFD fragment
DFD fragment: a DFD that uses a single process symbol to represent the system's response to one event

  • A DFD fragment shows the details of the interaction between the process, external entities, and internal data stores

  • Each DFD fragment displays only the data stores relevant to the event being responded to

  • One DFD fragment is created for each event in the event table
[Figure: DFD fragment example]

Data flow consistency

  • A "processing" and the "processing" should be consistent in the content of the data stream after being decomposed in detail
  • For a "processing", if there is data inflow, there must be corresponding data outflow
  • For a "processing", if there is data outflow, there must be corresponding data inflow

Black hole: a process or data store that receives input data but does not use it to produce output data
[Figure: black hole example]

Miracle: a process or data store that produces output without enough input data elements as its source
[Figure: miracle example]
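Both error types can be detected mechanically: a node that only receives data flows is a black hole, and a node that only emits them is a miracle. A minimal sketch, with the flow representation (`(source, target)` name pairs) invented for this illustration:

```python
def find_errors(nodes, flows):
    """nodes: names of the processes/data stores to check.
    flows: (source_name, target_name) pairs.
    Returns (black_holes, miracles)."""
    sources = {src for src, _ in flows}
    targets = {dst for _, dst in flows}
    # Black hole: receives input but never produces output.
    black_holes = [n for n in nodes if n in targets and n not in sources]
    # Miracle: produces output without receiving any input.
    miracles = [n for n in nodes if n in sources and n not in targets]
    return black_holes, miracles

nodes = ["Validate order", "Archive"]
flows = [
    ("Customer", "Validate order"),  # inflow only: black hole
    ("Archive", "Report writer"),    # outflow only: miracle
]
print(find_errors(nodes, flows))  # (['Validate order'], ['Archive'])
```

A real checker would also compare the individual data elements on each flow, since (as noted above) a node can be a black hole even when it has *some* output, if part of its input is never used.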

Summary of typical DFD errors
[Figure: summary of typical DFD errors]
DFD model:

  • Do not draw data stores in the context diagram
  • Data flows do not reflect processing order; they show how data moves through the system, so "processes" can work in parallel
  • A "process/data store" requires both input and output
    • If the input data flows are not completely used to generate the output data flows, it is called a black hole
    • If the output data flows do not completely depend on the input data flows, it is called a miracle

Origin blog.csdn.net/qq_41784749/article/details/112227057