The Art of UNIX Programming

I- Context

  1. Philosophy
    Mechanism, not policy; a laissez-faire style of design that results in diversity. For example, Unix applications offer many behavioral options, which confuses and disorients non-technical users and has cost Unix many of them; but policy is relatively short-lived while mechanism endures, so separating the two preserves a great deal of flexibility.
    Fun is a sign of peak efficiency. Programmers and developers have the most fun when a task stretches their effort right to the limit of their abilities while still being achievable. Hacking on Unix is fun.
    The Unix philosophy is bottom-up and pragmatic: it puts experience first, encourages a skeptical attitude toward grand doctrine, and keeps a humorous, optimistic tone.
    Simplicity is the core of Unix-style programming. The Unix philosophy follows the KISS principle: Keep It Simple, Stupid!
    Prototype first, then polish. Before you optimize, make sure it works.
    2. History: a tale of two cultures
    This chapter reviews the history of Unix to explain how Unix culture reached its present state. That history has two strands: the origin and history of Unix itself, and the origin and history of the hacker culture.

The origin and history of UNIX
The ancestor of Unix was CTSS, the Compatible Time-Sharing System; its parent was the highly ambitious Multics project (an attempt to build a system with a great many features).
Genesis: 1969-1971, Unix is born at Bell Labs. Exodus: 1971-1980, the C language is added, bringing readability, portability, and ease of modification, and Unix wins its first successes.
TCP/IP and the Unix civil wars: 1980-1990. TCP/IP is introduced and gives Unix new vitality, but the rise of Microsoft and the breakup of AT&T drag Unix into purgatory. Commercialization destroys the free exchange of Unix source code, the very thing that had nourished the early vigor of the system.
Blows against the empire: 1991-1995. Linus Torvalds announces the Linux project, which relies on distributed development and the patch tool, adds a graphical interface and Internet access, realizes the dream of a cheap Unix for everyone, and reassembles the traditional elements of Unix. After 1995, the Unix story becomes the story of the open-source movement.

The origin and history of the hackers: 1961-1995
At play in the groves of academe: 1961-1980. The programmers of MIT's Artificial Intelligence Laboratory were probably the first to call themselves "hackers".
Internet fusion and the free software movement: 1981-1991. In 1983, TCP/IP was added to BSD, and Unix culture began to merge with ARPANET culture. RMS founded the GNU project, dedicated to building a complete free operating system, and coined the term "free software", making the hacker culture more self-aware.
Linux and the pragmatist reaction: 1991-1998. Linus Torvalds deftly straddled the battle between the GPL camp and the anti-GPL camp. He built his own kernel with the GNU toolkit and protected it with the "contagious" GPL. Torvalds made it clear that he thought free software was usually better, but he occasionally used proprietary software and refused to become a zealot.

The open-source movement: 1998 and afterward
The hacker community can be seen as a collection of tribes, each gathered around a code base it maintains, one or more charismatic leaders, a language or development tool, a particular software license, a technical standard, or the caretaking of some piece of infrastructure. After 1995, Linux played a special role: it is both the unifying platform for most of the community's software and the hackers' most recognized brand. The whole hacker culture began to rally around a common goal: pushing the Linux development model and the open-source market forward.
Another intention behind the term "open source" was to present the hacker community's methods to the outside world in a more market-friendly, less confrontational way.

Lessons of Unix history
The closer Unix stayed to open source, the more it prospered;
over-reliance on any one technology or business model is a mistake; instead, keeping the software and its design tradition flexible is what ensures survival;
compete with rival programs on low cost and flexibility;
true professionalism and dedication lie in what we were doing before we bowed to the worldly notion of "sound business practices".

  3. Contrasts: comparing the Unix philosophy with others
    Elements of operating-system style.
    The unifying idea of the operating system. For example, Unix's "everything is a file" model, and the pipe concept built on top of it (a short sketch after this list shows the idea in code).
    Multitasking capability. Unix has preemptive multitasking; multitasking and multi-user support are not the same thing.
    Cooperating processes: Unix interprocess communication (IPC) is very flexible.
    Internal boundaries: Unix trusts the programmer, but to keep a malicious user or a defective program from destroying other people's data it establishes internal boundaries.
    Entry barriers to development: Unix ships its scripting tools and compilers in the default installation, which supports a large population of developers across the whole culture.
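To make "everything is a file" concrete, here is a minimal sketch (my illustration, not from the book): the same open/read/close calls that work on ordinary files also work on device nodes such as /dev/urandom, which is what lets programs and pipelines treat files, devices, and similar objects uniformly. It assumes a Unix-like system where /dev/urandom exists.

```c
/* everything-is-a-file sketch: read a few random bytes from a device
 * node with the same calls used for ordinary files (POSIX; assumes
 * /dev/urandom exists, as on Linux and most modern Unixes). */
#include <fcntl.h>
#include <stdio.h>
#include <unistd.h>

int main(void)
{
    unsigned char buf[8];
    int fd = open("/dev/urandom", O_RDONLY);   /* a device, opened like a file */
    if (fd < 0) {
        perror("open");
        return 1;
    }
    ssize_t n = read(fd, buf, sizeof buf);     /* same read() as for a plain file */
    if (n < 0) {
        perror("read");
        close(fd);
        return 1;
    }
    for (ssize_t i = 0; i < n; i++)
        printf("%02x", buf[i]);
    putchar('\n');
    close(fd);
    return 0;
}
```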

Comparing operating systems
This section lists the advantages and disadvantages of several operating systems.

What seeds you sow decide what fruit you reap
Some competitors had fatal weaknesses: for example, lack of portability or poor support for networking.
Windows' flaws on the server side (largely security-related) gave Linux its major opening.
It is like building a house: repairing the superstructure on top of a solid foundation is far easier than replacing the foundation without disturbing the superstructure.

II- Design
4. Modularity: keeping it clean, keeping it simple.
The early Unix programmers were good at modularity because they had to be: without a sound modular architecture the operating system would have collapsed.
A well-encapsulated module does not expose much of its internals to the outside, does not call the implementation code of other modules directly, and does not casually share global data (see the sketch after this list).
Modules have an optimum size, but size alone does not make code high quality; we also have to consider compactness and orthogonality.
Compactness means the design fits inside a human brain. For example, if an experienced user can operate a design without the manual, the design is compact.
Orthogonality means operations have no side effects: each action changes exactly one thing and does not affect anything else.
Don't repeat yourself (the DRY rule).
Layered software: top-down versus bottom-up. When the top-down and bottom-up approaches collide, the application logic at the top and the domain primitives at the bottom have to be joined by an impedance-matching glue layer. Glue is a nasty thing; keep it as thin as possible.
OO languages make abstraction easy, perhaps too easy: piling up abstraction layers damages transparency, it becomes hard to see through the levels, and you cannot follow what the running code is actually doing.
Whether a single function is too big is not so much a matter of line count as of internal complexity (too many local variables, too much nested indentation, and so on).
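A minimal sketch of what "well encapsulated" can look like in C (a hypothetical counter module, not an example from the book): callers see only an opaque type and a handful of functions, so the representation can change later without touching any caller.

```c
/* counter.h -- the whole public interface: an opaque type plus four calls. */
#ifndef COUNTER_H
#define COUNTER_H

typedef struct counter counter;            /* representation hidden from callers */

counter *counter_new(void);
void     counter_bump(counter *c);
long     counter_value(const counter *c);
void     counter_free(counter *c);

#endif

/* counter.c -- the implementation; only this file knows the layout. */
#include <stdlib.h>

struct counter {
    long n;                                /* could become a richer structure later */
};

counter *counter_new(void)                 { return calloc(1, sizeof(counter)); }
void     counter_bump(counter *c)          { if (c) c->n++; }
long     counter_value(const counter *c)   { return c ? c->n : 0; }
void     counter_free(counter *c)          { free(c); }
```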

  5. Textuality: good protocols make good practice
    Important considerations when designing file formats and application protocols: interoperability, transparency, extensibility, and economy of storage and transactions.
    The only justifications for a binary format are processing very large data sets, or genuine concern about the time or instruction overhead of handling text.
    Data-file metaformats: there are several, such as DSV, RFC 822, cookie-jar, record-jar, XML, and Windows INI (a DSV-parsing sketch follows this list);
    if an application protocol is textual, a person can analyze a great deal of a session by eye, and the protocol is easy to change. SMTP, POP3, and IMAP are three classic text application protocols.
    Application-protocol metaformats: although network bandwidth is more expensive than storage, so economy still matters, the transparency and interoperability advantages of text formats are so significant that most designers choose the more readable text form, as in the HTTP, BEEP, and XML-RPC / SOAP / Jabber protocols.
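As a small illustration of a DSV metaformat (mine, not the book's code), here is a sketch that splits one /etc/passwd-style line on colons. strsep() is used because, unlike strtok(), it preserves empty fields; it is available on glibc and the BSDs, though it is not part of ISO C.

```c
/* DSV sketch: split a colon-separated record (passwd style) into fields. */
#include <stdio.h>
#include <string.h>

int main(void)
{
    char line[] = "daemon:x:1:1:daemon:/usr/sbin:/usr/sbin/nologin";
    char *rest = line;
    char *field;
    int i = 0;

    while ((field = strsep(&rest, ":")) != NULL)   /* empty fields come back as "" */
        printf("field %d: %s\n", i++, field);
    return 0;
}
```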

6. Transparency: let there be light
Chapter 5 discussed the importance of textual data formats and application protocols; textuality improves both transparency and discoverability.
If you can predict all or most of a program's behavior, the program is transparent.
If the program helps people build a mental model of "what it does and how it does it", the system is discoverable: for users, good documentation aids discoverability; for programmers, good naming conventions do.
The most effective way to get transparent code is simple: do not pile too many abstraction layers on top of the code that does the concrete work.
A transparent system is easier to recover when a bug strikes, and at the same time easier to understand and therefore easier to maintain.

  1. Multi-channel programming
    UNIX program most characteristic modular approach is to a large program into multiple cooperating processes.
    Multiple concurrent processes in addition to bringing the benefits of modularity addition, another reason for stronger security.
    UNIX IPC: The Task transferred to a special program (shell out), pipes / redirection (pipes main drawback is unidirectional, named pipe can be used as an adapter between the two), from the process, peer processes (temporary files, signal, sockets, shared memory).
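A minimal sketch of the "shell out to a specialist program" style of IPC (an illustration, not the book's code): popen() runs another program through the shell and hands the parent a one-way pipe to read its output, which is exactly the unidirectional-pipe limitation mentioned above.

```c
/* Shelling out via a pipe: read the output of "ls -l" line by line. */
#include <stdio.h>

int main(void)
{
    char line[512];
    FILE *p = popen("ls -l", "r");       /* child's stdout becomes our read end */
    if (p == NULL) {
        perror("popen");
        return 1;
    }
    while (fgets(line, sizeof line, p) != NULL)
        fputs(line, stdout);             /* a real program would parse, not just echo */
    return pclose(p) == -1;              /* reap the child and report failure */
}
```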

  8. Minilanguages: finding a notation that sings
    Unix has a long tradition of small, special-purpose languages for particular domains, which dramatically reduce the number of lines in a program: typesetting languages (troff, pic), shell utilities (awk, sed, dc, bc), and software development tools (make, yacc, lex, and so on). The boundary between minilanguages and scripting languages is blurry.
    A language in which every computable problem can be expressed is called Turing-complete.
    Understand when, and in which scenarios, a minilanguage is the right tool.
    Sometimes we need to design a minilanguage of our own. Keep it as simple as possible (watch its complexity), consider extending an existing language or embedding a scripting language rather than inventing one from scratch (this is often the right way to implement a command language), and be cautious with macros. A toy calculator sketch follows.
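To make the idea concrete, here is a toy sketch (mine, not the book's) of a dc-flavored minilanguage: a reverse-Polish calculator that reads numbers and the operators + and * from standard input. Even this tiny notation replaces a fair amount of ad hoc code.

```c
/* Toy RPN minilanguage: numbers push onto a stack, "+" and "*" pop two
 * operands and push the result, "p" prints the top of the stack.
 * Example: echo "2 3 4 * + p" | ./rpn   ->   14 */
#include <stdio.h>
#include <stdlib.h>

#define MAXSTACK 64

static double stack[MAXSTACK];
static int sp;

static void   push(double v) { if (sp < MAXSTACK) stack[sp++] = v; }
static double pop(void)      { return sp > 0 ? stack[--sp] : 0.0; }

int main(void)
{
    char tok[64];

    while (scanf("%63s", tok) == 1) {
        if (tok[0] == '+' && tok[1] == '\0')      { double b = pop(); push(pop() + b); }
        else if (tok[0] == '*' && tok[1] == '\0') { double b = pop(); push(pop() * b); }
        else if (tok[0] == 'p' && tok[1] == '\0') { if (sp > 0) printf("%g\n", stack[sp - 1]); }
        else                                        push(strtod(tok, NULL));   /* a number */
    }
    return 0;
}
```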

  9. Generation: pushing the specification level upward
    Data is more tractable than program logic: it is easier to control, more intuitive, and superior in transparency and clarity.
    In data-driven programming, the code and the data structures are cleanly separated; when the program logic changes, you simply change the data structure without modifying the code.
    Code can also be generated, for example HTML produced by a tool. Do as little work as possible; constructive laziness is one of the cardinal virtues of the master programmer. A small data-driven sketch follows.
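A small sketch of the data-driven style this chapter describes (hypothetical names, not from the book): the links live in a table, the code just walks the table and emits HTML, so adding a menu entry means editing data rather than logic.

```c
/* Data-driven generation: emit an HTML navigation list from a table.
 * To change the menu, edit the table only; the loop never changes. */
#include <stdio.h>

struct link {
    const char *url;
    const char *label;
};

static const struct link menu[] = {
    { "/index.html",    "Home" },
    { "/docs.html",     "Documentation" },
    { "/download.html", "Download" },
};

int main(void)
{
    printf("<ul>\n");
    for (size_t i = 0; i < sizeof menu / sizeof menu[0]; i++)
        printf("  <li><a href=\"%s\">%s</a></li>\n", menu[i].url, menu[i].label);
    printf("</ul>\n");
    return 0;
}
```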

10. Configuration: starting on the right foot
Where does configuration live? It is usually queried in the following order, with later settings overriding earlier ones:
run-control files in the /etc directory (and dotfiles in the user's home directory);
environment variables, both system-wide and per-user;
command-line options.
The list above runs from the settings hardest to change to those easiest to change. A short precedence sketch follows.
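A minimal sketch of that precedence (the program name, variable, and option are made up for illustration): a compiled-in default, overridden by an environment variable, overridden in turn by a command-line option. A real program would also read /etc run-control files and dotfiles before consulting the environment.

```c
/* Configuration precedence sketch: built-in default < environment < command line.
 * (MYAPP_GREETING and -g are hypothetical names.) */
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

int main(int argc, char *argv[])
{
    const char *greeting = "hello";              /* 1. compiled-in default        */

    const char *env = getenv("MYAPP_GREETING");  /* 2. environment variable wins  */
    if (env != NULL && env[0] != '\0')
        greeting = env;

    for (int i = 1; i + 1 < argc; i++)           /* 3. command line wins over all */
        if (strcmp(argv[i], "-g") == 0)
            greeting = argv[i + 1];

    printf("%s, world\n", greeting);
    return 0;
}
```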

  11. Interfaces: user-interface design patterns in the Unix environment
    Broadly speaking, an interface is the way a program communicates with its users or with other programs.
    The Rule of Least Surprise: invent as little as possible. It is a universal principle of interface design, not limited to software, and it should not be read as mechanical conservatism. Where possible, let the interface delegate functions to familiar programs; where it cannot delegate, imitate them. The interface should be organized around the task it belongs to.
    Unix utilities have a rich range of interface styles: line-oriented, character-cell screen-oriented, and X-based graphical; different styles suit different tasks.
    Several metrics for judging an interface: concision, expressiveness, ease of use, transparency, and scriptability.
    Trade-offs between the CLI and visual interfaces: the command line is more expressive, especially for complex tasks, and has high scriptability, but it demands memorization (so its ease of use is low) and its transparency is low. Compare SQL statements with a graphical database front end, or a command-line calculator with a graphical one, and the difference is obvious.
    Unix interface design patterns, classified by input and output behavior: the filter pattern (e.g. grep), the cantrip pattern (e.g. clear), the source pattern (no input needed, e.g. ls), the sink pattern (input only, little or no output, e.g. lpr), the compiler pattern (neither standard input nor standard output for its main data; errors go to standard error), the ed pattern (e.g. ftp, sh), the roguelike pattern (character-cell screens, e.g. vi), the "separated engine and interface" pattern (of which the GUI world's MVC model is a variant), and the CLI server pattern.
    The web browser as a universal front end often removes the need to write a custom GUI front end at all. A minimal filter-pattern sketch follows.
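As an illustration of the filter pattern (my sketch, not the book's): read text on standard input, transform it, and write the result to standard output, so the program drops straight into a pipeline such as `ls | ./upper`.

```c
/* Filter pattern: stdin -> transform -> stdout; composes in pipelines. */
#include <ctype.h>
#include <stdio.h>

int main(void)
{
    int c;
    while ((c = getchar()) != EOF)
        putchar(toupper(c));    /* uppercase every byte; diagnostics would go to stderr */
    return 0;
}
```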

  12. Optimization
    Premature optimization is the root of all evil.
    Unless you must, try not to optimize at all: wait a few months and let better hardware do the work.
    Measure first, then optimize. Use a profiler to find the real bottlenecks, bearing in mind that the profiler itself introduces measurement error.
    The most effective code optimization is keeping the code short and simple. Target machines are hierarchical: keep the core data structures and the hot code in the fast caches.
    Another effect of fast processors is that performance is usually limited by I/O and by network and transaction overhead, so try to avoid chatty protocols and unnecessary round trips.
    Three general strategies for reducing latency: batching operations, overlapping operations, and caching operation results. A small caching sketch follows.
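A small sketch of the "cache the results" strategy (an illustration, not from the book): memoizing an expensive function so that repeated queries are answered from a table instead of being recomputed.

```c
/* Caching operation results: memoize an artificially expensive function. */
#include <stdio.h>

#define CACHE_SIZE 256

static double cache[CACHE_SIZE];
static int cached[CACHE_SIZE];          /* 1 if cache[n] already holds the answer */

/* Stand-in for a genuinely expensive computation. */
static double expensive(int n)
{
    double x = 0.0;
    for (long i = 0; i < 10 * 1000 * 1000; i++)
        x += (double)(n + i % 7) / (i + 1);
    return x;
}

static double cached_expensive(int n)
{
    if (n < 0 || n >= CACHE_SIZE)       /* out of cache range: just compute */
        return expensive(n);
    if (!cached[n]) {                   /* first request: compute and remember */
        cache[n] = expensive(n);
        cached[n] = 1;
    }
    return cache[n];                    /* later requests: served from the cache */
}

int main(void)
{
    printf("%f\n", cached_expensive(3));   /* slow: computes and fills the cache */
    printf("%f\n", cached_expensive(3));   /* fast: answered from the cache      */
    return 0;
}
```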

  13. Complexity: as simple as possible, but no simpler
    "Keep it simple, stupid": working out what "simple" really means is itself surprisingly complicated.
    Three sources of complexity: implementation complexity borne by programmers, interface complexity borne by users, and the sheer amount of code. There is no standard answer for how to trade them off.
    The tale of five editors shows how editors at different levels of complexity arise from different choices about how much complexity to take on when handling harder tasks.
    The rule of parsimony: write a big program only when it is clear that nothing else will do.

III- Tools
14. Languages: to C or not to C?
Unix supports a great many languages, first because Unix has long been a popular research and teaching platform, and second because matching the design and implementation language to the application pays off enormously in productivity.
C is very powerful and very economical, but it leaves memory management entirely to the programmer, which is complex and error-prone. As hardware performance has risen, the main bottlenecks have shifted to I/O waits, network latency, cache-line fills, and similar constraints, so languages such as Python and Java have slowly been on the rise.

  15. Tools: the tactics of development
    This chapter describes development tactics under Unix: compiling code, configuration management, profiling, debugging, and automating all kinds of dirty work. This toolkit is more flexible than an IDE.
    Choosing an editor: vi or Emacs.
    Special-purpose code generators: yacc and lex.
    Automating compilation: make. An old joke: type "make love" and the output is "Don't know how to make love".
    Version control systems.
    Profiling: gprof.

  16. Reuse: on not reinventing the wheel
    Many people like to build their own wheels, because an existing library may be opaque, full of bugs, and not entirely under their own control.
    With open source the advice is to pick a good existing wheel instead, which saves time and improves efficiency.
    There are many open-source hosting sites, such as GitHub.
    Pay attention to license issues.

IV- Community
17. Portability: software portability and keeping up standards
Portability has always been one of Unix's main advantages. The discipline of portability tends to push architecture, interfaces, and implementation toward simplification, which raises a project's chance of success and reduces life-cycle maintenance costs.
The C standards, the Unix standards, and the IETF RFC process produced a series of standards that make interfaces more uniform and programs easier to port.
Portable programming languages: Java and Python both have good portability.
Portability tools: autoconf handles porting issues so that configure / make / make install builds cleanly.
Portability requires standards, and open source has in turn had an important influence on the standardization process itself.

  18. Documentation: explaining your code to a web-centric world
    Documentation tools generally fall into two classes, WYSIWYG (word-processor style) and markup-centered (XML, markdown), each with its own advantages and disadvantages;
    best practices for writing Unix documentation: keep the information density moderate, do not omit functional details or known problems, and put the documentation on the web.

  19. Open source: programming in the new Unix community
    The rules of open-source development are simple: open the source; release early, release often; and credit contributions (with material or at least psychological rewards).
    Best practices for working with open-source developers: a version control system (Git, SVN, and so on), good code comments, good coding standards and file-naming conventions, thorough testing before release, and good communication practices (mailing lists, a project website, and so on).
    Understand the logic of licensing and choose a suitable license.

  20. Futures: dangers and opportunities
    In retrospect, three particular technology changes drove the major shifts in Unix-style design: internetworking, bitmapped graphics displays, and the personal computer.
    Despite a great deal of innovation, the responses to all three technologies held onto the Unix design criteria: modularity, transparency, separation of policy from mechanism, and the other qualities mentioned earlier.
    Unix's support for GUIs is weak, and the Unix API does not use exceptions (the C language lacks a mechanism for throwing them).
    Historically, as long as we can learn from our mistakes and pass the cultural torch on, Unix will not lose.
