(2) Mastering basic Linux server usage - compiling simple C/C++ programs and projects under Linux

1. Static libraries and dynamic libraries

 

Static library: A static library is a compiled library file whose code is linked into the program at compile time, so it becomes part of a self-contained executable. Each program that uses a static library carries its own copy of the library code. Static library files usually have a .a suffix.
Advantages:
·Independence: The library code is statically linked into the program, so the program is self-contained and can run on its own.
·Version control: The library code is embedded in the program and is not affected by the version of any external library.
Disadvantages:
·Wasted memory: Each program that uses a static library holds its own copy of the library, which may waste memory.
·Difficult updates: Updating the library requires recompiling and relinking the entire program, so deployment and maintenance are relatively complicated.

Dynamic library: A dynamic library is a compiled library file whose code is loaded into memory by the operating system when the program runs.
Multiple programs can share the same loaded instance of a dynamic library, which reduces overall memory usage. Dynamic library files usually have a .so (Shared Object) suffix and are shared components that are loaded dynamically at runtime.
Advantages:
·Memory efficiency: Multiple programs can share the same loaded library, reducing memory usage.
·Update and maintenance: Updating the library only requires replacing the library file; the program does not need to be recompiled.
·Flexibility: A program can load and unload libraries dynamically, improving its flexibility and maintainability.
Disadvantages:
·Runtime dependency: The program needs the dynamic library at run time; if the library file is missing, the program will not run.
·Deployment complexity: The correct version of the dynamic library must be available on the target system.
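
As a side note, the standard ldd command can be used to check which shared libraries an executable depends on and whether the loader can find them (the program path below is only an example):
#List the shared libraries an executable depends on and where they resolve to
        ldd /usr/bin/ls
A library reported as "not found" in this output is one the program cannot run without.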

2. Software source code compilation and installation

Check the compilation tools:
Before starting compilation, make sure the tools required for compilation are installed on the system, such as a compiler (gcc), a build tool (make), and development libraries (such as libc). These tools can be installed through the system's package manager.
Configure compilation options:
Enter the source code directory and run the configuration script to configure compilation options. Usually, you can use the ./configure command to configure, but some software may have its own configuration script. You can use different options to enable or disable features, specify the installation path, and more.
Compile the source code:
Run the make command to compile the source code. This will generate the executable and other necessary files. This step is critical.
Install the software:
Run the sudo make install command to install the compiled files into your system. This will copy the files to the system's standard installation path, usually /usr/local.
If you are interested, you can try some simple installation packages from the Internet.
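
For example, building a typical autotools-based source package usually looks like the following (the tarball name example-1.0.tar.gz is only an illustration):
#Unpack the source archive and enter the source directory
        tar -xzf example-1.0.tar.gz
        cd example-1.0
#Configure, optionally specifying the installation prefix (the default is usually /usr/local)
        ./configure --prefix=/usr/local
#Compile the source code
        make
#Install the compiled files into the system
        sudo make install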

3. C and C++ compilation process

The compilation process goes through several stages and produces several kinds of files: preprocessed source (.i), assembly code (.s), object files (.o), and finally the executable.

gcc-related usage

Use gcc to compile:

Method 1: use gcc step by step
#Preprocessing, generate intermediate files (.i)
        gcc -E source.c -o source.i
#Compile, generate assembly code (.s)
        gcc -S source.i -o source.s
#Assemble, generate the object file (.o)
        gcc -c source.s -o source.o
#Link, generate executable file
        gcc source.o -o my_program
Method 2: use gcc and keep the intermediate results
gcc -save-temps source.c -o my_program

Common compilation options for gcc

Complete gcc usage can be viewed through the man gcc command

Header file path options:
        -I: Specify the search path for header files.
Multi-threading options:
        -pthread: Enable POSIX thread support.
Code generation options:
        -fPIC: Generate position-independent code.
        -fno-stack-protector: Disable stack protection.
        -fno-exceptions: Disable C++ exception handling
Compile target architecture options:
        -march: Specify the target architecture, such as -march=native.
        -m32: Compile for 32-bit target.
        -m64: Compile for 64-bit target.

 Warning related:

-Wall: Turn on most warnings.
-Werror: Treat warnings as errors.
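
Several of these options are often combined on a single command line; a small sketch (the file name, include directory and output name here are made up for illustration):
#Compile a threaded program with all warnings enabled and treated as errors,
#searching the ./include directory for header files
        gcc -Wall -Werror -pthread -Iinclude main.c -o my_program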


Advanced gcc usage - creation of static libraries and dynamic libraries:

1. Basic steps to create a static library:
        1. Compile the source files to generate object files (.o files):
        gcc -c file1.c file2.c
        2. Create the static library with ar:
        ar rcs libmylib.a file1.o file2.o
2. Basic steps to create a dynamic library:
        1. Compile the source files to generate position-independent object files:
        gcc -fPIC -c file1.c file2.c
        2. Create the dynamic library:
        gcc -shared -o libmylib.so file1.o file2.o
        Note that the -fPIC option is needed when creating a dynamic library so that position-independent code is generated and the library can be loaded at different memory addresses.
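
To use the libraries built above, a program is linked against them. A minimal sketch, assuming a main.c that calls functions from libmylib and the library files created in the previous steps:
#Link against libmylib in the current directory
#(if both libmylib.so and libmylib.a exist, gcc prefers the shared one)
        gcc main.c -L. -lmylib -o my_program
#To force use of the static library, name the archive file directly
        gcc main.c libmylib.a -o my_program_static
#When the shared library is used, the dynamic loader must be able to find it at run time
        export LD_LIBRARY_PATH=$PWD:$LD_LIBRARY_PATH
        ./my_program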

4. make command

make is an automated build tool that manages the compilation and build process of source code. It can automatically determine which files need to be recompiled based on rules and dependencies, making the entire build process more efficient and automated. make uses a text file called a Makefile to describe compilation and build rules.

Makefile:

Makefile defines how to compile source code, how to generate object files, and how to generate the final executable file or library. Makefiles use a script-like syntax that contains targets, dependencies, and commands.

Makefile basics:
        Target: The target is a name in the Makefile that represents the file or operation you want to build. Targets can be executable files, library files, pseudo-targets (used to perform specific operations, such as cleaning files), etc.
        Dependencies: Each target can have zero or more dependencies, representing other files or targets required to generate the target. Dependencies tell make which files need to be updated or regenerated before building the target.
        Rules: Rules define how a target is generated from its dependencies. A rule consists of a target, its dependencies, and build commands.
        Commands: A command is a series of steps defined in a rule and used to generate the target from its dependencies. Each command line must begin with a Tab character and lists the actual compilation, linking and other operations.
        Variables: Variables are used to store and transfer values, making the Makefile more maintainable. You can use variables to store compiler options, source file lists, and more.
        Comments: Comments are used to add instructions to the Makefile so that others can understand the build process and rules.

Makefile content example:

CC = gcc
CFLAGS = -Wall

myprogram: main.c utils.c
        $(CC) $(CFLAGS) -o myprogram main.c utils.c

clean:
        rm -f myprogram

Common commands for make:

make -f file: Specify a Makefile with a name other than the default and use it to perform the build.
make -j N: Build using N parallel jobs, where N is the number of parallel tasks. For example, make -j4 builds with 4 jobs in parallel.
make -C dir: Run make in the specified directory. For example, make -C src performs the build in the src directory.
make -B or make --always-make: Force a rebuild of the target, even if the target is already up to date. This is useful in situations where a forced rebuild is required.

cmake, gmake, qmake:

CMake (which needs to be installed separately) is a cross-platform build tool used to generate build files (such as Makefiles or Visual Studio projects) for different compilers and operating systems. It uses a script-like language, written in CMakeLists.txt files, to describe the project's build process and then generates the corresponding build files. A major advantage of CMake is that it can generate the build files required by many different build systems.
gmake (GNU make) also exists in different operating systems, but its Makefile may need to be modified on different platforms to adapt to different compilers and operating systems.
qmake is a build tool included with the Qt framework and is used to generate build files for Qt projects. It uses .pro files to describe the project's configuration and build rules. qmake can generate Makefiles or Visual Studio project files to build Qt projects on different platforms.
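
As a sketch of the usual CMake workflow on Linux (assuming the project already provides a CMakeLists.txt in the source directory):
#Create a separate build directory and generate Makefiles there
        mkdir build
        cd build
        cmake ..
#Build using the generated Makefiles
        make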

5. Common errors during software compilation

        Dependency problem: The required dependent libraries or tools are missing, causing the compilation process to fail. Solutions include installing missing dependencies, updating versions, or specifying correct dependency paths.
        Compiler error: The compiler reports errors or warnings, which may be due to syntax errors, type mismatches, etc. The solution involves modifying the source code to fix the problem, ensuring that the code complies with the compiler specifications.
        Library path problem: The compiler cannot find the required library file, possibly because the library path is not configured correctly. Solutions include specifying the correct library path, updating library links, etc.
        Version incompatibility: The compiled code may be incompatible with a specific version of a library, compiler, or operating system. Workarounds may involve updating or downgrading the software to meet compatibility requirements.
        Missing or damaged files: Source code or dependent files may be missing, damaged, or incompletely downloaded, causing compilation to fail. Solutions include re-downloading files, repairing damaged files, etc.

……

6. Introduction to the basic concepts of conda

conda is an open source package management and environment management tool, mainly used in fields such as data science, machine learning and scientific computing. It allows users to easily create, manage and share different virtual environments, and to install and manage different versions of software packages and libraries.
conda was originally part of the Anaconda distribution and later became widely used as a standalone tool.


1. Package version management : conda allows users to install specific versions of software packages and switch between different versions. This is useful for ensuring project consistency across different environments and platforms.
2. Environment management : conda supports the creation and management of independent virtual environments. Each environment can have its own dependencies, configuration, and Python version. This makes it easier to manage multiple projects simultaneously on the same machine.
3. Virtual environment integration : conda can create and manage virtual environments, but it can also be integrated with other virtual environment tools (such as virtualenv) to provide greater flexibility.

Basic commands for conda:
Create a virtual environment: conda create --name myenv
Activate or enter a virtual environment: conda activate myenv

Install packages in a virtual environment: conda install numpy
List installed packages: conda list
Uninstall packages: conda remove numpy
Exit the virtual environment: conda deactivate

Delete a virtual environment: first exit the virtual environment, then run conda env remove --name <environment name>.
Create an environment configuration file: conda env export > environment.yml
Create an environment from a configuration file: conda env create -f environment.yml
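
Putting these commands together, a typical session might look like the following (the environment name and Python version are only examples):
#Create an environment with a specific Python version and enter it
        conda create --name myenv python=3.10
        conda activate myenv
#Install packages, then record the environment so it can be shared
        conda install numpy
        conda env export > environment.yml
#Leave the environment when done
        conda deactivate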

The path of a virtual environment I created looks like: /home/<user name on the server>/anaconda3/envs/<environment name>
More detailed help information can be viewed through conda --help or conda <subcommand> --help.

Similarly, virtualenv is a very useful tool, especially for multi-project development environments. It makes Python project development more flexible, reliable and maintainable by isolating environments, managing dependencies and providing independent Python versions.

Origin blog.csdn.net/weixin_48060069/article/details/132276001