CURL Library & Makefile Directives

Questions and Answers

Within the context of the curl library and its real-world application in command-line URL data transfer, which of the following architectural decisions would most critically influence the library's ability to handle an exceptionally large number of concurrent connections without a degradation in performance, especially considering the inherent limitations of system resources such as file descriptors and memory?

  • Implementing a hybrid approach that dynamically switches between blocking and non-blocking I/O based on the detected network conditions and system load, aiming to optimize resource utilization in real-time.
  • Employing a purely blocking I/O model with a thread-per-connection architecture to ensure simplicity of implementation and debugging, relying on the operating system's scheduler for concurrency.
  • Utilizing a non-blocking I/O multiplexing scheme (e.g., `epoll`, `kqueue`, `select`) combined with an event-driven architecture to efficiently manage numerous sockets within a single thread or a small pool of threads. (correct)
  • Relying on asynchronous I/O operations (AIO) where available, coupled with a completion-based model to offload I/O processing to the operating system kernel, thereby minimizing userspace overhead.

Given the modular structure advocated for the CIS*2750 course, where code is in src/, headers in include/, and a Makefile in the main directory, and considering the complexities of managing dependencies in large software projects, which of the following Makefile directives would most effectively ensure that object files are recompiled only when necessary, minimizing build times during iterative development, particularly when header files are modified?

  • Employing a simple rule that recompiles all source files whenever any file in the `include/` directory is modified, ensuring that all dependencies are always up-to-date, regardless of the actual impact of the changes.
  • Explicitly listing each header file as a dependency for every source file in the `Makefile`, resulting in verbose and potentially error-prone dependency specifications, but ensuring that all possible dependencies are accounted for.
  • Using pattern rules with automatic variables (e.g., `$@`, `$<`, `$^`) to define dependencies between object files and their corresponding source and header files, such that changes to a header file only trigger recompilation of source files that directly include it. (correct)
  • Implementing a recursive `Makefile` structure where each subdirectory has its own `Makefile` that handles the compilation of files within that directory, relying on the top-level `Makefile` to orchestrate the overall build process without detailed dependency tracking.
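A minimal sketch of the pattern-rule approach described in the correct option above, following the course's src/ and include/ layout; the obj/ directory, program name, and variable values are assumptions for illustration.

CC     = gcc
CFLAGS = -std=c11 -Wall -g
SRCS   = $(wildcard src/*.c)
OBJS   = $(SRCS:src/%.c=obj/%.o)

myprog: $(OBJS)
	$(CC) $(OBJS) -o $@

# $@ = target, $< = first prerequisite, $^ = all prerequisites
obj/%.o: src/%.c include/%.h
	$(CC) $(CFLAGS) -Iinclude -c $< -o $@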

When designing a generic List ADT in C, intended to store arbitrary data types, and considering the trade-offs between type safety, performance, and memory management, which of the following approaches would provide the most flexible and robust solution for handling data of varying sizes and structures, while minimizing the risk of memory leaks and segmentation faults, especially in a multi-threaded environment?

  • Utilizing a macro-based generic programming technique to generate specialized list implementations for each data type at compile time, achieving optimal performance and type safety but increasing code duplication and compilation time.
  • Using a `void*` pointer in the list nodes to store data, requiring explicit type casting and size management by the user, thereby maximizing flexibility but increasing the risk of type errors and memory corruption.
  • Employing a tagged union (variant record) to store different data types directly within the list node, eliminating the need for dynamic memory allocation but limiting the range of supported types and potentially wasting memory due to the fixed size of the union.
  • Leveraging opaque pointers and function pointers to abstract away the underlying data type and provide a type-safe interface for list operations (e.g., insertion, deletion, comparison), allowing the user to define custom data types and associated operations without exposing the internal representation of the list. (correct)
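A rough header sketch of the opaque-pointer/function-pointer approach from the correct option above. The names (List, list_create, CompareFn, DeleteFn) are hypothetical and are not the course's actual List API.

/* list.h -- hypothetical generic List ADT interface */
#ifndef LIST_H
#define LIST_H

typedef struct List List;                 /* opaque: the struct is defined only in list.c */

typedef int  (*CompareFn)(const void *a, const void *b);   /* user-supplied comparison  */
typedef void (*DeleteFn)(void *data);                       /* user-supplied destructor  */

List *list_create(CompareFn cmp, DeleteFn del);
int   list_insert_front(List *list, void *data);            /* data is owned by the list */
void *list_remove_front(List *list);                        /* ownership returns to caller */
void  list_destroy(List *list);                             /* calls DeleteFn on each node */

#endif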

Considering the fundamental operations of a List ADT (create, insert, remove, retrieve, iterate, clear/delete), and their implications for algorithmic complexity and memory management, which of the following implementations would offer the best average-case performance for frequent insertion and removal of nodes at arbitrary positions within the list, while also minimizing memory fragmentation and overhead, particularly in scenarios with a large number of list modifications?

  • A skip list, providing logarithmic time complexity for insertion, removal, and search operations with probabilistic balancing, but requiring more complex implementation and potentially higher memory overhead due to the multiple levels of linked nodes. (correct)

Given the inherent limitations of fixed-length arrays when dealing with dynamic data sets, and the need for a flexible data structure like a linked list in C, which of the following strategies would be most effective in mitigating the risks associated with memory leaks and dangling pointers when implementing list operations such as insertion, deletion, and clearing of the list, especially in a long-running application with frequent list modifications?

  • Employing smart pointers (e.g., from C++ or a custom implementation in C) to automatically manage the lifetime of list nodes and ensure that they are deallocated when the last reference to them is removed, preventing memory leaks and dangling pointers, but requiring careful consideration of ownership and circular references. (correct)

In the context of makefiles, under which precise circumstance is the command line associated with a target executed?

  • When any prerequisite file has a modification timestamp that is chronologically later than the target file's modification timestamp. (correct)

What is the essential syntactical requirement that prefixes the command line within a makefile, and what potential pitfall does this present?

  • A single tab character, the incorrect use of which is a common source of errors and frustration. (correct)

When the make command is invoked without specifying a target, what is the utility's default behavior regarding target selection?

  • It builds the first target that appears within the makefile, assuming a top-down parsing order. (correct)

Consider a large-scale software project managed by make. What ramifications would arise if the makefiles were designed to recompile all files regardless of dependency modification times?

  • The efficiency of the build process would be drastically reduced, leading to longer compilation times, especially detrimental for large projects. (correct)

Examine the following makefile snippet:

obj/%.o: src/%.c include/%.h
	$(CC) -c -o $@ $< $(CFLAGS)

If the source file src/main.c and its corresponding header include/main.h are both newer than obj/main.o, which statement accurately describes the expected behavior of make?

  • `make` will recompile `src/main.c` into `obj/main.o` because both `src/main.c` and `include/main.h` are prerequisites and newer than the target. (correct)

Given the inherent limitations of the make utility as described, what advanced build system paradigm addresses the challenges of scaling to extremely large, heterogeneous projects with complex dependency graphs and toolchain requirements?

  • Declarative build systems, employing domain-specific languages to define build rules and dependencies abstractly. (correct)

In the intricate dance of software compilation, what is the subtle but profound distinction between a 'prerequisite' and a 'dependency' within the conceptual framework of make?

  • A 'prerequisite' is a source file that needs to be compiled if modified, while a 'dependency' is any file that has an impact on the target (header files included). (correct)

Consider a scenario where a makefile target relies on a dynamically linked library (.so file) as a prerequisite. If the shared library is updated, but the target is not recompiled, what potential runtime issues might arise, and how would you diagnose them?

  • The program will exhibit undefined behavior stemming from ABI (Application Binary Interface) incompatibilities between the target and the updated library. Use `ldd` to inspect library dependencies and `objdump` to compare symbol versions. (correct)

Delving into the nuances of parallel builds with make -j, under what conditions might the theoretical linear speedup with respect to the number of cores be unrealizable, and what strategies can mitigate these bottlenecks?

  • All of the above. (correct)

Given a scenario where a makefile is intended to manage the build process across multiple platforms with varying toolchains and library locations, what advanced techniques can be employed to ensure portability and adaptability without manual intervention?

  • Employing Autoconf and Automake to generate platform-specific makefiles by probing the build environment for available tools and libraries. (correct)

Within a large-scale software project employing a complex build system, what is the most critical implication of violating the principle of including header files solely by their name (e.g., #include "MyClass.h") and instead embedding explicit relative paths (e.g., #include "../include/MyClass.h") throughout the codebase?

  • It introduces 'brittleness' into the build process, such that seemingly minor refactoring of the directory structure or changes to build configurations can lead to widespread compilation failures that are difficult to diagnose and resolve. (correct)

In the context of C preprocessor directives for header file inclusion, if a header file my_header.h is included using angle brackets (#include <my_header.h>), what is the guaranteed behavior regarding the search path used by the preprocessor to locate the file, and what are the security implications?

  • The preprocessor searches only the standard system directories, providing a secure method to prevent including user-supplied headers that might shadow system headers, thus mitigating potential vulnerabilities. (correct)

Consider a scenario where header1.h includes header2.h, and header2.h includes header1.h. Without include guards, what specific compilation error is most likely to occur, and how do include guards prevent it?

  • A preprocessor error due to infinite recursion will occur. Include guards prevent this by ensuring that a header file is only processed once, thus breaking the circular dependency. (correct)
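A small sketch of the header1.h/header2.h cycle from the question above, with guards breaking the recursion; the type names are made up.

/* header1.h */
#ifndef HEADER1_H
#define HEADER1_H
#include "header2.h"          /* safe: the guards stop the cycle */
typedef struct { int one; } TypeOne;
#endif

/* header2.h */
#ifndef HEADER2_H
#define HEADER2_H
#include "header1.h"          /* HEADER1_H is already defined, so its body is skipped */
typedef struct { int two; } TypeTwo;
#endif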

You are tasked with optimizing the build process of a large C project. The build times are excessively long due to redundant recompilation of source files caused by unnecessary header file inclusions. Which strategy provides the most effective and maintainable solution for minimizing these unnecessary dependencies and reducing build times?

  • Employ 'forward declarations' in header files instead of including full header files whenever possible, and refactor the code to minimize the number of header files each compilation unit includes directly. This is combined with a strict policy enforcing minimal header inclusion. (correct)
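An illustration of the forward-declaration strategy in the answer above; Widget and Panel are invented names.

/* panel.h -- only stores a pointer to Widget, so no #include "widget.h" is needed */
#ifndef PANEL_H
#define PANEL_H

struct Widget;                    /* forward declaration of an incomplete type */

typedef struct Panel {
    struct Widget *active;        /* pointers to incomplete types are allowed  */
    int width, height;
} Panel;

void panel_set_active(Panel *p, struct Widget *w);

#endif

/* Only panel.c, which dereferences the Widget, needs to include widget.h. */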

Within a highly modular C project, strict coding standards mandate the use of opaque pointers to enforce information hiding and maintain abstraction barriers between modules. How should header files be structured to declare and use these opaque pointers correctly without exposing internal implementation details to client code?

  • In the header file, forward declare the structure (e.g., `struct OpaqueType;`) and typedef a pointer to this incomplete type (e.g., `typedef struct OpaqueType *OpaqueTypePtr;`). Provide accessor functions to manipulate the structure. (correct)

You're developing a cross-platform library that must compile correctly on systems with varying file system case sensitivity (e.g., Windows vs. Linux). The library relies on several header files, and you want to ensure portability without requiring users to modify include paths. What is the most robust strategy to handle potential case mismatches in header file names during inclusion?

  • Enforce a strict naming convention in coding standards where all header file names are consistently lowercase. Then provide build scripts that check and enforce this convention before compilation, flagging any violations as errors. (correct)

In the context of C preprocessor, how does the #pragma once directive differ fundamentally from the traditional #ifndef include guard mechanism in preventing multiple inclusions of header files, and what are the critical trade-offs to consider when choosing between them in a large, complex project targeting diverse compilers and platforms?

  • `#pragma once` relies on compiler-specific implementation details and filesystem features (inode), offering faster compilation speeds but potentially sacrificing portability across different compilers and systems, whereas `#ifndef` is universally supported but incurs a slight performance overhead due to preprocessor symbol checks. (correct)
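The two guard styles side by side, for reference; MY_HEADER_H is an arbitrary guard symbol chosen for illustration.

/* Traditional include guard: portable to every standard C compiler */
#ifndef MY_HEADER_H
#define MY_HEADER_H
/* ... declarations ... */
#endif

/* Alternative: #pragma once -- supported by most mainstream compilers, but not required by the C standard */
#pragma once
/* ... declarations ... */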

In the compilation process facilitated by gcc, what is the most salient reason for its involvement in invoking tools such as cpp and ld beyond mere execution?

  • To abstract the complexities of direct tool interaction, ensuring commands remain concise and portable across diverse system architectures, preempting verbose and non-portable command structures. (correct)

When a program using dynamically linked libraries is executed, ld.so resolves library references on demand. What constitutes the most critical rationale for this lazy resolution approach?

  • To reduce the initial memory footprint of the process by only loading libraries and resolving symbols as they are actively invoked, optimizing memory usage and startup time. (correct)

In the context of dynamic linking, what is the most precise technical distinction between the roles of ld.so and dlopen in managing shared object dependencies during program execution?

  • `ld.so` resolves dependencies at program startup or the first call, whereas `dlopen` allows explicit loading and linking of shared objects at arbitrary points during runtime. (correct)
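A minimal dlopen/dlsym sketch contrasting explicit runtime loading with ld.so's automatic resolution; the plugin file name and symbol name are placeholders, and the program would be linked with -ldl on Linux.

#include <stdio.h>
#include <dlfcn.h>

int main(void) {
    void *handle = dlopen("./libplugin.so", RTLD_LAZY);     /* explicit load at runtime */
    if (!handle) {
        fprintf(stderr, "dlopen failed: %s\n", dlerror());
        return 1;
    }

    void (*init)(void) = (void (*)(void)) dlsym(handle, "plugin_init");
    if (init)
        init();                                             /* call into the plugin   */

    dlclose(handle);                                        /* release our reference  */
    return 0;
}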

Consider a scenario where a software application utilizes dlopen to load a shared object plugin. What inherent risk is most significantly amplified by this practice, necessitating rigorous mitigation strategies?

  • Increased vulnerability to arbitrary code execution due to the potential for loading malicious or compromised shared objects at runtime. (correct)

Given an application that utilizes dlopen to dynamically load modules, and further considering the possibility of multiple modules exporting symbols with identical names, which resolution strategy would most effectively prevent symbol collision and ensure proper module isolation?

  • Employing unique namespace prefixes for all exported symbols within each module, thereby creating distinct symbol identifiers that prevent naming conflicts at runtime. (correct)

Imagine a performance-critical application that relies heavily on dynamically loaded shared objects. What optimization strategy would have the most profound impact on minimizing latency associated with dlopen calls and subsequent function invocations?

  • Implementing a caching mechanism for recently loaded shared objects, allowing subsequent `dlopen` calls to retrieve modules from the cache instead of loading them from disk. (correct)

In the context of ensuring the integrity and security of dynamically loaded shared objects, which mechanism offers the most robust protection against unauthorized modification and tampering?

  • Employing cryptographic signatures and verification upon loading, ensuring that only shared objects signed by trusted authorities are loaded, thus preventing the execution of tampered code. (correct)

Considering an application that frequently loads and unloads shared objects using dlopen and dlclose, what is the most critical measure to prevent memory leaks and ensure long-term stability?

  • Implementing a reference counting mechanism to track the number of active references to each shared object and unloading the module only when the reference count reaches zero. (correct)

A developer reports that a program crashes when attempting to dlopen a certain shared library. Using tools like ldd, it's noted that all dependencies appear to be resolved. However, the crash persists specifically when the library is being loaded, prior to any function calls. What is the most probable underlying cause of this crash?

  • The shared library contains static initializers that execute during the loading process and trigger a segmentation fault or unhandled exception due to environmental assumptions. (correct)

Within the context of a complex, multi-module C project using make, which of the following scenarios MOST accurately exemplifies the strategic use of CPPFLAGS to ensure both code portability and maintainability across diverse development environments?

  • Centralizing the management of include paths for external libraries shared across multiple modules by setting `CPPFLAGS` at the project level, thereby preventing redundant specifications in individual module Makefiles and facilitating easier updates when library locations change. (correct)

Consider a scenario where a Makefile needs to conditionally compile different source files based on the operating system. Which approach demonstrates the MOST robust and portable method to achieve this using Make's built-in features and conditional directives?

  • Leveraging Make's conditional functions (e.g., `$(findstring ...)`) in conjunction with predefined macros that expose system-specific information, allowing for platform-aware compilation rules within the Makefile itself. (correct)
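A sketch of platform-aware compilation using GNU make conditionals and $(shell uname), as in the answer above; the platform-specific file names are invented.

UNAME_S := $(shell uname -s)

ifeq ($(UNAME_S),Linux)
    PLATFORM_SRC = src/io_linux.c
endif
ifneq (,$(findstring Darwin,$(UNAME_S)))
    PLATFORM_SRC = src/io_macos.c
endif

myprog: src/main.c $(PLATFORM_SRC)
	$(CC) $(CFLAGS) $^ -o $@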

In a complex software project utilizing multiple external libraries, what is the MOST effective strategy for incorporating these libraries into the build process using a Makefile, ensuring both ease of maintenance and flexibility in library versioning?

  • Defining Make variables to represent the include paths and linker flags for each library, and then using these variables consistently throughout the Makefile, allowing for easy modification of library locations and versions. (correct)

Given a scenario where a Makefile is intended to manage the build process for a project involving both C and Fortran code, what is the MOST strategic approach to ensure seamless interoperability between the compiled objects, particularly with regard to name mangling and calling conventions?

  • Developing a thin C wrapper layer that encapsulates the Fortran code, exposing a C-compatible API that can be easily called from other C modules, thereby isolating the Fortran-specific aspects of the project. (correct)

Consider a Makefile designed for a large-scale, parallel build process on a multi-core system. Which of the following strategies would MOST effectively mitigate the risk of race conditions and ensure data integrity during concurrent compilation of multiple source files?

  • Organizing the project's source code into a directed acyclic graph (DAG) of dependencies, and then using Make's dependency tracking features to ensure that source files are compiled in the correct order, preventing premature access to incomplete or inconsistent data. (correct)

You are tasked with optimizing a Makefile for incremental builds in a project where header files are frequently modified. Which strategy MOST effectively minimizes unnecessary recompilations when changes are made to header files?

  • Using the `-MMD` flag with GCC to automatically generate dependency files that track header file dependencies, and then including these dependency files in the Makefile to ensure that source files are recompiled only when necessary. (correct)
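A sketch of the automatic dependency generation described above, using gcc's -MMD/-MP flags; the directory names follow the course layout, the rest is illustrative.

CFLAGS = -std=c11 -Wall -MMD -MP      # -MMD writes obj/foo.d alongside obj/foo.o; -MP adds phony header targets
SRCS   = $(wildcard src/*.c)
OBJS   = $(SRCS:src/%.c=obj/%.o)
DEPS   = $(OBJS:.o=.d)

obj/%.o: src/%.c
	$(CC) $(CFLAGS) -Iinclude -c $< -o $@

-include $(DEPS)                      # pull in the generated .d files if they exist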

In the realm of cross-compilation using Makefiles, what constitutes the MOST robust approach to configuring the build environment to target an embedded system with a different architecture and operating system than the host machine?

  • Employing a containerized build environment (e.g., Docker) that encapsulates the cross-compiler toolchain and all necessary dependencies, thereby ensuring a consistent and reproducible build process across different host machines. (correct)

When designing a Makefile to manage a project that includes generated source code (e.g., code generated by lex or yacc), what is the MOST critical consideration to ensure that the generated code is always up-to-date before compilation?

  • Explicitly defining dependencies between the generated source files and the generator tools (e.g., `lex`, `yacc`) in the Makefile, ensuring that the generators are invoked whenever their input files are modified. (correct)
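A sketch of explicit dependencies for generated sources, shown here with the GNU implementations flex and bison; the grammar and lexer file names are made up.

parser.c parser.h: parser.y
	bison -d -o parser.c parser.y      # also writes parser.h

lexer.c: lexer.l parser.h
	flex -o lexer.c lexer.l

myprog: main.c parser.c lexer.c
	$(CC) $(CFLAGS) main.c parser.c lexer.c -o myprog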

In the context of using Makefiles for continuous integration (CI) and continuous deployment (CD) pipelines, what is the MOST effective method to ensure that builds are reproducible and isolated from the host environment's configuration?

  • Using a virtual machine or container to encapsulate the build environment, ensuring that all dependencies and tools are explicitly defined and managed within the virtualized environment. (correct)

Considering a complex Makefile that manages the build process for a project with numerous conditional compilation flags, what is the MOST maintainable approach to manage these flags across different build configurations (e.g., debug, release, profiling)?

  • Employing a configuration management tool (e.g., `autoconf`, `cmake`) to generate the Makefile based on the desired build configuration, thereby abstracting away the complexity of managing conditional flags directly. (correct)

Flashcards

cpp meaning

In the context of this course, "cpp" refers to the C preprocessor, not the C++ programming language.

"Best practice".h pattern

A "best practice" pattern in header files to prevent multiple or circular includes.

#ifndef

The #ifndef preprocessor directive checks if a macro is not defined.

#define

The #define preprocessor directive creates a macro.

#endif

Indicates the end to a preprocessor conditional block.

#include <file>

Used for system header files; searches standard system directories.

#include "file"

Used for header files of your own program; searches current and 'quote' directories.

-I flag in GCC

Specifies paths for the preprocessor to find header files (for #include directives).

-L flag in GCC

Specifies paths for the linker to find libraries.

-l flag in GCC

Specifies the names of libraries that the linker must link with.

GCC's Role

GCC acts as a front end, invoking cpp (preprocessor), gcc (compiler), and ld (linker).

ld.so

The dynamic linker/loader that loads shared libraries when a program starts.

Loader Function

Copies segments of the program file into different regions of the process's virtual memory (text, data, stack, heap).

dlopen

Loads shared object 'plugins' on demand at runtime.

Use case of dlopen

Used by programs to load shared object plugins on demand at runtime.

ldd command

A command to see the shared library dependencies of a shared object on Linux.

Prerequisite (in Make)

A file that must exist for a target to be buildable.

Command Line (in Make)

Executed to build the target using the dependencies.

Tab Character in Makefiles

The command line must start with this character.

Typing make (no target)

Looks for Makefile or makefile and builds the first target.

Typing make <target>

Builds the specified target.

Makefile Recompilation Trigger

If any prerequisite is newer than the target, triggering a rebuild.

Efficiency of Makefiles

Recompiles only the necessary files instead of everything.

Makefile

A file containing instructions for the make command.

Target (in Make)

The output or executable file that make aims to create.

Dependencies (in Make)

Files used to create the target. If these change, the target may need to be rebuilt.

Course code structure

Convention for organizing code: Source files go in 'src', headers in 'include', Makefile in the main directory.

Makefile macros

Macros in Makefiles define locations of code (SRC) and headers (INC).

Why Linked Lists?

Linked lists provide a flexible alternative to fixed-length arrays for handling optional data.

List Operations

Common operations include create, insert (front/back), remove (front/back), retrieve, iterate, clear/delete, and create a human-readable representation.

Data Structure Genericity

A well-designed data structure should be generic, accommodating various data types.

Make's built-in rules

Built-in rules that use the correct compiler to turn source code into an executable.

Flags in Makefiles

Options for the tools in the C toolchain, set in Makefiles.

CC variable in Makefile

Specifies the C compiler to be used, e.g., CC=gcc.

-Iinclude_file_dir flag

Adds a directory to the include paths for both <> and "" includes.

Comments in Makefiles

A line starting with # in a Makefile that will be ignored by make.

all: target

Target that represents all other targets in the Makefile, useful to compile all programs or libraries.

-lm flag

Link math library using the -lm flag.

-std=c11 flag

C standard version to adhere to during compilation.

-Wall flag

Enables compiler warnings to catch potential issues with the code.

CPPFLAGS=

Specify additional flags for the C preprocessor

Study Notes

C Tool Chain

  • Volunteer Note Takers are needed for students who require them.
  • The informal term describing the sequence of software tools that converts your source code into binary code is the Tool Chain.
  • IDEs use the same tools but manage them for the user.
  • Different languages use different Tool Chains.
  • Use "C" for the "native" portion (A1-A2), and Python/SQL for the UI/database portion (A3).
  • C programmers working with *nix-family operating systems use a variety of tools regularly
  • The tools are exposed to view, even the ones that are hidden inside others.
  • The lecture goes over how to control the tools and how they “chain” together in sequence.
  • The goal is to bring the tools up from the level of ritualized mumbo-jumbo to one of knowledge.
  • The usual suspects are the C Preprocessor, C Compiler, Linker, Loader and Make.
  • IDE = Integrated Development Environment.
  • Windows uses Visual Studio, macOS uses Xcode, and Multi-platform uses Eclipse, IntelliJ, Codelite, and CodeBlocks.
  • Popular FOSS (free, open-source) compilers are the GNU Compiler Collection (GCC) compiler and Clang. Clang is a more recent compiler and an alternative to GCC.
  • We mostly use the C compiler from GCC as the platform in SoCS.

C Toolchain Components

  • The components of the C compiler toolchain include:
    • cpp*, gcc*, gas, and ld*
    • C Preprocessor, C Compiler, Assembler, Linker, Loader, Make, and System Libraries.
  • The intermediate files discussed (.i, .s, and .o) may not always be generated by default.
  • To see the intermediate files, compile your code with the -save-temps flag.

C Preprocessor

  • Purpose: Interpret all the # directives in the .h and .c files before the compiler sees the source code
  • #include: merge in header files
  • #define: macros (with and without arguments); macros are "expanded" by the preprocessor through find-and-replace
  • #ifdef, #if, #else, #endif: conditional compilation
  • Exception: #pragma is passed on as compiler directive (ignored if compiler doesn't recognize).
  • The C preprocessor is often abbreviated to "cpp".
  • "cpp" means "C preprocessor"
  • Best practice .h pattern in file headername.h prevents multiple/circular includes
#ifndef HEADERNAME_H
#define HEADERNAME_H
...body of .h file...
#endif
  • If an #include of the same header occurs again (even indirectly), the file's contents are skipped
  • Different syntax is used for different headers with the #include directive.
  • #include <file> is used for system header files and searches for a file named file in a standard list of system directories.
  • #include "file" is used for header files of your own program and searches for a file named file in the directory containing the current file, then in the "quote" directories, and then the same directories used for <file>.
  • Always include the file itself, e.g. #include "Parser.h", instead of the path to the file in the #include statement, e.g. #include "../includeDir/Parser.h".
  • A directory with a header file can be passed to the compiler by using the -I option, e.g. gcc -I../includeDir myFile.c.
  • Multiple include directories can be specified, including relative or absolute paths, e.g. gcc -I../includeDir -I/home/username/stuff/otherIncludeDir myFile.c

C Compiler

  • Its purpose is to compile preprocessed C source code (the .i file) into assembly language (.s).
  • It diagnoses abuses of language, and issues warnings and/or errors.
  • Intermediate .i and .s files are normally deleted after assembly unless requested.
  • Compile with gcc -save-temps test.c

Assembler

  • Assembles assembly code (.s) from the compiler into object code (.o).
  • This step is normally transparent, but there are options for controlling it.
  • Assembly statements can be inserted into your C program, e.g. to use vector instructions for the CPU's vector processing unit.

Linker

  • Stitches together objects (.o) into a single binary program file (executable or library).
  • Combines user program object files and referenced system libraries.
  • The linker creates static (.a/.lib) or dynamic (.so/.dll) libraries.
  • In object files, there are tables of external references (e.g., call "printf").
  • The libraries are read until a matching external definition is found (e.g., the actual code of the printf() function in the C library).
  • Static linking pulls the definitions into the program file and fixes up all the references in the file.
  • Dynamic linking "makes a note" of the respective library, to be resolved at run time.
  • The names of all libraries other than the C library must be explicitly passed in via a -l option.
  • For example, to link the C math library, add the -lm option to the compiler command, where m is the name of the math library.
  • By default, the linker looks in a small number of directories containing system libraries, like /usr/lib, /usr/local/lib, etc.

Linking and Paths

  • If myCode.c and someLibrary are in the same directory, running gcc myCode.c -lsomeLibrary will give an error because the current directory is not automatically included.
  • The linker can be given more paths using the -L flag in the command gcc myCode.c -L/path/to/a/library -lsomeLibrary.
  • If myCode.c and someLibrary are in the same directory, compiling from that directory requires the command gcc myCode.c -L. -lsomeLibrary

Preprocessor and Linker Paths

  • The flags -I, -L and -l provide search paths for the preprocessor, search paths for the linker, and the names of libraries to be linked with, respectively.
  • GCC acts as a front end for cpp, the compiler proper, and ld.
  • Running gcc executes these tools for the user; gcc is not itself cpp or ld, it calls them for you.
  • Reason: convenience. gcc knows where the standard headers and libraries are installed, so it quietly adds those flags for you; otherwise your commands would be much longer, messier, and less portable.

Loader

  • The steps examined so far are concerned with creating binary files, e.g. executables that use libraries.
  • The OS opens a fresh process (ref. CIS*3110) and calls the loader to "fill it up": the segments of the program file are copied into different regions of the process's (virtual) memory (code, then static data), the OS creates a new stack and an empty heap, and control is then transferred to the first instruction of the program.
  • ld.so loads dynamically linked libraries when the program starts, and references/calls get fixed up on load.
  • Some programs use dlopen to load shared object "plugins" on demand at run time.
  • The OS pauses execution while the loader grabs the object and links it into the already-loaded code, then carries on with execution.
  • Use the ldd command (e.g. ldd a.out) on Linux to see shared object dependencies.
  • The loader's knowledge of the standard library directories is configured in /etc/ld.so.conf.d.

Adding a Path for loader

  • To add a path for the loader, update the environment variable LD_LIBRARY_PATH; otherwise, the loader will not look for your shared library.
  • Command: export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:..
  • To make it permanent, add it to the .bashrc file in your home directory.
  • An example is loading the List API library used in this course: we need this when linking the A1 parser library with a main program that tests it.

Compiling a List Library

  • Your assignments will use a simple list library; the sample code files include LinkedListAPI.c, LinkedListAPI.h, and StructListDemo.c.
  • Example:
    • Compile the library (more on this below), then compile the demo, which needs the location of the library headers and the library binary file, and produces the demo binary.

Further Compilation of List Library

  • You can compile the library using the provided Makefile, which creates a binary file called liblist.so.
  • The linker refers to it as list; this naming convention will be explained later in the notes.
  • Compiling the demo executable with just gcc StructListDemo.c produces a bunch of errors.
  • It is the linker component of the compiler toolchain that generates these errors.
    • Solution: we have to tell the linker what to link and where to find it.
  • Run gcc StructListDemo.c -L. -llist, where:
    • -L. tells the linker to also look for libraries in the current directory
    • -llist tells the linker to link with the library called list (the file liblist.so)

Loader and Running Files

  • When the new executable is run, the loader does not automatically look in the current directory for shared libraries.
  • Running the executable from the current directory (./StructListDemo) therefore fails with an error while loading shared libraries, because liblist.so cannot be found.

Makefiles

  • The make utility executes a sequence of commands from a description file
  • It is generally used to create executables, but it can perform other tasks such as resetting the state of a project (cleaning), packaging multiple files, installing files into directories, or building libraries.
  • The make utility is just one of many tools for generating executables: the ant utility for Java is similar, and CMake is another, improved make-like utility that developers can use.
  • However, make is the most common, so these notes stick with it.

Compiler Toolchain and Makefiles

  • The make utility is a completely separate entity, unrelated to the compiler toolchain itself.
  • The makefile passes flags to the toolchain tools to compile programs and libraries.

Makefiles and the Vanilla Approach

  • Makefiles offer many, often equally effective, ways to achieve the same task.
  • Programmers tend to copy makefile "incantations" from one another without understanding them, which is a bad habit.
  • These notes stick to a straightforward, vanilla approach.

Examining Makefiles

  • make examines file modification dates to decide what needs to be rebuilt.
  • Build the project by typing make.
  • The compiled files are then up to date with the sources.

Makefile Structure

  • Each entry in the make file consists of three parts:
    • Targets
    • Prerequisites or dependencies
    • Command Lines
myprog: myprog.c myprog.h
	gcc myprog.c -o myprog

Target

  • A target is what a particular makefile entry aims to build.
  • A target can be an executable or a library, e.g. librecord.so (more on libraries later).

Prerequisites

  • Prerequisites are files that must exist for the target to be "buildable".
  • If any prerequisite file is newer than the target, the target gets rebuilt.

Efficiency of Makefiles

  • Recompiling everything is slow; make recompiles only the files that actually need it.

Command Line

  • The command line tells the make utility how to build the target from the prerequisites.
  • It must start with a single tab character.
  • Using spaces instead of the single tab character is a classic source of errors.

Invoking Make

Typing only: make at the command line will look for either a file named Makefile or makefile and build the first target that appears in the file.

Example: Invoking make with a Specific Target

myprog: myprog.c
	gcc ...

fred: fred.c
	gcc ...

Typing make will build myprog, because it is the first target that appears in the file. Typing make fred will build only fred, and typing make myprog will build only myprog.

Other Common Targets

  • Not every target builds a program:
clean:
	rm *.o core
  • Targets like clean can be used to tidy up intermediate and temporary files.

Checking Commands

  • You can use the -n flag to print the commands make would run without actually executing them:

make someTarget -n

Command Lines

  • Multiple commands can be run for a single target by separating them with a semicolon and continuing onto the next line with a backslash:

libawesome.a: awesome.c
	gcc awesome.c -o awesome.o -c ;\
	ar cr libawesome.a awesome.o

  • The first command compiles the object file; the second archives it into the static library libawesome.a.

Using Makefile Macros

Macros avoid repetitive typing in makefiles: long strings of flags or file names are replaced by short macro names, so a change only needs to be made in one place.

  • Macros are defined with an equals sign:
LIBS = -L/usr/local/lib -lm -llibname
  • They are referenced as $(LIBS) or ${LIBS}.

Example of a Sample Makefile

CC = gcc
LIBS = -lm -L/usr/local/lib -L. -lmyLib

prog: prog.c
	$(CC) prog.c -o prog $(LIBS)

  • This makes it easy to switch compilers or library locations by editing the macros in one place.

Compiler and Paths

  • A macro that has not been defined expands to nothing.
a1: a1.c a1.h
	gcc a1.c -o a1 $(A)
  • Here $(A) is empty unless you define it, e.g. make A=-g on the command line.
  • You can use such placeholder macros to pass extra flags only when you need them.

Predefined Macros

  • The CC macro is predefined as cc, which is a symbolic link to the default C compiler.
  • So a default C compiler is always available; set CC explicitly (e.g. CC=gcc) if you want a specific one.

Macro Strings Substitutions

  • String substitutions modify every word in a macro, e.g.:
SRCS = a.c b.c c.c
all: $(SRCS:.c=)
  • Here the .c suffix is stripped, so all depends on a, b, and c.

Suffix Rules

  • Suffix rules are built-in rules that tell make how to build a file with one suffix from a file with another, e.g. a .o (or an executable) from a .c.
SRCS = a.c b.c c.c
all: $(SRCS:.c=)
  • Combined with the substitution above, make's built-in rules compile a.c, b.c, and c.c into the executables a, b, and c.

Code Comments

  • Lines starting with # are comments and are ignored by make; long lines can be continued by ending them with a backslash.

all: a1 a2
a1: a1.c a1.h

Tools and Flags

  • The toolchain tools are controlled from makefiles by setting flags; each tool has its own flag macro.

Selecting the Compiler in the Makefile

  • Select the C compiler front end with the CC macro, e.g. CC=gcc; on a Mac, the default compiler is Clang unless GCC has been installed. The various compilers use the same overall toolchain structure.

Examples of Preprocessor Flags

  • -I~/myproj/include adds a directory to the search path for #include directives.
  • -Dsymbol[=value] is equivalent to a #define in the source code.
  • -DNDEBUG disables assertions (see the sketch below).
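A tiny example of the -D flags above; the VERBOSE symbol is hypothetical.

/* demo.c */
#include <assert.h>
#include <stdio.h>

int main(void) {
#ifdef VERBOSE                        /* turned on with: gcc -DVERBOSE demo.c */
    printf("verbose mode on\n");
#endif
    assert(1 + 1 == 2);               /* removed entirely when built with: gcc -DNDEBUG demo.c */
    return 0;
}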

Compiler Flags

  • -Wall enables all of the common compiler warnings.
  • -g includes debugging information in the binary.
  • -O0, -O1, -O2, -O3 select the level of optimization applied to the generated code.

Linker Flags

  • -Llibrary_dir passes a library search path to the linker, e.g. -L~/myproj/lib adds ~/myproj/lib to the directories searched at link time.
  • -llibrary names a library to link with.
  • The actual file on disk is called liblibrary.so (or liblibrary.a); the lib prefix and the suffix are dropped in the -l option.

C Code

  • Course convention for C code: source files go in the src directory, headers in include, and the Makefile sits in the main project directory; the Makefile's -I flag points at include.

Real World

  • Code

Standards

  • The C standard is periodically updated; this course compiles with -std=c11.

Standard Macros

  • Then macros .e.g.

List API And Libraries

  • To something
  • a are use to hash list

Iterating Through a List

  • An iterator provides a way to step through the items of a collection one at a time.
  • Code that uses an iterator does not need to rely on how the collection is stored internally.

Example of C++ Iterators

  • In in be more that

Store of

  • be. so a that is to is string data
  • a are to to with to is

List API

  • is .zip in a a data is of has and of

List Operations

  • List operations include create, insert at front/back, remove from front/back, retrieve, iterate, clear/delete, and producing a human-readable representation of the list.
  • A well-designed List ADT keeps these operations generic, independent of the data stored in the list.

Pointers and their Function

  • A pointers can If a be then we to for a do

Pointers Allocating in Main()

  • In this function P allocated in a do it

Memory and Valgrind

  • a this the to that that do that not we must to if pass its. Then be so

Pointers

  • can it with of an

Bad Code

  • a can by it and by in code

Allocate Function and Pointers

  • the .c file is P in the the or memory is can be to is is

Types And Functions

  • is a has over of do to are a local block the

Variable In Function

  • or it's

Predicates and Searching

  • Data by the is with with with function to the with the data. is the a data data data.

Search and Predicates

  • is what if do you

Errors and Valgrind

  • Run Valgrind with its leak checker enabled (valgrind --leak-check=full ./program) to find memory leaks.

Pointers and Valgrind

  • by

Memory and Functions

  • And will

C Programming

Scope Class Storage

  • Access control scope be the

Storage Classes

  • Variables can local to the the more is to the are

Types Definitions

  • The with and functions in

Symbol and Function

  • Each scope that
  • if, in.

The Scope in the code

  • program is a

The Storage used

  • Static class can storage has

Problems with Precedence

  • a is a a is a in

Coding tips

  • Coding or keep code or or the is it

Odds and Ends in Enums and Types

Types

You the and in of of of of system

Data Pointers

  • In to that is that can can use them

Rules in Types

  • be in can be to is and of
  • and and and by that

Code

  • In file by is data list by. With

Review and Types

  • Types. a short new to and write.
  • Types will .s and. h all. and
