Parsing Logs and File Systems with Python
Automated Evidence Collection

1. Parsing logs refers to the process of extracting relevant information from log files created by software applications or systems.
- It involves searching through these log files, identifying the data of interest, and extracting it in a readable format for analysis.
- Log files typically contain detailed records of events, errors, activities, and other relevant data that can be used for troubleshooting, monitoring, and analysis purposes.

Examples
- Troubleshooting user login issues: By parsing server logs, IT personnel can identify any errors or anomalies that occurred during a user's login attempt, such as incorrect credentials or server connectivity issues.
- Monitoring website performance: Web developers can parse access logs to track website traffic, identify slow-loading pages, and pinpoint any errors or issues that users may encounter while navigating the site.
- Detecting security breaches: Security analysts can parse system logs to detect any unauthorized access attempts, unusual system activity, or suspicious network traffic that may indicate a security breach or cyber attack.
- Analyzing application errors: Software developers can parse application logs to identify bugs, errors, and performance issues that may be impacting the functionality of the application.
- Monitoring network traffic: Network administrators can parse firewall logs and network device logs to monitor network traffic, identify potential threats, and troubleshoot connectivity issues.

Types of Parsing
- Text Parsing: Involves breaking down and analyzing text data to extract relevant information or identify patterns. Example: Parsing through customer feedback comments to extract key insights and sentiment analysis for product improvements.
- Log Parsing: Involves analyzing log files generated by systems, servers, or applications to detect errors, track performance, and troubleshoot issues. Example: Parsing through server logs to identify security breaches, monitor website traffic, and analyze application errors.
- XML Parsing: Involves processing Extensible Markup Language (XML) documents to extract and manipulate structured data. Example: Parsing through XML data from APIs to retrieve and update information for integration with other systems or applications.
- JSON Parsing: Involves extracting and manipulating data in JavaScript Object Notation (JSON) format. Example: Parsing through JSON data from web APIs to display dynamic content on websites or mobile applications.
- CSV Parsing: Involves processing Comma-Separated Values (CSV) files to extract data and import it into databases or other applications. Example: Parsing through CSV files containing sales data to import into a CRM system for analysis and reporting.
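As an illustration of the JSON and CSV parsing types listed above, the following sketch uses Python's built-in json and csv modules. The sample records, field names, and values are made up for demonstration; StringIO is used only to keep the snippet self-contained instead of reading from a real file.

import csv
import json
from io import StringIO

# JSON parsing: a small, made-up API-style payload
json_payload = '{"user": "alice", "status": "active", "logins": 42}'
record = json.loads(json_payload)          # parse JSON text into a dictionary
print(record["user"], record["logins"])    # access individual fields

# CSV parsing: a small, made-up sales extract (normally read from a file)
csv_text = "date,product,amount\n2024-01-05,Widget,19.99\n2024-01-06,Gadget,4.50\n"
reader = csv.DictReader(StringIO(csv_text))
for row in reader:
    # each row is a dictionary keyed by the header line
    print(row["date"], row["product"], float(row["amount"]))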
2. File systems. These are software components that manage how data is stored and organized on storage devices such as hard drives and SSDs. File systems provide a way to control access to files and directories, manage storage space, and ensure data integrity. Common file systems include FAT, NTFS, ext4, and HFS+.

Types of File Systems
- FAT32 (File Allocation Table 32): A widely used file system in older Windows operating systems. FAT32 uses a simple file allocation table to manage file storage. Example: USB flash drives or external hard drives formatted with FAT32 for compatibility with various devices.
- NTFS (New Technology File System): Developed by Microsoft, NTFS is a modern file system with features such as file compression, encryption, and permissions management. Example: Windows operating systems use NTFS as the default file system for internal drives and network storage.
- HFS+ (Hierarchical File System Plus): Developed by Apple for Mac OS, HFS+ is a file system that supports features like journaling and large file support. Example: Mac computers use HFS+ for storing files on internal hard drives and external storage devices.
- APFS (Apple File System): Introduced by Apple as a replacement for HFS+, APFS is optimized for flash storage and offers features like snapshots and cloning. Example: Mac computers and iOS devices running newer versions of the operating system use APFS for improved performance and data management.

Example of Structured Data
Consider a relational database used by a company to store employee information. In this database, the data is structured into tables with defined columns for attributes such as employee ID, name, department, position, and salary. Each row in the table represents a specific employee and their corresponding information.
By organizing the employee data in a structured format within a relational database, the company can easily query and retrieve specific information, perform data analysis, generate reports, and efficiently manage the workforce. This structured data allows for standardized data entry, data consistency, and optimization of data storage and access.

In Python, you can use various libraries and modules to parse logs and interact with file systems. In parsing logs, the re module can be used for regular expression-based parsing, while libraries like pandas and numpy are useful for processing and analyzing structured log data. For specific log formats, there may be specialized libraries available, such as loguru, for working with log files in a more efficient way.
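As a sketch of the regular-expression-based parsing mentioned above, the snippet below pulls the timestamp, level, and message out of application-style log lines with the re module. The log format and the sample lines are assumptions made for illustration; real logs would be read from a file.

import re

# Assumed log format: "2024-01-05 10:32:07 ERROR Disk quota exceeded"
LOG_PATTERN = re.compile(
    r"^(?P<timestamp>\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}) "
    r"(?P<level>[A-Z]+) "
    r"(?P<message>.*)$"
)

sample_lines = [
    "2024-01-05 10:32:07 ERROR Disk quota exceeded",
    "2024-01-05 10:32:09 INFO User alice logged in",
]

for line in sample_lines:
    match = LOG_PATTERN.match(line)
    if match:
        # Extract the named groups from the matched line
        print(match.group("timestamp"), match.group("level"), match.group("message"))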
Pandas and NumPy are two popular libraries in Python used for data manipulation and analysis.

1. NumPy (Numerical Python): It is a fundamental package for scientific computing in Python.
- It provides support for large, multi-dimensional arrays and matrices along with a collection of mathematical functions to operate on these arrays efficiently.
- NumPy is widely used for numerical computations, linear algebra operations, random number generation, and more.

Example of NumPy in Python:

import numpy as np

# Create a NumPy array
arr = np.array([1, 2, 3, 4, 5])

# Perform operations on the array
print(np.mean(arr))  # Calculate the mean of the array

Commonly used commands in NumPy:
1. np.array() - creates a NumPy array
2. np.zeros() - creates an array filled with zeros
3. np.ones() - creates an array filled with ones
4. np.linspace() - creates an array of evenly spaced numbers
5. np.sum() - calculates the sum of values in an array
6. np.mean() - calculates the mean of values in an array
7. np.max() - finds the maximum value in an array
8. np.min() - finds the minimum value in an array
9. np.sort() - sorts the elements of an array
10. np.concatenate() - concatenates arrays along a specified axis

Example code using NumPy to compute the sum of two variables:

import numpy as np

# Define two variables
var1 = 10
var2 = 20

# Compute the sum of the two variables
sum_variables = np.sum([var1, var2])

print("The sum of var1 and var2 is:", sum_variables)

Discussion:
1. Import numpy as np.
2. Define two variables var1 and var2 with values 10 and 20 respectively.
3. Use the np.sum() function to compute the sum of the two variables and store it in the variable sum_variables.
4. Print out the result.

Output:
The sum of var1 and var2 is: 30

This is because the sum of var1 (10) and var2 (20) is computed using the np.sum() function, which gives the result of 30.

Example of NumPy code with user input:

import numpy as np

# Get user input for the values
num_values = int(input("Enter the number of values: "))
values = []

# Get input values from user
for i in range(num_values):
    value = float(input("Enter value {}: ".format(i + 1)))
    values.append(value)

# Convert the list of values into a NumPy array
values_array = np.array(values)

# Display the output
print("Numpy array of user input values:")
print(values_array)
# Output will depend on user input.

2. Pandas: It is a powerful data manipulation and analysis library built on top of NumPy.
- It provides easy-to-use data structures like DataFrames and Series that allow you to work with tabular data efficiently.
- Pandas provides functionalities for data cleaning, data manipulation, merging and joining datasets, time series analysis, and more.

Example of Pandas in Python:

import pandas as pd

# Create a DataFrame
data = {'Name': ['Alice', 'Bob', 'Charlie'],
        'Age': [25, 30, 35],
        'Salary': [50000, 60000, 70000]}
df = pd.DataFrame(data)

# Perform operations on the DataFrame
print(df.head())  # Display the first few rows of the DataFrame

Example #2:

import pandas as pd

# Creating a DataFrame using a dictionary
data = {'Name': ['Alice', 'Bob', 'Charlie', 'David'],
        'Age': [25, 30, 35, 40],
        'Gender': ['Female', 'Male', 'Male', 'Male']}
df = pd.DataFrame(data)

# Displaying the DataFrame
print(df)

Output:
      Name  Age  Gender
0    Alice   25  Female
1      Bob   30    Male
2  Charlie   35    Male
3    David   40    Male

Commonly used commands in Pandas:
1. pd.DataFrame(): Create a new DataFrame.
2. df.head(): Display the first few rows of the DataFrame.
3. df.tail(): Display the last few rows of the DataFrame.
4. df.info(): Get a concise summary of the DataFrame.
5. df.describe(): Generate descriptive statistics of the DataFrame.
6. df.shape: Get the dimensions of the DataFrame.
7. df.columns: Get the column labels of the DataFrame.
8. df.groupby(): Group the data based on specified criteria.
9. df.sort_values(): Sort the DataFrame by specified column(s).
10. df.drop(): Drop specified rows or columns from the DataFrame.
11. df.iloc[]: Select rows and columns by integer position.
12. df.loc[]: Select rows and columns by labels.

NumPy and Pandas are widely used in data analysis, machine learning, and scientific computing in Python. They provide powerful tools and functionalities that make handling and analyzing data easier and more efficient.
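To tie these libraries back to log parsing, here is a sketch of how already-parsed log records might be summarized with pandas. The records, field names, and values are made up for illustration; in practice they could come from a parser like the re-based sketch shown earlier.

import pandas as pd

# A few made-up, already-parsed log records
records = [
    {"timestamp": "2024-01-05 10:32:07", "level": "ERROR", "message": "Disk quota exceeded"},
    {"timestamp": "2024-01-05 10:32:09", "level": "INFO",  "message": "User alice logged in"},
    {"timestamp": "2024-01-05 10:35:41", "level": "ERROR", "message": "Connection timed out"},
]

df = pd.DataFrame(records)

# Count how many log entries were recorded at each level
print(df["level"].value_counts())

# Filter down to the error entries for closer inspection
print(df[df["level"] == "ERROR"])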
Python can be used to interact with file systems in a variety of ways, such as reading and writing files, creating directories, and listing file and directory contents.

Examples of how Python can be used in file systems:

1. Reading a file:

# Open a file for reading
file = open("example.txt", "r")

# Read the content of the file
content = file.read()

# Close the file
file.close()

# Print the content of the file
print(content)

Discussion #1: In the snippet code,
a. Open the file named "example.txt" in read mode ("r").
b. Read the contents of the file using the read() method and store it in the content variable.
c. Display the content of the file to the console.

2. Writing to a file:

# Open a file for writing
file = open("example.txt", "w")

# Write content to the file
file.write("Hello, this is an example text.")

# Close the file
file.close()

3. Creating a directory: this creates a new directory only if it does not already exist (see the sketch after this list).

4. Listing file and directory contents:

import os

# List all files and directories in the current directory
contents = os.listdir()

# Print the contents
print(contents)

Discussion #4: In the snippet code,
a. Import the os module and use the listdir() function to list all files and directories in the current working directory.
The function returns a list of all the files and directories present in the current directory, which we then print to the console.

Python's built-in os and io modules provide a wide range of functions and methods for working with files and directories.
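The directory-creation step (item 3) is also typically handled with the os module; a minimal sketch, assuming os.makedirs and a made-up directory name:

import os

# Directory name is illustrative; any path can be used
new_dir = "evidence_output"

# Create the directory only if it does not already exist
if not os.path.exists(new_dir):
    os.makedirs(new_dir)
    print("Created directory:", new_dir)
else:
    print("Directory already exists:", new_dir)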

QUESTIONS?

Topic activity:
1. Discuss how parsing logs and analyzing file systems can be used to detect and prevent security breaches in a digital environment.
2. Give examples of specific techniques and tools that can be utilized for this purpose.

THANK YOU!