C414 Database Systems Normalization Lecture Notes PDF
Document Details
AASTMT
Prof. Amani Saad & Dr. Dalia Sobhy
Summary
These lecture notes cover the concept of database normalization, focusing on 1NF, 2NF, and 3NF. They detail the steps involved in converting a table to 1NF, 2NF, and 3NF and explain why these procedures are useful.
Full Transcript
C414 Database Systems: Normalization
Prof. Amani Saad & Dr. Dalia Sobhy
Computer Engineering Department, AASTMT

Objectives
In this lecture, you will learn:
- What normalization is and what role it plays in the database design process
- About the normal forms 1NF, 2NF, 3NF, BCNF, and 4NF
- How lower normal forms can be transformed into higher normal forms
- That normalization and ER modeling are used concurrently to produce a good database design
- That some situations require denormalization to generate information efficiently

Database Tables and Normalization
What is normalization?
- A process for evaluating and correcting table structures to minimize data redundancies
- It reduces data anomalies
- It works through a series of stages called normal forms:
  - First normal form (1NF)
  - Second normal form (2NF)
  - Third normal form (3NF)

Database Tables and Normalization (continued)
- 2NF is better than 1NF; 3NF is better than 2NF
- For most business database design purposes, 3NF is as high as normalization needs to go
- The highest level of normalization is not always the most desirable
- Denormalization produces a lower normal form; the price paid for increased performance is greater data redundancy

Why Do We Need Normalization?
Example: consider a company that manages building projects:
- It charges its clients by billing the hours spent on each contract
- The hourly billing rate depends on the employee's position
- Periodically, a report is generated that contains this information

Why Do We Need Normalization? (continued)
Why is this report structure bad? Because:
- The report may yield different results depending on which data anomaly has occurred
- A relational database environment is suited to helping the designer avoid data integrity problems

The Normalization Process: "How to generate a good structure?"
- Each table represents a single subject
- No data item is unnecessarily stored in more than one table
- All attributes in a table are dependent on the primary key
- Each table is free of insertion, update, and deletion anomalies

The Normalization Process (continued)
- Objective: ensure that all tables are in at least 3NF
- Higher forms are not likely to be encountered in a business environment
- Normalization works one relation at a time
- It progressively breaks a table into a new set of relations based on the identified dependencies

Conversion to 1NF
- Repeating group: a group of multiple entries of the same type exists for a single key attribute occurrence
- A relational table must not contain repeating groups
- Normalizing the table structure will reduce data redundancies
- Conversion to 1NF is a three-step procedure, sketched after the summary below

Conversion to 1NF (continued)
Step 1: Eliminate the repeating groups
- Eliminate nulls: each repeating-group attribute contains an appropriate data value
Step 2: Identify the primary key
- The key must uniquely identify each attribute value
- A new, composite key may have to be formed
Step 3: Identify all dependencies
- Dependencies are depicted with a dependency diagram
- Dependency diagram: depicts all dependencies found in the table structure

Summary: Conversion to 1NF
1NF describes the tabular format in which:
- All key attributes are defined
- There are no repeating groups in the table
- All attributes are dependent on the primary key
All relational tables satisfy the 1NF requirements. However, some tables still contain partial dependencies (dependencies based on only part of the primary key), and such partial dependencies should be used with caution.
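To make the three 1NF steps concrete, here is a minimal Python sketch based on the building-projects billing scenario described earlier. The attribute names and sample values (proj_num, emp_num, job_class, chg_hour, hours, and so on) are assumptions chosen for illustration, not names taken from the slides; the dependency dictionary at the end stands in for the dependency diagram.

    # Minimal 1NF conversion sketch (assumed attribute names; illustrative only).

    # Unnormalized data: each project record carries a repeating group of assignments.
    projects = [
        {
            "proj_num": 15,
            "proj_name": "Evergreen",
            "assignments": [  # repeating group: many entries per project
                {"emp_num": 103, "emp_name": "June Arbough",
                 "job_class": "Elect. Engineer", "chg_hour": 84.50, "hours": 23.8},
                {"emp_num": 101, "emp_name": "John News",
                 "job_class": "Database Designer", "chg_hour": 105.00, "hours": 19.4},
            ],
        },
    ]

    # Step 1: eliminate the repeating group by writing one flat row per assignment,
    # so every attribute position holds a single, appropriate value.
    rows_1nf = [
        {"proj_num": p["proj_num"], "proj_name": p["proj_name"], **a}
        for p in projects
        for a in p["assignments"]
    ]

    # Step 2: identify the primary key.  No single attribute identifies a row,
    # so a composite key is formed from (proj_num, emp_num).
    primary_key = ("proj_num", "emp_num")

    # Step 3: identify all dependencies (a dict standing in for the dependency diagram).
    dependencies = {
        ("proj_num", "emp_num"): ["hours"],        # full dependency on the whole key
        ("proj_num",): ["proj_name"],              # partial dependency (part of the key)
        ("emp_num",): ["emp_name", "job_class", "chg_hour"],  # partial dependency
        ("job_class",): ["chg_hour"],              # transitive dependency (nonkey -> nonkey)
    }

    for row in rows_1nf:
        print(row)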
Conversion to 2NF
Step 1: Write each key component on a separate line
- Write each key component on a separate line, then write the original (composite) key on the last line
- Each component will become the key of a new table
Step 2: Assign the corresponding dependent attributes
- Determine which attributes are dependent on which key component and assign them to that component's new table
- At this point, most anomalies have been eliminated

Summary: Conversion to 2NF
A table is in 2NF when:
- It is in 1NF, and
- It includes no partial dependencies: no attribute is dependent on only a portion of the primary key

Conversion to 3NF
Step 1: Identify each new determinant
- For every transitive dependency, write its determinant as the primary key of a new table
- Determinant: any attribute whose value determines other values within a row
Step 2: Identify the dependent attributes
- Identify the attributes dependent on each determinant identified in Step 1 and identify the dependency
- Name each table to reflect its contents and function
Step 3: Remove the dependent attributes from the transitive dependencies
- Eliminate all dependent attributes in the transitive relationship(s) from each of the tables
- Draw a new dependency diagram to show all of the tables defined in Steps 1-3
- Check the new tables as well as the tables modified in Step 3:
  - Each table has a determinant
  - No table contains inappropriate dependencies

Summary: Conversion to 3NF
A table is in 3NF when both of the following are true:
- It is in 2NF
- It contains no transitive dependencies
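The 2NF conversion steps above can be sketched in the same way, continuing the assumed attribute names from the earlier 1NF example; this is an illustration, not the exact decomposition shown in the slides. Each key component, plus the original composite key, becomes the key of a new table, and every nonkey attribute is assigned to the table whose key determines it.

    # Minimal 2NF decomposition sketch (assumed attribute names; illustrative only).

    # 1NF rows keyed by the composite key (proj_num, emp_num).
    rows_1nf = [
        {"proj_num": 15, "proj_name": "Evergreen", "emp_num": 103,
         "emp_name": "June Arbough", "job_class": "Elect. Engineer",
         "chg_hour": 84.50, "hours": 23.8},
        {"proj_num": 15, "proj_name": "Evergreen", "emp_num": 101,
         "emp_name": "John News", "job_class": "Database Designer",
         "chg_hour": 105.00, "hours": 19.4},
    ]

    # Step 1: each key component (and the original composite key) becomes the key
    # of a new table.  The dict key plays the role of the primary key, so rows
    # that repeat the same key value collapse into a single entry.
    # Step 2: each dependent attribute is assigned to the table whose key determines it.
    project = {r["proj_num"]: {"proj_name": r["proj_name"]} for r in rows_1nf}
    employee = {r["emp_num"]: {"emp_name": r["emp_name"],
                               "job_class": r["job_class"],
                               "chg_hour": r["chg_hour"]} for r in rows_1nf}
    assignment = {(r["proj_num"], r["emp_num"]): {"hours": r["hours"]} for r in rows_1nf}

    # PROJECT and EMPLOYEE no longer carry partial dependencies; ASSIGNMENT keeps
    # only the attribute that depends on the whole composite key.
    print(project)
    print(employee)
    print(assignment)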
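The 3NF conversion steps can be sketched similarly, again under assumed attribute names: the transitive dependency job_class -> chg_hour is removed by making the determinant job_class the primary key of a new JOB-style table.

    # Minimal 3NF conversion sketch (assumed attribute names; illustrative only).

    # A 2NF table that still contains a transitive dependency: job_class -> chg_hour,
    # where neither attribute is part of the key (emp_num).
    employee = {
        101: {"emp_name": "John News", "job_class": "Database Designer", "chg_hour": 105.00},
        103: {"emp_name": "June Arbough", "job_class": "Elect. Engineer", "chg_hour": 84.50},
    }

    # Step 1: identify the new determinant (job_class) and make it the key of a new table.
    # Step 2: identify its dependent attribute (chg_hour) and name the table accordingly.
    job = {row["job_class"]: {"chg_hour": row["chg_hour"]} for row in employee.values()}

    # Step 3: remove the dependent attribute from the original table, leaving job_class
    # behind as a foreign key that references the new JOB table.
    employee_3nf = {
        emp_num: {"emp_name": row["emp_name"], "job_class": row["job_class"]}
        for emp_num, row in employee.items()
    }

    print(job)
    print(employee_3nf)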
How to Improve the Design
- The table structures are now cleaned up to eliminate the initial partial and transitive dependencies
- Normalization cannot, by itself, be relied on to produce a good design
- It is valuable because it helps eliminate data redundancies
- Issues to address in order to produce a good normalized set of tables:
  - Evaluate PK assignments
  - Evaluate naming conventions
  - Refine attribute atomicity
  - Identify new attributes
  - Identify new relationships
  - Refine primary keys as required for data granularity
  - Maintain historical accuracy
  - Evaluate using derived attributes

Surrogate Key Considerations
- When the natural primary key is considered unsuitable, designers use surrogate keys
- Data entries that duplicate existing records are still inappropriate, even though they violate neither entity nor referential integrity

Normalization and Database Design
- Normalization should be part of the design process
- Make sure that proposed entities meet the required normal form before the table structures are created
- Many real-world databases have been improperly designed or burdened with anomalies; you may be asked to redesign and modify existing databases
- ER diagram:
  - Identify the relevant entities, their attributes, and their relationships
  - Identify additional entities and attributes
- Normalization procedures:
  - Focus on the characteristics of specific entities
  - Provide a micro view of the entities within the ER diagram
- It is difficult to separate the normalization process from the ER modeling process

Denormalization
- Creating normalized relations is an important database design goal, but processing requirements should also be a goal
- If tables are decomposed to conform to normalization requirements:
  - The number of database tables expands
  - Joining the larger number of tables reduces system speed
- Conflicts are often resolved through compromises that may include denormalization
- Defects of unnormalized tables:
  - Data updates are less efficient because the tables are larger
  - Indexing is more cumbersome
  - There are no simple strategies for creating virtual tables known as views

Summary
- Normalization is used to minimize data redundancies
- The first three normal forms (1NF, 2NF, and 3NF) are the most commonly encountered
- A table is in 1NF when all key attributes are defined, the table contains no repeating groups, and all remaining attributes are dependent on the primary key
- A table is in 2NF when it is in 1NF and contains no partial dependencies
- A table is in 3NF when it is in 2NF and contains no transitive dependencies
- A table that is not in 3NF may be split into new tables until all of the tables meet the 3NF requirements

Summary (continued)
- A table in 3NF may still contain multivalued dependencies, which produce numerous null values or redundant data
- A 3NF table is converted to 4NF by splitting the table to remove the multivalued dependencies
- Tables are sometimes denormalized to yield less I/O, which increases processing speed
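As a small illustration of the 3NF-to-4NF conversion mentioned above, the following sketch uses hypothetical attributes (emp_num, proj_code, dependent) that are not from the slides: two independent multivalued facts about an employee are split into separate tables so that neither has to be repeated for every value of the other.

    # Minimal 4NF sketch (hypothetical attribute names; illustrative only).
    from itertools import product

    # A 3NF table with two independent multivalued dependencies:
    # emp_num ->> proj_code and emp_num ->> dependent.  Storing them together
    # forces redundant rows (every project paired with every dependent) or nulls.
    emp_num = 101
    projects = ["P1", "P2"]
    dependents = ["Anna", "Ben"]
    combined = [(emp_num, p, d) for p, d in product(projects, dependents)]  # 4 redundant rows

    # Conversion to 4NF: split the table so each multivalued fact is stored once.
    emp_project = [(emp_num, p) for p in projects]      # (emp_num, proj_code)
    emp_dependent = [(emp_num, d) for d in dependents]  # (emp_num, dependent)

    print(len(combined), "rows before the split")
    print(len(emp_project) + len(emp_dependent), "rows after the split")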
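The denormalization trade-off noted in the last point can also be sketched briefly, again with the assumed names used earlier: storing a pre-joined copy of chg_hour inside the employee table avoids a join at report time, at the price of redundant data that must be kept in sync when a rate changes.

    # Minimal denormalization sketch (assumed attribute names; illustrative only).

    # Normalized: chg_hour is stored once per job class and joined in at query time.
    employee = {101: {"emp_name": "John News", "job_class": "Database Designer"},
                103: {"emp_name": "June Arbough", "job_class": "Elect. Engineer"}}
    job = {"Database Designer": 105.00, "Elect. Engineer": 84.50}

    report_normalized = [
        {"emp_name": e["emp_name"], "chg_hour": job[e["job_class"]]}  # join at read time
        for e in employee.values()
    ]

    # Denormalized: the join is performed once and stored, so reads are cheaper,
    # but chg_hour is now repeated per employee and must be updated everywhere it appears.
    employee_denorm = {
        num: {**e, "chg_hour": job[e["job_class"]]} for num, e in employee.items()
    }

    print(report_normalized)
    print(employee_denorm)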