Normalization is the process of organizing a database's attributes and relations into its simplest, most optimal structure in order to reduce data redundancy.
A different way to look at it
"Data normalization is generally considered the development of clean data.... Simply put, this process includes eliminating unstructured data and redundancy (duplicates) in order to ensure logical data storage."
https://www.bmc.com/blogs/data-normalization/
It helps eliminate errors and repetition by organizing the data.
It simplifies data, much like simplifying a fraction makes it easier to understand.
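As a quick illustration, here is a minimal Python sketch (using made-up customer and order data, not taken from any of the sources above) of what reducing redundancy looks like: a single table that repeats customer details on every order is split into a customers table and an orders table.

# A hypothetical "orders" table that repeats the customer's name and
# email on every row (redundancy).
denormalized_orders = [
    {"order_id": 1, "customer_id": 10, "customer_name": "Ana",
     "customer_email": "ana@example.com", "item": "Desk"},
    {"order_id": 2, "customer_id": 10, "customer_name": "Ana",
     "customer_email": "ana@example.com", "item": "Chair"},
    {"order_id": 3, "customer_id": 11, "customer_name": "Luis",
     "customer_email": "luis@example.com", "item": "Lamp"},
]

# Normalizing: split the data so each customer's details are stored
# exactly once, and each order only references the customer by id.
customers = {}
orders = []
for row in denormalized_orders:
    customers[row["customer_id"]] = {
        "name": row["customer_name"],
        "email": row["customer_email"],
    }
    orders.append({
        "order_id": row["order_id"],
        "customer_id": row["customer_id"],
        "item": row["item"],
    })

print(customers)  # customer details stored once per customer
print(orders)     # orders point to customers via customer_id only

After the split, each customer's name and email live in exactly one place, so a typo or an update only ever needs to be fixed once.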
Data integrity can simply be defined as the accuracy of the data.
"Data integrity is a concept and process that ensures the accuracy, completeness, consistency, and validity of an organization’s data. By following the process, organizations not only ensure the integrity of the data but guarantee they have accurate and correct data in their database."
https://www.fortinet.com/resources/cyberglossary/data-integrity
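As a rough illustration, here is a minimal Python sketch of an integrity check (the field names and rules are hypothetical) that rejects records that are incomplete or invalid before they reach the database.

REQUIRED_FIELDS = {"customer_id", "name", "email"}

def is_valid_customer(record):
    """Return True only if the record is complete and its values are valid."""
    # Completeness: every required field must be present and non-empty.
    if any(not record.get(field) for field in REQUIRED_FIELDS):
        return False
    # Validity: customer_id must be a positive integer.
    if not isinstance(record["customer_id"], int) or record["customer_id"] <= 0:
        return False
    # Validity: email must at least contain an "@".
    if "@" not in record["email"]:
        return False
    return True

print(is_valid_customer({"customer_id": 10, "name": "Ana", "email": "ana@example.com"}))  # True
print(is_valid_customer({"customer_id": 10, "name": "", "email": "ana@example.com"}))     # False (incomplete)

Real databases enforce rules like these with constraints such as NOT NULL, primary keys, and foreign keys; the function above just shows the idea.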
https://www.educba.com/normalization-in-dbms/
https://sis.binus.ac.id/2018/01/17/drawbacks-of-normalization/
https://www.talend.com/resources/what-is-data-integrity/
https://www.fortinet.com/resources/cyberglossary/data-integrity
https://www.bmc.com/blogs/data-normalization/
***Note: you'll have to copy and paste the links to access the websites.