An encryption technique where a public key, available to everyone, is used to encrypt the data, and the data is decrypted with the paired private key, known only to the recipient
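A minimal sketch of the idea, using textbook RSA with deliberately tiny, insecure numbers (real systems use vetted libraries and keys of 2048 bits or more):

```python
# Toy public-key encryption: textbook RSA with tiny primes,
# chosen only so the arithmetic is easy to follow.
p, q = 61, 53                      # two secret primes
n = p * q                          # modulus, part of both keys
phi = (p - 1) * (q - 1)            # Euler's totient of n
e = 17                             # public exponent (coprime with phi)
d = pow(e, -1, phi)                # private exponent: modular inverse of e

message = 42                       # must be smaller than n
ciphertext = pow(message, e, n)    # anyone can encrypt with (e, n)
plaintext = pow(ciphertext, d, n)  # only the holder of d can decrypt

assert plaintext == message
print(f"public key (e={e}, n={n}) encrypts {message} -> {ciphertext}")
```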
A type of lossless compression where the text is searched for strings that match entries in a dictionary. Each match is substituted with a unique code, which can later be translated back into the original text
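One widely used dictionary coder is LZW; a minimal sketch of its compression side (the input string is illustrative):

```python
def lzw_compress(text: str) -> list[int]:
    """Dictionary coding: grow a dictionary of seen substrings and
    emit the code of the longest match instead of the text itself."""
    dictionary = {chr(i): i for i in range(256)}  # start with single chars
    current, codes = "", []
    for ch in text:
        if current + ch in dictionary:
            current += ch                      # extend the current match
        else:
            codes.append(dictionary[current])  # emit code for the match
            dictionary[current + ch] = len(dictionary)  # learn a new entry
            current = ch
    if current:
        codes.append(dictionary[current])
    return codes

print(lzw_compress("TOBEORNOTTOBEORTOBEORNOT"))
```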
A one-way transformation of data into an abbreviated form called a hash value. The hash value is used to validate login credentials such as passwords or PINs without revealing the original data to attackers
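A sketch using Python's standard hashlib; the password, salt size, and iteration count are illustrative:

```python
import hashlib
import hmac
import os

def hash_password(password: str, salt: bytes) -> bytes:
    # One-way: the stored digest cannot be reversed to get the password.
    return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)

salt = os.urandom(16)
stored = hash_password("hunter2", salt)      # kept in the database
attempt = hash_password("hunter2", salt)     # recomputed at login
print(hmac.compare_digest(stored, attempt))  # True: credentials valid
```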
A compression algorithm that retains all the data in the file by storing only the instructions needed to reconstruct the original file. No data is lost
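A quick demonstration with Python's built-in zlib, showing that decompression restores every byte:

```python
import zlib

original = b"AAAAABBBBBCCCCC" * 100
compressed = zlib.compress(original)
restored = zlib.decompress(compressed)

assert restored == original          # every byte comes back: lossless
print(len(original), "->", len(compressed), "bytes")
```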
A compression algorithm that removes non-essential data from a file, reducing the accuracy of the data in exchange for a smaller file size. The data lost is non-recoverable
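A toy sketch of the principle, quantising sample values to a handful of levels (the values are illustrative; real lossy codecs such as JPEG or MP3 are far more sophisticated):

```python
samples = [0.12, 0.47, 0.51, 0.88, 0.93]   # e.g. audio amplitudes

# Quantise each sample to one of only 4 levels; the fine detail is
# discarded and cannot be recovered from the quantised values.
quantised = [round(s * 3) / 3 for s in samples]

print(quantised)
print([round(abs(a - b), 3) for a, b in zip(samples, quantised)])  # the error
```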
A type of lossless compression where repeated occurrences of the same data (such as a run of identical pixels in an image) are stored as a single data value together with a count
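A minimal sketch of run-length encoding and decoding:

```python
from itertools import groupby

def rle_encode(data: str) -> list[tuple[str, int]]:
    # Store each run of identical symbols as (symbol, count).
    return [(ch, len(list(run))) for ch, run in groupby(data)]

def rle_decode(pairs: list[tuple[str, int]]) -> str:
    return "".join(ch * count for ch, count in pairs)

encoded = rle_encode("WWWWWBBBWWWW")
print(encoded)                       # [('W', 5), ('B', 3), ('W', 4)]
assert rle_decode(encoded) == "WWWWWBBBWWWW"
```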
A method of abstractly and visually describing the data tables in a database and the relationships between them. These diagrams can be used to reduce redundancy and to construct a relational database
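An ERD itself is a diagram, but it maps directly onto table definitions. A sketch of how the one-to-many relationship an ERD might show becomes a schema, using Python's built-in sqlite3 (table and column names are illustrative):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE Customer (
        customer_id INTEGER PRIMARY KEY,
        name        TEXT NOT NULL
    );
    -- One customer places many orders: the ERD's 1:N relationship
    -- becomes a foreign key on the 'many' side.
    CREATE TABLE CustomerOrder (
        order_id    INTEGER PRIMARY KEY,
        customer_id INTEGER NOT NULL REFERENCES Customer(customer_id),
        placed_on   TEXT
    );
""")
print([r[0] for r in conn.execute(
    "SELECT name FROM sqlite_master WHERE type = 'table'")])
```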
No transaction should overwrite the changes made by other transactions occurring simultaneously. The same results must be obtained whether transactions are processed concurrently or sequentially
A technique used to prevent simultaneous access to data in a database by locking a record when it is being edited or updated. Otherwise, inconsistencies may arise in the database
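A toy sketch of the principle using a per-record lock, so concurrent updates cannot interleave and produce lost updates (the record and amounts are illustrative):

```python
import threading

record = {"balance": 100}
record_lock = threading.Lock()      # one lock guarding this record

def withdraw(amount: int) -> None:
    with record_lock:               # the record is locked while edited
        current = record["balance"]
        record["balance"] = current - amount  # no lost updates

threads = [threading.Thread(target=withdraw, args=(10,)) for _ in range(5)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(record["balance"])            # always 50
```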
The unnecessary repetition of a field in multiple tables. Databases should, however, include deliberate redundancy in the form of identical copies, so that if part of the database is lost it can be recovered
The idea of keeping a database consistent by ensuring that any changes made to data or relationships associated with a table are accounted for in all the linked tables
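A sketch using Python's built-in sqlite3, which rejects a row that references a non-existent linked record once foreign-key enforcement is switched on (table names are illustrative):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")   # sqlite enforces FKs only when asked
conn.executescript("""
    CREATE TABLE Department (dept_id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE Employee (
        emp_id  INTEGER PRIMARY KEY,
        dept_id INTEGER REFERENCES Department(dept_id)
    );
    INSERT INTO Department VALUES (1, 'Sales');
""")
conn.execute("INSERT INTO Employee VALUES (1, 1)")       # fine: dept 1 exists
try:
    conn.execute("INSERT INTO Employee VALUES (2, 99)")  # no such department
except sqlite3.IntegrityError as err:
    print("rejected:", err)        # FOREIGN KEY constraint failed
```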
A table in 1NF from which data that repeats across multiple records has been removed into a new table with appropriate relationships, leaving no partial dependencies
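A sketch of such a decomposition in sqlite3, with illustrative tables: product_name depends only on product_id (part of the composite key), so it is moved to its own table:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    -- Before 2NF, product_name sat in the order-line table and repeated
    -- for every order of the same product (a partial dependency on the
    -- composite key). In 2NF it lives in its own table:
    CREATE TABLE Product (
        product_id   INTEGER PRIMARY KEY,
        product_name TEXT NOT NULL
    );
    CREATE TABLE OrderLine (
        order_id   INTEGER,
        product_id INTEGER REFERENCES Product(product_id),
        quantity   INTEGER,
        PRIMARY KEY (order_id, product_id)
    );
""")
```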
The idea of ensuring that any logical operation or change of state in a database (a transaction) conforms to the ACID rules (Atomicity, Consistency, Isolation, Durability) for reliable processing
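A sketch of atomicity using Python's built-in sqlite3, where a transaction that fails part-way through is rolled back rather than leaving the database half-updated (the accounts and the simulated failure are illustrative):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE account (id INTEGER PRIMARY KEY, balance INTEGER)")
conn.executemany("INSERT INTO account VALUES (?, ?)", [(1, 100), (2, 0)])
conn.commit()

try:
    with conn:  # a transaction: both updates commit, or neither does
        conn.execute("UPDATE account SET balance = balance - 30 WHERE id = 1")
        raise RuntimeError("power failure mid-transfer")  # simulated crash
        conn.execute("UPDATE account SET balance = balance + 30 WHERE id = 2")
except RuntimeError:
    pass

print(conn.execute("SELECT balance FROM account ORDER BY id").fetchall())
# [(100,), (0,)] -- the half-finished transfer was rolled back
```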
A method for sending data between two nodes on a network by first creating a dedicated communication channel. All data follows this same path for the duration of the communication
A type of network organisation where networked computers (clients) connect to one or more powerful central computers (servers) that handle service requests and hold shared resources
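A minimal sketch with Python's socket module: one central server answers a client's request (the port and the upper-casing "service" are illustrative):

```python
import socket
import threading

srv = socket.create_server(("127.0.0.1", 5050))  # server binds and listens

def handle() -> None:
    # The server owns the resource (here: an upper-casing service)
    # and answers requests from clients.
    conn, _ = srv.accept()
    with conn:
        conn.sendall(conn.recv(1024).upper())

threading.Thread(target=handle, daemon=True).start()

# The client connects to the central server and makes a request.
with socket.create_connection(("127.0.0.1", 5050)) as client:
    client.sendall(b"hello server")
    print(client.recv(1024))        # b'HELLO SERVER'
srv.close()
```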
The process of converting the original data (plaintext) into a form which cannot be understood by unauthorised users (ciphertext), using an encryption algorithm (cipher)
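A toy sketch of the plaintext-to-ciphertext transformation using a repeating-key XOR cipher (illustrative only; XOR with a short key is trivially breakable):

```python
def xor_cipher(data: bytes, key: bytes) -> bytes:
    # A toy symmetric cipher: XOR each byte with a repeating key.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

plaintext = b"attack at dawn"
ciphertext = xor_cipher(plaintext, b"secret")   # unreadable without the key
print(ciphertext)
print(xor_cipher(ciphertext, b"secret"))        # b'attack at dawn'
```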
A security checkpoint application that sits between two networks and monitors incoming and outgoing traffic, designed to prevent external users from gaining unauthorised access
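A toy sketch of packet filtering with a default-deny rule set (the rules are illustrative; real firewalls inspect far more than ports):

```python
# Each rule allows traffic by direction and port;
# anything not explicitly allowed is blocked.
RULES = [
    {"direction": "in",  "port": 443, "action": "allow"},  # HTTPS in
    {"direction": "out", "port": 80,  "action": "allow"},  # HTTP out
]

def check(direction: str, port: int) -> str:
    for rule in RULES:
        if rule["direction"] == direction and rule["port"] == port:
            return rule["action"]
    return "block"                  # default-deny policy

print(check("in", 443))   # allow
print(check("in", 22))    # block: unsolicited SSH from outside
```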
A method for sending data over a network by breaking data into several data packets which are sent independently and then reassembled once they all reach their destination
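A toy sketch: the message is split into numbered packets, shuffled to mimic packets taking different routes, and reassembled in order at the destination:

```python
import random

message = "packets may arrive out of order"

# Break the message into numbered packets...
packets = [(seq, message[i:i + 8])
           for seq, i in enumerate(range(0, len(message), 8))]

random.shuffle(packets)             # ...which may take different routes

# The receiver reassembles them using the sequence numbers.
reassembled = "".join(payload for _, payload in sorted(packets))
assert reassembled == message
print(reassembled)
```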
A technique that simplifies network design by dividing a complex system into its component functional layers and assigning protocols for each layer to perform tasks and communicate with adjacent layers
A server application that intercepts all data packets entering and leaving a network, hiding the true network address of the source from the recipient. Proxies can also restrict authorised users' access to data and isolate the network from external networks (like the internet)
Transmission Control Protocol / Internet Protocol (TCP/IP) Stack
A suite of networking protocols that allows networked computers to communicate, consisting of four connected layers. Incoming and outgoing data packets are passed through these layers
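A toy sketch of encapsulation through the four layers: outgoing data is wrapped in a header at each layer on the way down, and unwrapped in reverse on the way up (the header format is illustrative):

```python
LAYERS = ["application", "transport", "internet", "link"]

def send(data: str) -> str:
    # Outgoing packets pass down the stack; each layer adds a header.
    for layer in LAYERS:
        data = f"<{layer}>{data}"
    return data

def receive(frame: str) -> str:
    # Incoming packets pass up the stack; each layer strips its header.
    for layer in reversed(LAYERS):
        assert frame.startswith(f"<{layer}>")
        frame = frame[len(layer) + 2:]
    return frame

frame = send("GET /index.html")
print(frame)            # <link><internet><transport><application>GET /index.html
print(receive(frame))   # GET /index.html
```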
The data processing and operations performed by the client. The client's browser performs the processing using the resources of the user's local machine. Typically used to run less critical code