assignment_two
"/home/yossef/notes/Su/network_based_multimedia/assignments/assignment_two.md"
path: Su/network_based_multimedia/assignments/assignment_two.md
- **fileName**: assignment_two
- **Created on**: 2025-03-19 17:14:52
INT338 — Network-Based Multimedia
Assignment 2
Question 1: What is the distinction between lossy and lossless data compression?
- Lossless Compression: After decompression, it provides an exact copy of the original data (e.g., ZIP files, PNG images).
- Lossy Compression: After decompression, it gives a close approximation of the original data — often perceptually similar but not identical (e.g., MP3 audio, JPEG images).
Question 2: Give one application each suitable for lossy and lossless compression.
- Lossy compression: Streaming services like YouTube use lossy compression to reduce video size while maintaining watchable quality.
- Lossless compression: Archiving important documents or code repositories using ZIP ensures data integrity.
Question 3: Describe the difference between lossless and lossy compression techniques for encoding information and give an example of each in image coding.
- Lossless: Maintains original quality — e.g., PNG uses DEFLATE algorithm.
- Lossy: Removes data the viewer is unlikely to notice — e.g., JPEG discards high-frequency detail and subsamples colour, so the decoded image only approximates the original.
Question 4: Define the term "run length coding."
- Run Length Encoding (RLE): Compresses data by storing each run of consecutive repeated values as a single value and a count (e.g., AAAAABBB → 5A3B).
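A minimal RLE sketch in Python (the function names are illustrative, not from any particular library):

```python
from itertools import groupby

def rle_encode(text: str) -> str:
    # Collapse each run of identical characters into "<count><char>".
    # Assumes the input contains no digit characters.
    return "".join(f"{len(list(run))}{ch}" for ch, run in groupby(text))

def rle_decode(encoded: str) -> str:
    # Rebuild the original string from "<count><char>" pairs.
    out, count = [], ""
    for ch in encoded:
        if ch.isdigit():
            count += ch                    # accumulate multi-digit counts
        else:
            out.append(ch * int(count))
            count = ""
    return "".join(out)

print(rle_encode("AAAAABBB"))              # 5A3B
print(rle_decode("5A3B"))                  # AAAAABBB
```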
Question 5: Give the principle of "differential encoding."
- Differential Encoding: Encodes the difference between successive values instead of the values themselves, reducing redundancy (e.g., 5, 7, 9, 10 → 5, +2, +2, +1).
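A minimal differential-encoding sketch of the same idea (illustrative function name):

```python
def diff_encode(samples):
    # Keep the first sample, then store only the change from the previous one.
    return [samples[0]] + [b - a for a, b in zip(samples, samples[1:])]

print(diff_encode([5, 7, 9, 10]))          # [5, 2, 2, 1]
```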
Question 6: Given the following Differential Pulse Code Modulated (DPCM) sequence, reconstruct the original signal:
+4 +2 +3 -2 +3 -1 +1 +1
- Solution: Starting from 0, add each difference cumulatively: 4, 6, 9, 7, 10, 9, 10, 11
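The reconstruction is just a cumulative sum; a short sketch, assuming the initial value is 0 as above:

```python
from itertools import accumulate

# DPCM differences from the question, reconstructed from an assumed initial value of 0.
diffs = [4, 2, 3, -2, 3, -1, 1, 1]
print(list(accumulate(diffs)))             # [4, 6, 9, 7, 10, 9, 10, 11]
```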
Question 7: Find the Huffman codeword for the given text "AAAAAAAAAABBBBBCCCSS" using a static Huffman tree. Calculate entropy and derive the average number of bits per character.
- Solution:
- Frequencies: A (10), B (5), C (3), S (2)
- Huffman Tree gives codes: A: 0, B: 10, C: 110, S: 111
- Entropy: $H = -\sum_i p_i \log_2 p_i$; with probabilities 0.5, 0.25, 0.15, 0.1 this gives H ≈ 1.74 bits/symbol.
- Average bits per character: (10·1 + 5·2 + 3·3 + 2·3) / 20 = 35/20 = 1.75 bits/character.
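A sketch of static Huffman coding for this string using Python's heapq; the codewords assigned to C and S can swap depending on tie-breaking, but the code lengths, entropy, and average match the answer above:

```python
import heapq
from collections import Counter
from math import log2

def huffman_codes(freqs):
    # Build a Huffman tree with a min-heap; each heap entry is
    # (weight, tie_breaker, {symbol: code_so_far}).
    heap = [(f, i, {sym: ""}) for i, (sym, f) in enumerate(freqs.items())]
    heapq.heapify(heap)
    counter = len(heap)
    while len(heap) > 1:
        w1, _, c1 = heapq.heappop(heap)    # two lowest-weight subtrees
        w2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + c for s, c in c1.items()}
        merged.update({s: "1" + c for s, c in c2.items()})
        heapq.heappush(heap, (w1 + w2, counter, merged))
        counter += 1
    return heap[0][2]

text = "AAAAAAAAAABBBBBCCCSS"
freqs = Counter(text)                      # A:10, B:5, C:3, S:2
codes = huffman_codes(freqs)
total = sum(freqs.values())
entropy = -sum(f / total * log2(f / total) for f in freqs.values())
avg_len = sum(f * len(codes[s]) for s, f in freqs.items()) / total
print(codes)                               # e.g. {'A': '0', 'B': '10', 'S': '110', 'C': '111'}
print(round(entropy, 2), avg_len)          # ~1.74 bits/symbol, 1.75 bits/character
```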
Question 8: Audio encoding at 8 bits/sample, 8000 samples/sec — how many bits per second?
- 8 bits × 8000 samples = 64,000 bits/sec (64 kbps)
Question 9: Video encoding at 1000×1000 pixels, 24 bits/pixel, 30 frames/sec — how many bits per second?
- 1000 × 1000 × 24 × 30 = 720,000,000 bits/sec (720 Mbps)
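A tiny check of the bit-rate arithmetic in Questions 8 and 9:

```python
# Uncompressed bit rates for the audio and video streams above.
audio_bps = 8 * 8000                       # bits/sample * samples/sec
video_bps = 1000 * 1000 * 24 * 30          # pixels * bits/pixel * frames/sec
print(audio_bps, "bit/s =", audio_bps / 1e3, "kbit/s")    # 64000 = 64 kbit/s
print(video_bps, "bit/s =", video_bps / 1e6, "Mbit/s")    # 720000000 = 720 Mbit/s
```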
Question 10: Describe LZW encoding algorithm using "ABABBABCABABBA" as an example.
- LZW (Lempel-Ziv-Welch) builds a dictionary of strings as it scans the input. With the initial dictionary {A, B, C}, encoding "ABABBABCABABBA" outputs the dictionary indices of A, B, AB, BA, B, C, AB, ABB, A, while adding the new entries AB, BA, ABB, BAB, BC, CA, ABA, ABBA.
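A compact LZW encoder sketch (illustrative function name, initial dictionary {A, B, C} assumed) that reproduces the trace above:

```python
def lzw_encode(text, alphabet=("A", "B", "C")):
    # Dictionary maps strings to indices; starts with the single-symbol alphabet.
    dictionary = {sym: i + 1 for i, sym in enumerate(alphabet)}
    next_index = len(dictionary) + 1
    s, output = "", []
    for c in text:
        if s + c in dictionary:
            s += c                          # extend the current match
        else:
            output.append(dictionary[s])    # emit the longest known string
            dictionary[s + c] = next_index  # learn the new string
            next_index += 1
            s = c
    output.append(dictionary[s])            # flush the last match
    return output, dictionary

codes, dictionary = lzw_encode("ABABBABCABABBA")
print(codes)    # [1, 2, 4, 5, 2, 3, 4, 6, 1] -> A B AB BA B C AB ABB A
```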
Question 11: Describe four basic types of data redundancy for audio, image, and video signals.
- Temporal: Redundant data over time (e.g., unchanged video frames).
- Spatial: Correlation between neighboring pixels.
- Spectral: Correlation between colour components or frequency bands of the same signal (e.g., the RGB planes of an image).
- Psycho-visual/acoustic: Leverages human perception limits.
Question 12: One example each of lossless and lossy compression techniques:
- Lossless: Huffman Coding
- Lossy: Discrete Cosine Transform (JPEG)
Question 13: Basic approach of entropy coding algorithms. Two examples:
- Steps:
- Estimate each symbol's probability (and hence the entropy of the source).
- Build a binary code tree in which more probable symbols sit closer to the root.
- Traverse the tree, assigning 0 to one branch and 1 to the other at each node.
- Each codeword is the path from the root to the symbol's leaf, so no codeword is a prefix of another.
- Examples: Shannon-Fano and Huffman coding (a Shannon-Fano sketch follows).
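As a complement to the Huffman sketch above, a minimal Shannon-Fano sketch: sort symbols by frequency, then recursively split them into two groups of roughly equal total frequency, prefixing 0 to one group and 1 to the other (function name and split heuristic are illustrative):

```python
def shannon_fano(symbols):
    # symbols: list of (symbol, frequency) pairs sorted by frequency, descending.
    if len(symbols) <= 1:
        return {sym: "" for sym, _ in symbols}
    total = sum(f for _, f in symbols)
    running, split, best = 0, 1, total
    for i in range(1, len(symbols)):
        running += symbols[i - 1][1]
        diff = abs(total - 2 * running)    # how far this split is from 50/50
        if diff < best:
            best, split = diff, i
    codes = {s: "0" + c for s, c in shannon_fano(symbols[:split]).items()}
    codes.update({s: "1" + c for s, c in shannon_fano(symbols[split:]).items()})
    return codes

print(shannon_fano([("A", 10), ("B", 5), ("C", 3), ("S", 2)]))
# {'A': '0', 'B': '10', 'C': '110', 'S': '111'}
```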
Question 14: Briefly state the Huffman coding algorithm. Encode "AAABDCEFBBAADCDF":
- Steps: Count symbol frequencies; repeatedly merge the two lowest-frequency nodes until one tree remains; read 0/1 along each path so that frequent symbols get shorter codes.
- Frequencies in "AAABDCEFBBAADCDF": A (5), B (3), D (3), C (2), F (2), E (1).
- One valid Huffman code (tie-breaking during construction can give different but equally optimal codes): A: 00, B: 10, D: 11, F: 010, E: 0110, C: 0111. Encoded length: 5·2 + 3·2 + 3·2 + 2·3 + 1·4 + 2·4 = 40 bits, i.e. 2.5 bits/character (see the check below).
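A quick check of the code table above (an assumed but valid assignment; other tie-breakings give different codewords), encoding the string and counting bits:

```python
from collections import Counter

text = "AAABDCEFBBAADCDF"
# One valid Huffman code table for this string (assumed; tie-breaking may differ).
codes = {"A": "00", "B": "10", "D": "11", "F": "010", "E": "0110", "C": "0111"}

freqs = Counter(text)                      # A:5, B:3, D:3, C:2, F:2, E:1
encoded = "".join(codes[ch] for ch in text)
print(freqs)
print(len(encoded), "bits,", len(encoded) / len(text), "bits/char")   # 40 bits, 2.5
```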